Sep 16 04:23:59.143133 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd083] Sep 16 04:23:59.143177 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Tue Sep 16 03:05:48 -00 2025 Sep 16 04:23:59.143200 kernel: KASLR disabled due to lack of seed Sep 16 04:23:59.143216 kernel: efi: EFI v2.7 by EDK II Sep 16 04:23:59.143232 kernel: efi: SMBIOS=0x7bed0000 SMBIOS 3.0=0x7beb0000 ACPI=0x786e0000 ACPI 2.0=0x786e0014 MEMATTR=0x7a731a98 MEMRESERVE=0x78557598 Sep 16 04:23:59.143247 kernel: secureboot: Secure boot disabled Sep 16 04:23:59.143264 kernel: ACPI: Early table checksum verification disabled Sep 16 04:23:59.143279 kernel: ACPI: RSDP 0x00000000786E0014 000024 (v02 AMAZON) Sep 16 04:23:59.143294 kernel: ACPI: XSDT 0x00000000786D00E8 000064 (v01 AMAZON AMZNFACP 00000001 01000013) Sep 16 04:23:59.143309 kernel: ACPI: FACP 0x00000000786B0000 000114 (v06 AMAZON AMZNFACP 00000001 AMZN 00000001) Sep 16 04:23:59.143324 kernel: ACPI: DSDT 0x0000000078640000 00159D (v02 AMAZON AMZNDSDT 00000001 INTL 20160527) Sep 16 04:23:59.143343 kernel: ACPI: FACS 0x0000000078630000 000040 Sep 16 04:23:59.143358 kernel: ACPI: APIC 0x00000000786C0000 000108 (v04 AMAZON AMZNAPIC 00000001 AMZN 00000001) Sep 16 04:23:59.143373 kernel: ACPI: SPCR 0x00000000786A0000 000050 (v02 AMAZON AMZNSPCR 00000001 AMZN 00000001) Sep 16 04:23:59.143391 kernel: ACPI: GTDT 0x0000000078690000 000060 (v02 AMAZON AMZNGTDT 00000001 AMZN 00000001) Sep 16 04:23:59.143407 kernel: ACPI: MCFG 0x0000000078680000 00003C (v02 AMAZON AMZNMCFG 00000001 AMZN 00000001) Sep 16 04:23:59.143427 kernel: ACPI: SLIT 0x0000000078670000 00002D (v01 AMAZON AMZNSLIT 00000001 AMZN 00000001) Sep 16 04:23:59.143443 kernel: ACPI: IORT 0x0000000078660000 000078 (v01 AMAZON AMZNIORT 00000001 AMZN 00000001) Sep 16 04:23:59.143459 kernel: ACPI: PPTT 0x0000000078650000 0000EC (v01 AMAZON AMZNPPTT 00000001 AMZN 00000001) Sep 16 04:23:59.143475 kernel: ACPI: SPCR: console: uart,mmio,0x90a0000,115200 Sep 16 04:23:59.143491 kernel: earlycon: uart0 at MMIO 0x00000000090a0000 (options '115200') Sep 16 04:23:59.145481 kernel: printk: legacy bootconsole [uart0] enabled Sep 16 04:23:59.145523 kernel: ACPI: Use ACPI SPCR as default console: No Sep 16 04:23:59.145571 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000004b5ffffff] Sep 16 04:23:59.145588 kernel: NODE_DATA(0) allocated [mem 0x4b584ca00-0x4b5853fff] Sep 16 04:23:59.145605 kernel: Zone ranges: Sep 16 04:23:59.145622 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] Sep 16 04:23:59.145647 kernel: DMA32 empty Sep 16 04:23:59.145663 kernel: Normal [mem 0x0000000100000000-0x00000004b5ffffff] Sep 16 04:23:59.145678 kernel: Device empty Sep 16 04:23:59.145694 kernel: Movable zone start for each node Sep 16 04:23:59.145710 kernel: Early memory node ranges Sep 16 04:23:59.145727 kernel: node 0: [mem 0x0000000040000000-0x000000007862ffff] Sep 16 04:23:59.145743 kernel: node 0: [mem 0x0000000078630000-0x000000007863ffff] Sep 16 04:23:59.145759 kernel: node 0: [mem 0x0000000078640000-0x00000000786effff] Sep 16 04:23:59.145775 kernel: node 0: [mem 0x00000000786f0000-0x000000007872ffff] Sep 16 04:23:59.145791 kernel: node 0: [mem 0x0000000078730000-0x000000007bbfffff] Sep 16 04:23:59.145807 kernel: node 0: [mem 0x000000007bc00000-0x000000007bfdffff] Sep 16 04:23:59.145824 kernel: node 0: [mem 0x000000007bfe0000-0x000000007fffffff] Sep 16 04:23:59.145846 kernel: node 0: [mem 
0x0000000400000000-0x00000004b5ffffff] Sep 16 04:23:59.145870 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000004b5ffffff] Sep 16 04:23:59.145887 kernel: On node 0, zone Normal: 8192 pages in unavailable ranges Sep 16 04:23:59.145904 kernel: cma: Reserved 16 MiB at 0x000000007f000000 on node -1 Sep 16 04:23:59.145920 kernel: psci: probing for conduit method from ACPI. Sep 16 04:23:59.145942 kernel: psci: PSCIv1.0 detected in firmware. Sep 16 04:23:59.145959 kernel: psci: Using standard PSCI v0.2 function IDs Sep 16 04:23:59.145975 kernel: psci: Trusted OS migration not required Sep 16 04:23:59.145992 kernel: psci: SMC Calling Convention v1.1 Sep 16 04:23:59.146009 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000001) Sep 16 04:23:59.146025 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168 Sep 16 04:23:59.146042 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096 Sep 16 04:23:59.146059 kernel: pcpu-alloc: [0] 0 [0] 1 Sep 16 04:23:59.146076 kernel: Detected PIPT I-cache on CPU0 Sep 16 04:23:59.146092 kernel: CPU features: detected: GIC system register CPU interface Sep 16 04:23:59.146109 kernel: CPU features: detected: Spectre-v2 Sep 16 04:23:59.146130 kernel: CPU features: detected: Spectre-v3a Sep 16 04:23:59.146147 kernel: CPU features: detected: Spectre-BHB Sep 16 04:23:59.146163 kernel: CPU features: detected: ARM erratum 1742098 Sep 16 04:23:59.146179 kernel: CPU features: detected: ARM errata 1165522, 1319367, or 1530923 Sep 16 04:23:59.146196 kernel: alternatives: applying boot alternatives Sep 16 04:23:59.146214 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=eff5cc3c399cf6fc52e3071751a09276871b099078da6d1b1a498405d04a9313 Sep 16 04:23:59.146232 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Sep 16 04:23:59.146249 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Sep 16 04:23:59.146266 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 16 04:23:59.146282 kernel: Fallback order for Node 0: 0 Sep 16 04:23:59.146303 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1007616 Sep 16 04:23:59.146320 kernel: Policy zone: Normal Sep 16 04:23:59.146337 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 16 04:23:59.146354 kernel: software IO TLB: area num 2. Sep 16 04:23:59.146371 kernel: software IO TLB: mapped [mem 0x000000006c5f0000-0x00000000705f0000] (64MB) Sep 16 04:23:59.146389 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Sep 16 04:23:59.146408 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 16 04:23:59.146429 kernel: rcu: RCU event tracing is enabled. Sep 16 04:23:59.146446 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Sep 16 04:23:59.146464 kernel: Trampoline variant of Tasks RCU enabled. Sep 16 04:23:59.146481 kernel: Tracing variant of Tasks RCU enabled. Sep 16 04:23:59.147551 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Sep 16 04:23:59.147597 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Sep 16 04:23:59.147616 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Sep 16 04:23:59.147633 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Sep 16 04:23:59.147650 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Sep 16 04:23:59.147668 kernel: GICv3: 96 SPIs implemented Sep 16 04:23:59.147684 kernel: GICv3: 0 Extended SPIs implemented Sep 16 04:23:59.147701 kernel: Root IRQ handler: gic_handle_irq Sep 16 04:23:59.147718 kernel: GICv3: GICv3 features: 16 PPIs Sep 16 04:23:59.147735 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0 Sep 16 04:23:59.147752 kernel: GICv3: CPU0: found redistributor 0 region 0:0x0000000010200000 Sep 16 04:23:59.147768 kernel: ITS [mem 0x10080000-0x1009ffff] Sep 16 04:23:59.147785 kernel: ITS@0x0000000010080000: allocated 8192 Devices @4000f0000 (indirect, esz 8, psz 64K, shr 1) Sep 16 04:23:59.147808 kernel: ITS@0x0000000010080000: allocated 8192 Interrupt Collections @400100000 (flat, esz 8, psz 64K, shr 1) Sep 16 04:23:59.147825 kernel: GICv3: using LPI property table @0x0000000400110000 Sep 16 04:23:59.147841 kernel: ITS: Using hypervisor restricted LPI range [128] Sep 16 04:23:59.147858 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000400120000 Sep 16 04:23:59.147875 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 16 04:23:59.147891 kernel: arch_timer: cp15 timer(s) running at 83.33MHz (virt). Sep 16 04:23:59.147908 kernel: clocksource: arch_sys_counter: mask: 0x1ffffffffffffff max_cycles: 0x13381ebeec, max_idle_ns: 440795203145 ns Sep 16 04:23:59.147925 kernel: sched_clock: 57 bits at 83MHz, resolution 12ns, wraps every 4398046511100ns Sep 16 04:23:59.147942 kernel: Console: colour dummy device 80x25 Sep 16 04:23:59.147960 kernel: printk: legacy console [tty1] enabled Sep 16 04:23:59.147977 kernel: ACPI: Core revision 20240827 Sep 16 04:23:59.148000 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 166.66 BogoMIPS (lpj=83333) Sep 16 04:23:59.148017 kernel: pid_max: default: 32768 minimum: 301 Sep 16 04:23:59.148034 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 16 04:23:59.148051 kernel: landlock: Up and running. Sep 16 04:23:59.148068 kernel: SELinux: Initializing. Sep 16 04:23:59.148087 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 16 04:23:59.148105 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 16 04:23:59.148122 kernel: rcu: Hierarchical SRCU implementation. Sep 16 04:23:59.148141 kernel: rcu: Max phase no-delay instances is 400. Sep 16 04:23:59.148164 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Sep 16 04:23:59.148182 kernel: Remapping and enabling EFI services. Sep 16 04:23:59.148201 kernel: smp: Bringing up secondary CPUs ... Sep 16 04:23:59.148218 kernel: Detected PIPT I-cache on CPU1 Sep 16 04:23:59.148235 kernel: GICv3: CPU1: found redistributor 1 region 0:0x0000000010220000 Sep 16 04:23:59.148252 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000400130000 Sep 16 04:23:59.148270 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd083] Sep 16 04:23:59.148287 kernel: smp: Brought up 1 node, 2 CPUs Sep 16 04:23:59.148304 kernel: SMP: Total of 2 processors activated. 
Sep 16 04:23:59.148336 kernel: CPU: All CPU(s) started at EL1 Sep 16 04:23:59.148354 kernel: CPU features: detected: 32-bit EL0 Support Sep 16 04:23:59.148377 kernel: CPU features: detected: 32-bit EL1 Support Sep 16 04:23:59.148395 kernel: CPU features: detected: CRC32 instructions Sep 16 04:23:59.148412 kernel: alternatives: applying system-wide alternatives Sep 16 04:23:59.148432 kernel: Memory: 3797032K/4030464K available (11136K kernel code, 2440K rwdata, 9068K rodata, 38976K init, 1038K bss, 212088K reserved, 16384K cma-reserved) Sep 16 04:23:59.148450 kernel: devtmpfs: initialized Sep 16 04:23:59.148472 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 16 04:23:59.148491 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Sep 16 04:23:59.149573 kernel: 17040 pages in range for non-PLT usage Sep 16 04:23:59.149607 kernel: 508560 pages in range for PLT usage Sep 16 04:23:59.149627 kernel: pinctrl core: initialized pinctrl subsystem Sep 16 04:23:59.149645 kernel: SMBIOS 3.0.0 present. Sep 16 04:23:59.149663 kernel: DMI: Amazon EC2 a1.large/, BIOS 1.0 11/1/2018 Sep 16 04:23:59.149680 kernel: DMI: Memory slots populated: 0/0 Sep 16 04:23:59.149698 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 16 04:23:59.149725 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Sep 16 04:23:59.149743 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Sep 16 04:23:59.149761 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Sep 16 04:23:59.149779 kernel: audit: initializing netlink subsys (disabled) Sep 16 04:23:59.149798 kernel: audit: type=2000 audit(0.226:1): state=initialized audit_enabled=0 res=1 Sep 16 04:23:59.149818 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 16 04:23:59.149836 kernel: cpuidle: using governor menu Sep 16 04:23:59.149855 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. 
Sep 16 04:23:59.149873 kernel: ASID allocator initialised with 65536 entries Sep 16 04:23:59.149895 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 16 04:23:59.149913 kernel: Serial: AMBA PL011 UART driver Sep 16 04:23:59.149931 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 16 04:23:59.149948 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Sep 16 04:23:59.149966 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Sep 16 04:23:59.149984 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Sep 16 04:23:59.150001 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 16 04:23:59.150019 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Sep 16 04:23:59.150037 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Sep 16 04:23:59.150059 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Sep 16 04:23:59.150077 kernel: ACPI: Added _OSI(Module Device) Sep 16 04:23:59.150094 kernel: ACPI: Added _OSI(Processor Device) Sep 16 04:23:59.150112 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 16 04:23:59.150129 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Sep 16 04:23:59.150147 kernel: ACPI: Interpreter enabled Sep 16 04:23:59.150165 kernel: ACPI: Using GIC for interrupt routing Sep 16 04:23:59.150182 kernel: ACPI: MCFG table detected, 1 entries Sep 16 04:23:59.150200 kernel: ACPI: CPU0 has been hot-added Sep 16 04:23:59.150221 kernel: ACPI: CPU1 has been hot-added Sep 16 04:23:59.150239 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-0f]) Sep 16 04:23:59.151560 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 16 04:23:59.151784 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Sep 16 04:23:59.151967 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Sep 16 04:23:59.152152 kernel: acpi PNP0A08:00: ECAM area [mem 0x20000000-0x20ffffff] reserved by PNP0C02:00 Sep 16 04:23:59.152337 kernel: acpi PNP0A08:00: ECAM at [mem 0x20000000-0x20ffffff] for [bus 00-0f] Sep 16 04:23:59.152371 kernel: ACPI: Remapped I/O 0x000000001fff0000 to [io 0x0000-0xffff window] Sep 16 04:23:59.152392 kernel: acpiphp: Slot [1] registered Sep 16 04:23:59.152410 kernel: acpiphp: Slot [2] registered Sep 16 04:23:59.152428 kernel: acpiphp: Slot [3] registered Sep 16 04:23:59.152446 kernel: acpiphp: Slot [4] registered Sep 16 04:23:59.152463 kernel: acpiphp: Slot [5] registered Sep 16 04:23:59.152481 kernel: acpiphp: Slot [6] registered Sep 16 04:23:59.152757 kernel: acpiphp: Slot [7] registered Sep 16 04:23:59.152782 kernel: acpiphp: Slot [8] registered Sep 16 04:23:59.152800 kernel: acpiphp: Slot [9] registered Sep 16 04:23:59.152826 kernel: acpiphp: Slot [10] registered Sep 16 04:23:59.152844 kernel: acpiphp: Slot [11] registered Sep 16 04:23:59.152861 kernel: acpiphp: Slot [12] registered Sep 16 04:23:59.152879 kernel: acpiphp: Slot [13] registered Sep 16 04:23:59.152896 kernel: acpiphp: Slot [14] registered Sep 16 04:23:59.152914 kernel: acpiphp: Slot [15] registered Sep 16 04:23:59.152932 kernel: acpiphp: Slot [16] registered Sep 16 04:23:59.152950 kernel: acpiphp: Slot [17] registered Sep 16 04:23:59.152967 kernel: acpiphp: Slot [18] registered Sep 16 04:23:59.152989 kernel: acpiphp: Slot [19] registered Sep 16 04:23:59.153007 kernel: acpiphp: Slot [20] registered Sep 16 04:23:59.153025 kernel: acpiphp: Slot [21] registered Sep 16 
04:23:59.153042 kernel: acpiphp: Slot [22] registered Sep 16 04:23:59.153060 kernel: acpiphp: Slot [23] registered Sep 16 04:23:59.153078 kernel: acpiphp: Slot [24] registered Sep 16 04:23:59.153095 kernel: acpiphp: Slot [25] registered Sep 16 04:23:59.153113 kernel: acpiphp: Slot [26] registered Sep 16 04:23:59.153131 kernel: acpiphp: Slot [27] registered Sep 16 04:23:59.153148 kernel: acpiphp: Slot [28] registered Sep 16 04:23:59.153170 kernel: acpiphp: Slot [29] registered Sep 16 04:23:59.153188 kernel: acpiphp: Slot [30] registered Sep 16 04:23:59.153205 kernel: acpiphp: Slot [31] registered Sep 16 04:23:59.153223 kernel: PCI host bridge to bus 0000:00 Sep 16 04:23:59.153460 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xffffffff window] Sep 16 04:23:59.155216 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Sep 16 04:23:59.155401 kernel: pci_bus 0000:00: root bus resource [mem 0x400000000000-0x407fffffffff window] Sep 16 04:23:59.155596 kernel: pci_bus 0000:00: root bus resource [bus 00-0f] Sep 16 04:23:59.155835 kernel: pci 0000:00:00.0: [1d0f:0200] type 00 class 0x060000 conventional PCI endpoint Sep 16 04:23:59.156047 kernel: pci 0000:00:01.0: [1d0f:8250] type 00 class 0x070003 conventional PCI endpoint Sep 16 04:23:59.156236 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80118000-0x80118fff] Sep 16 04:23:59.156435 kernel: pci 0000:00:04.0: [1d0f:8061] type 00 class 0x010802 PCIe Root Complex Integrated Endpoint Sep 16 04:23:59.157701 kernel: pci 0000:00:04.0: BAR 0 [mem 0x80114000-0x80117fff] Sep 16 04:23:59.157914 kernel: pci 0000:00:04.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 16 04:23:59.158130 kernel: pci 0000:00:05.0: [1d0f:ec20] type 00 class 0x020000 PCIe Root Complex Integrated Endpoint Sep 16 04:23:59.158318 kernel: pci 0000:00:05.0: BAR 0 [mem 0x80110000-0x80113fff] Sep 16 04:23:59.158599 kernel: pci 0000:00:05.0: BAR 2 [mem 0x80000000-0x800fffff pref] Sep 16 04:23:59.158838 kernel: pci 0000:00:05.0: BAR 4 [mem 0x80100000-0x8010ffff] Sep 16 04:23:59.159071 kernel: pci 0000:00:05.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 16 04:23:59.159266 kernel: pci 0000:00:05.0: BAR 2 [mem 0x80000000-0x800fffff pref]: assigned Sep 16 04:23:59.159449 kernel: pci 0000:00:05.0: BAR 4 [mem 0x80100000-0x8010ffff]: assigned Sep 16 04:23:59.159692 kernel: pci 0000:00:04.0: BAR 0 [mem 0x80110000-0x80113fff]: assigned Sep 16 04:23:59.159880 kernel: pci 0000:00:05.0: BAR 0 [mem 0x80114000-0x80117fff]: assigned Sep 16 04:23:59.160079 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80118000-0x80118fff]: assigned Sep 16 04:23:59.160253 kernel: pci_bus 0000:00: resource 4 [mem 0x80000000-0xffffffff window] Sep 16 04:23:59.160419 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Sep 16 04:23:59.160614 kernel: pci_bus 0000:00: resource 6 [mem 0x400000000000-0x407fffffffff window] Sep 16 04:23:59.160640 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Sep 16 04:23:59.160668 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Sep 16 04:23:59.160686 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Sep 16 04:23:59.160704 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Sep 16 04:23:59.160722 kernel: iommu: Default domain type: Translated Sep 16 04:23:59.160740 kernel: iommu: DMA domain TLB invalidation policy: strict mode Sep 16 04:23:59.160757 kernel: efivars: Registered efivars operations Sep 16 04:23:59.160775 kernel: vgaarb: loaded Sep 16 04:23:59.160794 kernel: clocksource: Switched to clocksource arch_sys_counter 
Sep 16 04:23:59.160811 kernel: VFS: Disk quotas dquot_6.6.0 Sep 16 04:23:59.160834 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 16 04:23:59.160852 kernel: pnp: PnP ACPI init Sep 16 04:23:59.161055 kernel: system 00:00: [mem 0x20000000-0x2fffffff] could not be reserved Sep 16 04:23:59.161081 kernel: pnp: PnP ACPI: found 1 devices Sep 16 04:23:59.161099 kernel: NET: Registered PF_INET protocol family Sep 16 04:23:59.161118 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 16 04:23:59.161136 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Sep 16 04:23:59.161154 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 16 04:23:59.161177 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 16 04:23:59.161195 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Sep 16 04:23:59.161213 kernel: TCP: Hash tables configured (established 32768 bind 32768) Sep 16 04:23:59.161232 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 16 04:23:59.161250 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 16 04:23:59.161267 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 16 04:23:59.161285 kernel: PCI: CLS 0 bytes, default 64 Sep 16 04:23:59.161303 kernel: kvm [1]: HYP mode not available Sep 16 04:23:59.161320 kernel: Initialise system trusted keyrings Sep 16 04:23:59.161343 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Sep 16 04:23:59.161361 kernel: Key type asymmetric registered Sep 16 04:23:59.161379 kernel: Asymmetric key parser 'x509' registered Sep 16 04:23:59.161397 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Sep 16 04:23:59.161414 kernel: io scheduler mq-deadline registered Sep 16 04:23:59.161432 kernel: io scheduler kyber registered Sep 16 04:23:59.161450 kernel: io scheduler bfq registered Sep 16 04:23:59.161664 kernel: pl061_gpio ARMH0061:00: PL061 GPIO chip registered Sep 16 04:23:59.161696 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Sep 16 04:23:59.161715 kernel: ACPI: button: Power Button [PWRB] Sep 16 04:23:59.161733 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input1 Sep 16 04:23:59.161750 kernel: ACPI: button: Sleep Button [SLPB] Sep 16 04:23:59.161768 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 16 04:23:59.161787 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Sep 16 04:23:59.162042 kernel: serial 0000:00:01.0: enabling device (0010 -> 0012) Sep 16 04:23:59.162070 kernel: printk: legacy console [ttyS0] disabled Sep 16 04:23:59.162089 kernel: 0000:00:01.0: ttyS0 at MMIO 0x80118000 (irq = 14, base_baud = 115200) is a 16550A Sep 16 04:23:59.162114 kernel: printk: legacy console [ttyS0] enabled Sep 16 04:23:59.162133 kernel: printk: legacy bootconsole [uart0] disabled Sep 16 04:23:59.162150 kernel: thunder_xcv, ver 1.0 Sep 16 04:23:59.162168 kernel: thunder_bgx, ver 1.0 Sep 16 04:23:59.162186 kernel: nicpf, ver 1.0 Sep 16 04:23:59.162203 kernel: nicvf, ver 1.0 Sep 16 04:23:59.162408 kernel: rtc-efi rtc-efi.0: registered as rtc0 Sep 16 04:23:59.162713 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-16T04:23:58 UTC (1757996638) Sep 16 04:23:59.162751 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 16 04:23:59.162770 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 3 
(0,80000003) counters available Sep 16 04:23:59.162790 kernel: NET: Registered PF_INET6 protocol family Sep 16 04:23:59.162808 kernel: watchdog: NMI not fully supported Sep 16 04:23:59.162826 kernel: Segment Routing with IPv6 Sep 16 04:23:59.162845 kernel: watchdog: Hard watchdog permanently disabled Sep 16 04:23:59.162865 kernel: In-situ OAM (IOAM) with IPv6 Sep 16 04:23:59.162883 kernel: NET: Registered PF_PACKET protocol family Sep 16 04:23:59.162901 kernel: Key type dns_resolver registered Sep 16 04:23:59.162925 kernel: registered taskstats version 1 Sep 16 04:23:59.162943 kernel: Loading compiled-in X.509 certificates Sep 16 04:23:59.162961 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: 99eb88579c3d58869b2224a85ec8efa5647af805' Sep 16 04:23:59.162979 kernel: Demotion targets for Node 0: null Sep 16 04:23:59.162999 kernel: Key type .fscrypt registered Sep 16 04:23:59.163018 kernel: Key type fscrypt-provisioning registered Sep 16 04:23:59.163035 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 16 04:23:59.163053 kernel: ima: Allocated hash algorithm: sha1 Sep 16 04:23:59.163071 kernel: ima: No architecture policies found Sep 16 04:23:59.163093 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Sep 16 04:23:59.163112 kernel: clk: Disabling unused clocks Sep 16 04:23:59.163129 kernel: PM: genpd: Disabling unused power domains Sep 16 04:23:59.163147 kernel: Warning: unable to open an initial console. Sep 16 04:23:59.163165 kernel: Freeing unused kernel memory: 38976K Sep 16 04:23:59.163183 kernel: Run /init as init process Sep 16 04:23:59.163201 kernel: with arguments: Sep 16 04:23:59.163219 kernel: /init Sep 16 04:23:59.163236 kernel: with environment: Sep 16 04:23:59.163254 kernel: HOME=/ Sep 16 04:23:59.163277 kernel: TERM=linux Sep 16 04:23:59.163294 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 16 04:23:59.163315 systemd[1]: Successfully made /usr/ read-only. Sep 16 04:23:59.163339 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 16 04:23:59.163360 systemd[1]: Detected virtualization amazon. Sep 16 04:23:59.163379 systemd[1]: Detected architecture arm64. Sep 16 04:23:59.163397 systemd[1]: Running in initrd. Sep 16 04:23:59.163420 systemd[1]: No hostname configured, using default hostname. Sep 16 04:23:59.163441 systemd[1]: Hostname set to . Sep 16 04:23:59.163460 systemd[1]: Initializing machine ID from VM UUID. Sep 16 04:23:59.163479 systemd[1]: Queued start job for default target initrd.target. Sep 16 04:23:59.163575 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 16 04:23:59.163602 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 16 04:23:59.163624 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 16 04:23:59.163645 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 16 04:23:59.163672 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 16 04:23:59.163694 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... 
Sep 16 04:23:59.163716 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 16 04:23:59.163759 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 16 04:23:59.167001 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 16 04:23:59.167024 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 16 04:23:59.167044 systemd[1]: Reached target paths.target - Path Units. Sep 16 04:23:59.167075 systemd[1]: Reached target slices.target - Slice Units. Sep 16 04:23:59.167095 systemd[1]: Reached target swap.target - Swaps. Sep 16 04:23:59.167114 systemd[1]: Reached target timers.target - Timer Units. Sep 16 04:23:59.167134 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 16 04:23:59.167154 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 16 04:23:59.167174 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 16 04:23:59.167193 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 16 04:23:59.167213 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 16 04:23:59.167236 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 16 04:23:59.167256 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 16 04:23:59.167276 systemd[1]: Reached target sockets.target - Socket Units. Sep 16 04:23:59.167295 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 16 04:23:59.167315 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 16 04:23:59.167334 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 16 04:23:59.167354 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 16 04:23:59.167374 systemd[1]: Starting systemd-fsck-usr.service... Sep 16 04:23:59.167393 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 16 04:23:59.167418 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 16 04:23:59.167437 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 04:23:59.167457 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 16 04:23:59.167477 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 16 04:23:59.167524 systemd[1]: Finished systemd-fsck-usr.service. Sep 16 04:23:59.167547 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 16 04:23:59.167611 systemd-journald[257]: Collecting audit messages is disabled. Sep 16 04:23:59.167654 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 16 04:23:59.167680 kernel: Bridge firewalling registered Sep 16 04:23:59.167715 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:23:59.167736 systemd-journald[257]: Journal started Sep 16 04:23:59.167773 systemd-journald[257]: Runtime Journal (/run/log/journal/ec29882b8284d0d8a36e305c4998c043) is 8M, max 75.3M, 67.3M free. 
Sep 16 04:23:59.123358 systemd-modules-load[258]: Inserted module 'overlay' Sep 16 04:23:59.162935 systemd-modules-load[258]: Inserted module 'br_netfilter' Sep 16 04:23:59.174536 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 16 04:23:59.181338 systemd[1]: Started systemd-journald.service - Journal Service. Sep 16 04:23:59.182128 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 16 04:23:59.194171 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 16 04:23:59.195880 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 16 04:23:59.206608 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 16 04:23:59.222882 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 16 04:23:59.250130 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 16 04:23:59.261041 systemd-tmpfiles[279]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 16 04:23:59.269954 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 16 04:23:59.275395 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 16 04:23:59.290938 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 16 04:23:59.309111 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 16 04:23:59.317855 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 16 04:23:59.365675 dracut-cmdline[299]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=eff5cc3c399cf6fc52e3071751a09276871b099078da6d1b1a498405d04a9313 Sep 16 04:23:59.398226 systemd-resolved[293]: Positive Trust Anchors: Sep 16 04:23:59.398263 systemd-resolved[293]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 16 04:23:59.398325 systemd-resolved[293]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 16 04:23:59.520532 kernel: SCSI subsystem initialized Sep 16 04:23:59.530529 kernel: Loading iSCSI transport class v2.0-870. Sep 16 04:23:59.541540 kernel: iscsi: registered transport (tcp) Sep 16 04:23:59.563125 kernel: iscsi: registered transport (qla4xxx) Sep 16 04:23:59.563199 kernel: QLogic iSCSI HBA Driver Sep 16 04:23:59.594347 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... 
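
The dracut-cmdline hook above echoes the full kernel command line, which mixes bare flags (such as earlycon) with key=value options (root=LABEL=ROOT, verity.usrhash=..., and so on). As a rough illustration of how such a string breaks down, here is a minimal, hypothetical Python sketch (not part of Flatcar or dracut, which have their own parsers) that splits a command line into flags and key/value pairs:

```python
#!/usr/bin/env python3
"""Hypothetical helper: split a kernel command line into flags and key=value options.

Illustrative sketch only; dracut and systemd do their own parsing.
"""

def parse_cmdline(cmdline: str):
    flags, options = [], {}
    for token in cmdline.split():
        if "=" in token:
            # Split at the first '=' so values like 'PARTUUID=...' stay intact.
            key, _, value = token.partition("=")
            options[key] = value
        else:
            flags.append(token)
    return flags, options


if __name__ == "__main__":
    # Shortened excerpt of the command line logged above.
    example = ("BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr "
               "root=LABEL=ROOT console=ttyS0,115200n8 earlycon acpi=force "
               "net.ifnames=0 nvme_core.io_timeout=4294967295")
    flags, options = parse_cmdline(example)
    print("flags:", flags)           # e.g. ['earlycon']
    print("root:", options["root"])  # e.g. 'LABEL=ROOT'
```
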
Sep 16 04:23:59.627408 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 16 04:23:59.639423 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 16 04:23:59.684537 kernel: random: crng init done Sep 16 04:23:59.684917 systemd-resolved[293]: Defaulting to hostname 'linux'. Sep 16 04:23:59.688432 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 16 04:23:59.688739 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 16 04:23:59.746589 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 16 04:23:59.753905 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 16 04:23:59.838588 kernel: raid6: neonx8 gen() 6568 MB/s Sep 16 04:23:59.855547 kernel: raid6: neonx4 gen() 6578 MB/s Sep 16 04:23:59.872549 kernel: raid6: neonx2 gen() 5434 MB/s Sep 16 04:23:59.889553 kernel: raid6: neonx1 gen() 3953 MB/s Sep 16 04:23:59.906555 kernel: raid6: int64x8 gen() 3662 MB/s Sep 16 04:23:59.923550 kernel: raid6: int64x4 gen() 3701 MB/s Sep 16 04:23:59.940542 kernel: raid6: int64x2 gen() 3594 MB/s Sep 16 04:23:59.958513 kernel: raid6: int64x1 gen() 2764 MB/s Sep 16 04:23:59.958588 kernel: raid6: using algorithm neonx4 gen() 6578 MB/s Sep 16 04:23:59.976550 kernel: raid6: .... xor() 4864 MB/s, rmw enabled Sep 16 04:23:59.976621 kernel: raid6: using neon recovery algorithm Sep 16 04:23:59.985294 kernel: xor: measuring software checksum speed Sep 16 04:23:59.985362 kernel: 8regs : 12928 MB/sec Sep 16 04:23:59.986532 kernel: 32regs : 12062 MB/sec Sep 16 04:23:59.988757 kernel: arm64_neon : 8247 MB/sec Sep 16 04:23:59.988795 kernel: xor: using function: 8regs (12928 MB/sec) Sep 16 04:24:00.083545 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 16 04:24:00.095079 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 16 04:24:00.105483 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 16 04:24:00.150580 systemd-udevd[507]: Using default interface naming scheme 'v255'. Sep 16 04:24:00.161253 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 16 04:24:00.173947 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 16 04:24:00.219758 dracut-pre-trigger[516]: rd.md=0: removing MD RAID activation Sep 16 04:24:00.267797 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 16 04:24:00.273410 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 16 04:24:00.409784 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 16 04:24:00.427491 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 16 04:24:00.589426 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Sep 16 04:24:00.590548 kernel: ena 0000:00:05.0: enabling device (0010 -> 0012) Sep 16 04:24:00.602551 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Sep 16 04:24:00.602775 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 16 04:24:00.615955 kernel: nvme nvme0: pci function 0000:00:04.0 Sep 16 04:24:00.616261 kernel: ena 0000:00:05.0: ENA device version: 0.10 Sep 16 04:24:00.616552 kernel: ena 0000:00:05.0: ENA controller version: 0.0.1 implementation version 1 Sep 16 04:24:00.603028 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 16 04:24:00.628118 kernel: nvme nvme0: 2/0/0 default/read/poll queues Sep 16 04:24:00.628387 kernel: ena 0000:00:05.0: Elastic Network Adapter (ENA) found at mem 80114000, mac addr 06:61:b0:0b:91:e9 Sep 16 04:24:00.616307 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 04:24:00.635642 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 16 04:24:00.635689 kernel: GPT:9289727 != 16777215 Sep 16 04:24:00.635714 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 16 04:24:00.624742 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 04:24:00.648618 kernel: GPT:9289727 != 16777215 Sep 16 04:24:00.648653 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 16 04:24:00.648678 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Sep 16 04:24:00.648408 (udev-worker)[564]: Network interface NamePolicy= disabled on kernel command line. Sep 16 04:24:00.662243 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 16 04:24:00.695572 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:24:00.707536 kernel: nvme nvme0: using unchecked data buffer Sep 16 04:24:00.888064 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Amazon Elastic Block Store EFI-SYSTEM. Sep 16 04:24:00.910036 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 16 04:24:00.938438 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. Sep 16 04:24:00.966556 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Amazon Elastic Block Store ROOT. Sep 16 04:24:00.987858 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Amazon Elastic Block Store USR-A. Sep 16 04:24:00.993879 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Amazon Elastic Block Store USR-A. Sep 16 04:24:00.997253 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 16 04:24:01.007808 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 16 04:24:01.010714 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 16 04:24:01.023742 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 16 04:24:01.030969 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 16 04:24:01.050872 disk-uuid[687]: Primary Header is updated. Sep 16 04:24:01.050872 disk-uuid[687]: Secondary Entries is updated. Sep 16 04:24:01.050872 disk-uuid[687]: Secondary Header is updated. Sep 16 04:24:01.066566 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Sep 16 04:24:01.081603 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Sep 16 04:24:01.081309 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 16 04:24:02.083618 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Sep 16 04:24:02.084564 disk-uuid[688]: The operation has completed successfully. Sep 16 04:24:02.286019 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 16 04:24:02.286825 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 16 04:24:02.400073 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 16 04:24:02.439660 sh[953]: Success Sep 16 04:24:02.468203 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Sep 16 04:24:02.468287 kernel: device-mapper: uevent: version 1.0.3 Sep 16 04:24:02.470780 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 16 04:24:02.483554 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Sep 16 04:24:02.584219 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 16 04:24:02.594824 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 16 04:24:02.614793 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 16 04:24:02.641541 kernel: BTRFS: device fsid 782b6948-7aaa-439e-9946-c8fdb4d8f287 devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (976) Sep 16 04:24:02.645275 kernel: BTRFS info (device dm-0): first mount of filesystem 782b6948-7aaa-439e-9946-c8fdb4d8f287 Sep 16 04:24:02.645421 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Sep 16 04:24:02.736133 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 16 04:24:02.736225 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 16 04:24:02.737842 kernel: BTRFS info (device dm-0): enabling free space tree Sep 16 04:24:02.750035 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 16 04:24:02.755371 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 16 04:24:02.761268 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 16 04:24:02.762823 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 16 04:24:02.775615 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 16 04:24:02.834580 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/nvme0n1p6 (259:5) scanned by mount (1003) Sep 16 04:24:02.839716 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem a546938e-7af2-44ea-b88d-218d567c463b Sep 16 04:24:02.839831 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm Sep 16 04:24:02.857335 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Sep 16 04:24:02.857415 kernel: BTRFS info (device nvme0n1p6): enabling free space tree Sep 16 04:24:02.866607 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem a546938e-7af2-44ea-b88d-218d567c463b Sep 16 04:24:02.867643 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 16 04:24:02.879060 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 16 04:24:02.991610 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 16 04:24:03.012783 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 16 04:24:03.080194 systemd-networkd[1147]: lo: Link UP Sep 16 04:24:03.080703 systemd-networkd[1147]: lo: Gained carrier Sep 16 04:24:03.083609 systemd-networkd[1147]: Enumeration completed Sep 16 04:24:03.085811 systemd-networkd[1147]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 16 04:24:03.085820 systemd-networkd[1147]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 16 04:24:03.086729 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 16 04:24:03.093486 systemd[1]: Reached target network.target - Network. 
Sep 16 04:24:03.112410 systemd-networkd[1147]: eth0: Link UP Sep 16 04:24:03.112423 systemd-networkd[1147]: eth0: Gained carrier Sep 16 04:24:03.112446 systemd-networkd[1147]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 16 04:24:03.138625 systemd-networkd[1147]: eth0: DHCPv4 address 172.31.31.172/20, gateway 172.31.16.1 acquired from 172.31.16.1 Sep 16 04:24:03.471017 ignition[1063]: Ignition 2.22.0 Sep 16 04:24:03.471588 ignition[1063]: Stage: fetch-offline Sep 16 04:24:03.472540 ignition[1063]: no configs at "/usr/lib/ignition/base.d" Sep 16 04:24:03.472565 ignition[1063]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 16 04:24:03.474149 ignition[1063]: Ignition finished successfully Sep 16 04:24:03.485166 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 16 04:24:03.490857 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Sep 16 04:24:03.548424 ignition[1158]: Ignition 2.22.0 Sep 16 04:24:03.549077 ignition[1158]: Stage: fetch Sep 16 04:24:03.550838 ignition[1158]: no configs at "/usr/lib/ignition/base.d" Sep 16 04:24:03.550868 ignition[1158]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 16 04:24:03.551428 ignition[1158]: PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 16 04:24:03.569990 ignition[1158]: PUT result: OK Sep 16 04:24:03.575297 ignition[1158]: parsed url from cmdline: "" Sep 16 04:24:03.575315 ignition[1158]: no config URL provided Sep 16 04:24:03.575331 ignition[1158]: reading system config file "/usr/lib/ignition/user.ign" Sep 16 04:24:03.575360 ignition[1158]: no config at "/usr/lib/ignition/user.ign" Sep 16 04:24:03.575398 ignition[1158]: PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 16 04:24:03.589840 ignition[1158]: PUT result: OK Sep 16 04:24:03.590046 ignition[1158]: GET http://169.254.169.254/2019-10-01/user-data: attempt #1 Sep 16 04:24:03.593540 ignition[1158]: GET result: OK Sep 16 04:24:03.598774 ignition[1158]: parsing config with SHA512: 61038878ef1dd6cde5ed49d1b90c1020bfe3f2a612972461b9ad4d88a962bed06b19b22f97c39e572c1d73603b034436103c139b32b115727bc641692ef3b21c Sep 16 04:24:03.614376 unknown[1158]: fetched base config from "system" Sep 16 04:24:03.615136 ignition[1158]: fetch: fetch complete Sep 16 04:24:03.614399 unknown[1158]: fetched base config from "system" Sep 16 04:24:03.615150 ignition[1158]: fetch: fetch passed Sep 16 04:24:03.614413 unknown[1158]: fetched user config from "aws" Sep 16 04:24:03.615264 ignition[1158]: Ignition finished successfully Sep 16 04:24:03.620097 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Sep 16 04:24:03.627458 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 16 04:24:03.696671 ignition[1164]: Ignition 2.22.0 Sep 16 04:24:03.696694 ignition[1164]: Stage: kargs Sep 16 04:24:03.697263 ignition[1164]: no configs at "/usr/lib/ignition/base.d" Sep 16 04:24:03.697289 ignition[1164]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 16 04:24:03.697452 ignition[1164]: PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 16 04:24:03.710806 ignition[1164]: PUT result: OK Sep 16 04:24:03.717254 ignition[1164]: kargs: kargs passed Sep 16 04:24:03.717708 ignition[1164]: Ignition finished successfully Sep 16 04:24:03.724484 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 16 04:24:03.726612 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
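
The Ignition fetch stage above follows the EC2 IMDSv2 pattern: first a PUT to the token endpoint, then a GET for the instance user data with that token attached. A minimal sketch of that request flow in Python is shown below; it is illustrative only (Ignition itself is written in Go) and will only succeed when run on an EC2 instance where 169.254.169.254 is reachable:

```python
import urllib.request

IMDS = "http://169.254.169.254"

# Step 1: obtain an IMDSv2 session token. The TTL header is required.
token_req = urllib.request.Request(
    f"{IMDS}/latest/api/token",
    method="PUT",
    headers={"X-aws-ec2-metadata-token-ttl-seconds": "300"},
)
with urllib.request.urlopen(token_req, timeout=5) as resp:
    token = resp.read().decode()

# Step 2: fetch the instance user data (here, the Ignition config) using the token.
data_req = urllib.request.Request(
    f"{IMDS}/2019-10-01/user-data",
    headers={"X-aws-ec2-metadata-token": token},
)
with urllib.request.urlopen(data_req, timeout=5) as resp:
    user_data = resp.read()

print(f"fetched {len(user_data)} bytes of user data")
```
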
Sep 16 04:24:03.782747 ignition[1170]: Ignition 2.22.0 Sep 16 04:24:03.783289 ignition[1170]: Stage: disks Sep 16 04:24:03.784019 ignition[1170]: no configs at "/usr/lib/ignition/base.d" Sep 16 04:24:03.784046 ignition[1170]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 16 04:24:03.784207 ignition[1170]: PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 16 04:24:03.798825 ignition[1170]: PUT result: OK Sep 16 04:24:03.805878 ignition[1170]: disks: disks passed Sep 16 04:24:03.806249 ignition[1170]: Ignition finished successfully Sep 16 04:24:03.814541 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 16 04:24:03.819798 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 16 04:24:03.828992 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 16 04:24:03.832199 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 16 04:24:03.839604 systemd[1]: Reached target sysinit.target - System Initialization. Sep 16 04:24:03.842169 systemd[1]: Reached target basic.target - Basic System. Sep 16 04:24:03.850740 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 16 04:24:03.910869 systemd-fsck[1179]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 16 04:24:03.918662 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 16 04:24:03.929830 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 16 04:24:04.071552 kernel: EXT4-fs (nvme0n1p9): mounted filesystem a00d22d9-68b1-4a84-acfc-9fae1fca53dd r/w with ordered data mode. Quota mode: none. Sep 16 04:24:04.072962 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 16 04:24:04.073840 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 16 04:24:04.086857 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 16 04:24:04.093104 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 16 04:24:04.105695 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 16 04:24:04.106110 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 16 04:24:04.106164 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 16 04:24:04.150963 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 16 04:24:04.157569 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 16 04:24:04.175553 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/nvme0n1p6 (259:5) scanned by mount (1198) Sep 16 04:24:04.180257 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem a546938e-7af2-44ea-b88d-218d567c463b Sep 16 04:24:04.180682 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm Sep 16 04:24:04.189028 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Sep 16 04:24:04.189121 kernel: BTRFS info (device nvme0n1p6): enabling free space tree Sep 16 04:24:04.192895 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 16 04:24:04.585081 initrd-setup-root[1222]: cut: /sysroot/etc/passwd: No such file or directory Sep 16 04:24:04.596396 initrd-setup-root[1229]: cut: /sysroot/etc/group: No such file or directory Sep 16 04:24:04.607703 initrd-setup-root[1236]: cut: /sysroot/etc/shadow: No such file or directory Sep 16 04:24:04.619204 initrd-setup-root[1243]: cut: /sysroot/etc/gshadow: No such file or directory Sep 16 04:24:04.965949 systemd-networkd[1147]: eth0: Gained IPv6LL Sep 16 04:24:04.982865 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 16 04:24:04.988158 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 16 04:24:05.008874 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 16 04:24:05.031620 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 16 04:24:05.036547 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem a546938e-7af2-44ea-b88d-218d567c463b Sep 16 04:24:05.076557 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 16 04:24:05.089261 ignition[1311]: INFO : Ignition 2.22.0 Sep 16 04:24:05.091725 ignition[1311]: INFO : Stage: mount Sep 16 04:24:05.091725 ignition[1311]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 16 04:24:05.091725 ignition[1311]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 16 04:24:05.091725 ignition[1311]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 16 04:24:05.103169 ignition[1311]: INFO : PUT result: OK Sep 16 04:24:05.113523 ignition[1311]: INFO : mount: mount passed Sep 16 04:24:05.115656 ignition[1311]: INFO : Ignition finished successfully Sep 16 04:24:05.121852 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 16 04:24:05.128329 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 16 04:24:05.155623 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 16 04:24:05.195555 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/nvme0n1p6 (259:5) scanned by mount (1323) Sep 16 04:24:05.200369 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem a546938e-7af2-44ea-b88d-218d567c463b Sep 16 04:24:05.200434 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm Sep 16 04:24:05.207643 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Sep 16 04:24:05.207710 kernel: BTRFS info (device nvme0n1p6): enabling free space tree Sep 16 04:24:05.211719 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 16 04:24:05.268614 ignition[1340]: INFO : Ignition 2.22.0 Sep 16 04:24:05.268614 ignition[1340]: INFO : Stage: files Sep 16 04:24:05.273954 ignition[1340]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 16 04:24:05.273954 ignition[1340]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 16 04:24:05.273954 ignition[1340]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 16 04:24:05.273954 ignition[1340]: INFO : PUT result: OK Sep 16 04:24:05.291612 ignition[1340]: DEBUG : files: compiled without relabeling support, skipping Sep 16 04:24:05.301607 ignition[1340]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 16 04:24:05.301607 ignition[1340]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 16 04:24:05.309370 ignition[1340]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 16 04:24:05.313206 ignition[1340]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 16 04:24:05.313206 ignition[1340]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 16 04:24:05.310734 unknown[1340]: wrote ssh authorized keys file for user: core Sep 16 04:24:05.324087 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Sep 16 04:24:05.329560 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Sep 16 04:24:05.433480 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 16 04:24:05.631903 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Sep 16 04:24:05.631903 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 16 04:24:05.647058 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 16 04:24:05.647058 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 16 04:24:05.647058 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 16 04:24:05.647058 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 16 04:24:05.647058 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 16 04:24:05.647058 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 16 04:24:05.647058 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 16 04:24:05.679143 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 16 04:24:05.684125 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 16 04:24:05.684125 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Sep 16 04:24:05.696153 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Sep 16 04:24:05.696153 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Sep 16 04:24:05.696153 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1 Sep 16 04:24:06.294405 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 16 04:24:08.900278 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Sep 16 04:24:08.900278 ignition[1340]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 16 04:24:08.911000 ignition[1340]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 16 04:24:08.919172 ignition[1340]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 16 04:24:08.919172 ignition[1340]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 16 04:24:08.919172 ignition[1340]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Sep 16 04:24:08.933226 ignition[1340]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Sep 16 04:24:08.933226 ignition[1340]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 16 04:24:08.933226 ignition[1340]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 16 04:24:08.933226 ignition[1340]: INFO : files: files passed Sep 16 04:24:08.933226 ignition[1340]: INFO : Ignition finished successfully Sep 16 04:24:08.955589 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 16 04:24:08.964971 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 16 04:24:08.980215 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 16 04:24:08.998899 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 16 04:24:09.002901 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 16 04:24:09.032557 initrd-setup-root-after-ignition[1370]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 16 04:24:09.032557 initrd-setup-root-after-ignition[1370]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 16 04:24:09.047426 initrd-setup-root-after-ignition[1374]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 16 04:24:09.047309 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 16 04:24:09.051934 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 16 04:24:09.058115 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... 
Sep 16 04:24:09.168116 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 16 04:24:09.168532 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 16 04:24:09.178038 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 16 04:24:09.183972 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 16 04:24:09.187900 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 16 04:24:09.194462 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 16 04:24:09.248833 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 16 04:24:09.251730 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 16 04:24:09.294732 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 16 04:24:09.299637 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 16 04:24:09.306929 systemd[1]: Stopped target timers.target - Timer Units. Sep 16 04:24:09.316150 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 16 04:24:09.316707 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 16 04:24:09.327570 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 16 04:24:09.330807 systemd[1]: Stopped target basic.target - Basic System. Sep 16 04:24:09.339835 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 16 04:24:09.343275 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 16 04:24:09.354377 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 16 04:24:09.358450 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 16 04:24:09.368105 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 16 04:24:09.372242 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 16 04:24:09.379467 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 16 04:24:09.389270 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 16 04:24:09.393181 systemd[1]: Stopped target swap.target - Swaps. Sep 16 04:24:09.401030 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 16 04:24:09.401303 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 16 04:24:09.407781 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 16 04:24:09.417337 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 16 04:24:09.421288 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 16 04:24:09.424729 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 16 04:24:09.428885 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 16 04:24:09.429127 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 16 04:24:09.440276 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 16 04:24:09.440618 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 16 04:24:09.447355 systemd[1]: ignition-files.service: Deactivated successfully. Sep 16 04:24:09.447623 systemd[1]: Stopped ignition-files.service - Ignition (files). 
Sep 16 04:24:09.458941 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 16 04:24:09.466309 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 16 04:24:09.469689 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 16 04:24:09.485761 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 16 04:24:09.490717 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 16 04:24:09.491034 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 16 04:24:09.497190 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 16 04:24:09.497666 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 16 04:24:09.541113 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 16 04:24:09.545379 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 16 04:24:09.567012 ignition[1394]: INFO : Ignition 2.22.0 Sep 16 04:24:09.567012 ignition[1394]: INFO : Stage: umount Sep 16 04:24:09.573957 ignition[1394]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 16 04:24:09.573957 ignition[1394]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 16 04:24:09.573957 ignition[1394]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 16 04:24:09.587401 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 16 04:24:09.592019 ignition[1394]: INFO : PUT result: OK Sep 16 04:24:09.595409 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 16 04:24:09.600702 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 16 04:24:09.601234 ignition[1394]: INFO : umount: umount passed Sep 16 04:24:09.601234 ignition[1394]: INFO : Ignition finished successfully Sep 16 04:24:09.607030 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 16 04:24:09.607205 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 16 04:24:09.614971 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 16 04:24:09.615145 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 16 04:24:09.620869 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 16 04:24:09.620980 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 16 04:24:09.624772 systemd[1]: ignition-fetch.service: Deactivated successfully. Sep 16 04:24:09.624891 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Sep 16 04:24:09.629816 systemd[1]: Stopped target network.target - Network. Sep 16 04:24:09.632454 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 16 04:24:09.632590 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 16 04:24:09.640229 systemd[1]: Stopped target paths.target - Path Units. Sep 16 04:24:09.640554 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 16 04:24:09.658228 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 16 04:24:09.662208 systemd[1]: Stopped target slices.target - Slice Units. Sep 16 04:24:09.670538 systemd[1]: Stopped target sockets.target - Socket Units. Sep 16 04:24:09.673521 systemd[1]: iscsid.socket: Deactivated successfully. Sep 16 04:24:09.673609 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 16 04:24:09.682410 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 16 04:24:09.682528 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. 
Sep 16 04:24:09.685795 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 16 04:24:09.685903 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 16 04:24:09.693781 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 16 04:24:09.693869 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 16 04:24:09.697093 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 16 04:24:09.697181 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 16 04:24:09.703099 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 16 04:24:09.711167 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 16 04:24:09.743578 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 16 04:24:09.743952 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 16 04:24:09.760308 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 16 04:24:09.760764 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 16 04:24:09.760975 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 16 04:24:09.766185 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 16 04:24:09.767732 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 16 04:24:09.772103 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 16 04:24:09.772342 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 16 04:24:09.777170 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 16 04:24:09.799626 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 16 04:24:09.799757 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 16 04:24:09.803973 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 16 04:24:09.804065 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 16 04:24:09.811478 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 16 04:24:09.815000 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 16 04:24:09.841642 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 16 04:24:09.841769 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 16 04:24:09.853335 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 16 04:24:09.859045 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 16 04:24:09.859174 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 16 04:24:09.882843 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 16 04:24:09.883563 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 16 04:24:09.894953 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 16 04:24:09.895049 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 16 04:24:09.899267 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 16 04:24:09.899350 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 16 04:24:09.903135 systemd[1]: dracut-pre-udev.service: Deactivated successfully. 
Sep 16 04:24:09.903251 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 16 04:24:09.908199 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 16 04:24:09.908307 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 16 04:24:09.920673 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 16 04:24:09.920789 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 16 04:24:09.931099 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 16 04:24:09.949239 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 16 04:24:09.949529 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 16 04:24:09.960434 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 16 04:24:09.960721 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 16 04:24:09.966915 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 16 04:24:09.967017 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:24:09.982770 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 16 04:24:09.982901 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 16 04:24:09.982990 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 16 04:24:09.984037 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 16 04:24:09.984305 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 16 04:24:10.008547 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 16 04:24:10.009106 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 16 04:24:10.015610 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 16 04:24:10.024955 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 16 04:24:10.076470 systemd[1]: Switching root. Sep 16 04:24:10.126077 systemd-journald[257]: Journal stopped Sep 16 04:24:12.563695 systemd-journald[257]: Received SIGTERM from PID 1 (systemd). Sep 16 04:24:12.563833 kernel: SELinux: policy capability network_peer_controls=1 Sep 16 04:24:12.563879 kernel: SELinux: policy capability open_perms=1 Sep 16 04:24:12.563910 kernel: SELinux: policy capability extended_socket_class=1 Sep 16 04:24:12.563947 kernel: SELinux: policy capability always_check_network=0 Sep 16 04:24:12.563978 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 16 04:24:12.564007 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 16 04:24:12.564035 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 16 04:24:12.564073 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 16 04:24:12.564103 kernel: SELinux: policy capability userspace_initial_context=0 Sep 16 04:24:12.564134 kernel: audit: type=1403 audit(1757996650.419:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 16 04:24:12.564164 systemd[1]: Successfully loaded SELinux policy in 85.477ms. Sep 16 04:24:12.564216 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 15.425ms. 
Sep 16 04:24:12.564252 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 16 04:24:12.564285 systemd[1]: Detected virtualization amazon. Sep 16 04:24:12.564315 systemd[1]: Detected architecture arm64. Sep 16 04:24:12.564345 systemd[1]: Detected first boot. Sep 16 04:24:12.564380 systemd[1]: Initializing machine ID from VM UUID. Sep 16 04:24:12.564411 zram_generator::config[1439]: No configuration found. Sep 16 04:24:12.564459 kernel: NET: Registered PF_VSOCK protocol family Sep 16 04:24:12.564488 systemd[1]: Populated /etc with preset unit settings. Sep 16 04:24:12.567716 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 16 04:24:12.567769 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 16 04:24:12.567802 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 16 04:24:12.567840 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 16 04:24:12.567881 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 16 04:24:12.567916 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 16 04:24:12.567948 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 16 04:24:12.567983 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 16 04:24:12.568018 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 16 04:24:12.568048 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 16 04:24:12.568086 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 16 04:24:12.568120 systemd[1]: Created slice user.slice - User and Session Slice. Sep 16 04:24:12.568149 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 16 04:24:12.568191 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 16 04:24:12.568222 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 16 04:24:12.568251 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 16 04:24:12.568283 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 16 04:24:12.568314 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 16 04:24:12.568348 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 16 04:24:12.568381 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 16 04:24:12.568414 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 16 04:24:12.568452 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 16 04:24:12.568483 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 16 04:24:12.568556 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 16 04:24:12.568591 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. 
Sep 16 04:24:12.568628 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 16 04:24:12.568665 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 16 04:24:12.568695 systemd[1]: Reached target slices.target - Slice Units. Sep 16 04:24:12.568728 systemd[1]: Reached target swap.target - Swaps. Sep 16 04:24:12.568757 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 16 04:24:12.568797 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 16 04:24:12.568826 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 16 04:24:12.568855 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 16 04:24:12.568883 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 16 04:24:12.568914 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 16 04:24:12.568943 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 16 04:24:12.568976 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 16 04:24:12.569007 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 16 04:24:12.569035 systemd[1]: Mounting media.mount - External Media Directory... Sep 16 04:24:12.569070 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 16 04:24:12.569104 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 16 04:24:12.569136 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 16 04:24:12.569167 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 16 04:24:12.569200 systemd[1]: Reached target machines.target - Containers. Sep 16 04:24:12.569232 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 16 04:24:12.569263 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 16 04:24:12.569294 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 16 04:24:12.569331 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 16 04:24:12.569363 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 16 04:24:12.569395 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 16 04:24:12.569423 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 16 04:24:12.569451 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 16 04:24:12.569482 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 16 04:24:12.571091 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 16 04:24:12.571166 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 16 04:24:12.571212 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 16 04:24:12.571270 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 16 04:24:12.571317 systemd[1]: Stopped systemd-fsck-usr.service. 
Sep 16 04:24:12.571351 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 16 04:24:12.571403 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 16 04:24:12.571674 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 16 04:24:12.571746 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 16 04:24:12.571802 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 16 04:24:12.571867 kernel: loop: module loaded Sep 16 04:24:12.571944 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 16 04:24:12.572006 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 16 04:24:12.572659 systemd[1]: verity-setup.service: Deactivated successfully. Sep 16 04:24:12.572724 systemd[1]: Stopped verity-setup.service. Sep 16 04:24:12.572787 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 16 04:24:12.572859 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 16 04:24:12.572916 systemd[1]: Mounted media.mount - External Media Directory. Sep 16 04:24:12.572976 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 16 04:24:12.573037 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 16 04:24:12.573096 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 16 04:24:12.573155 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 16 04:24:12.573221 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 16 04:24:12.573256 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 16 04:24:12.573326 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 16 04:24:12.573363 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 16 04:24:12.573398 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 16 04:24:12.573428 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 16 04:24:12.573461 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 16 04:24:12.575594 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 16 04:24:12.575672 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 16 04:24:12.575705 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 16 04:24:12.575738 kernel: ACPI: bus type drm_connector registered Sep 16 04:24:12.575770 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 16 04:24:12.575799 kernel: fuse: init (API version 7.41) Sep 16 04:24:12.575827 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 16 04:24:12.575859 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 16 04:24:12.575889 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 16 04:24:12.575918 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 16 04:24:12.575952 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 16 04:24:12.575982 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. 
Sep 16 04:24:12.576011 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 16 04:24:12.576039 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 16 04:24:12.576068 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 16 04:24:12.576100 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 16 04:24:12.576135 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 16 04:24:12.576164 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 16 04:24:12.576196 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 16 04:24:12.576228 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 16 04:24:12.576259 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 16 04:24:12.576290 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 16 04:24:12.576320 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 16 04:24:12.576355 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 16 04:24:12.576384 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 16 04:24:12.576483 systemd-journald[1520]: Collecting audit messages is disabled. Sep 16 04:24:12.576580 systemd-journald[1520]: Journal started Sep 16 04:24:12.576634 systemd-journald[1520]: Runtime Journal (/run/log/journal/ec29882b8284d0d8a36e305c4998c043) is 8M, max 75.3M, 67.3M free. Sep 16 04:24:11.648201 systemd[1]: Queued start job for default target multi-user.target. Sep 16 04:24:11.678360 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6. Sep 16 04:24:11.679861 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 16 04:24:12.584894 systemd[1]: Started systemd-journald.service - Journal Service. Sep 16 04:24:12.608411 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 16 04:24:12.648224 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 16 04:24:12.657556 kernel: loop0: detected capacity change from 0 to 119368 Sep 16 04:24:12.667154 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 16 04:24:12.675264 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 16 04:24:12.689487 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 16 04:24:12.700306 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 16 04:24:12.705130 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 16 04:24:12.717985 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 16 04:24:12.733337 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 16 04:24:12.747271 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 16 04:24:12.780832 systemd-journald[1520]: Time spent on flushing to /var/log/journal/ec29882b8284d0d8a36e305c4998c043 is 123.104ms for 938 entries. 
Sep 16 04:24:12.780832 systemd-journald[1520]: System Journal (/var/log/journal/ec29882b8284d0d8a36e305c4998c043) is 8M, max 195.6M, 187.6M free. Sep 16 04:24:12.922129 kernel: loop1: detected capacity change from 0 to 100632 Sep 16 04:24:12.922519 systemd-journald[1520]: Received client request to flush runtime journal. Sep 16 04:24:12.922641 kernel: loop2: detected capacity change from 0 to 211168 Sep 16 04:24:12.867332 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 16 04:24:12.872744 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 16 04:24:12.931636 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 16 04:24:12.959105 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 16 04:24:12.967083 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 16 04:24:13.064485 systemd-tmpfiles[1590]: ACLs are not supported, ignoring. Sep 16 04:24:13.064559 systemd-tmpfiles[1590]: ACLs are not supported, ignoring. Sep 16 04:24:13.093646 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 16 04:24:13.098243 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 16 04:24:13.139577 kernel: loop3: detected capacity change from 0 to 61264 Sep 16 04:24:13.275807 kernel: loop4: detected capacity change from 0 to 119368 Sep 16 04:24:13.306543 kernel: loop5: detected capacity change from 0 to 100632 Sep 16 04:24:13.337544 kernel: loop6: detected capacity change from 0 to 211168 Sep 16 04:24:13.391550 kernel: loop7: detected capacity change from 0 to 61264 Sep 16 04:24:13.412032 (sd-merge)[1596]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-ami'. Sep 16 04:24:13.419385 (sd-merge)[1596]: Merged extensions into '/usr'. Sep 16 04:24:13.437123 systemd[1]: Reload requested from client PID 1551 ('systemd-sysext') (unit systemd-sysext.service)... Sep 16 04:24:13.437189 systemd[1]: Reloading... Sep 16 04:24:13.626548 zram_generator::config[1622]: No configuration found. Sep 16 04:24:13.696937 ldconfig[1547]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 16 04:24:14.149566 systemd[1]: Reloading finished in 710 ms. Sep 16 04:24:14.192004 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 16 04:24:14.196086 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 16 04:24:14.200384 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 16 04:24:14.224176 systemd[1]: Starting ensure-sysext.service... Sep 16 04:24:14.234922 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 16 04:24:14.244595 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 16 04:24:14.280097 systemd[1]: Reload requested from client PID 1675 ('systemctl') (unit ensure-sysext.service)... Sep 16 04:24:14.280115 systemd[1]: Reloading... Sep 16 04:24:14.318985 systemd-tmpfiles[1676]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 16 04:24:14.319082 systemd-tmpfiles[1676]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 16 04:24:14.319831 systemd-tmpfiles[1676]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. 
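The loop0..loop7 capacity changes and the "(sd-merge) Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-ami'" line are systemd-sysext at work: each extension image is attached as a loop device and overlaid onto /usr ("Merged extensions into '/usr'"), which is why a daemon reload follows. A simplified sketch of the discovery step, with the search directories abbreviated (see systemd-sysext(8) for the full list):

    from pathlib import Path

    SEARCH_DIRS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]

    def discover_extensions() -> list[str]:
        # systemd-sysext picks up *.raw images (and plain directory trees);
        # the extension name is the file name without the .raw suffix
        names = []
        for d in map(Path, SEARCH_DIRS):
            if not d.is_dir():
                continue
            for entry in sorted(d.iterdir()):
                if entry.suffix == ".raw" or entry.is_dir():
                    names.append(entry.name.removesuffix(".raw"))
        return names

    if __name__ == "__main__":
        # on this host the list would include 'kubernetes', via the symlink
        # Ignition wrote into /etc/extensions earlier in the boot
        print(discover_extensions())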
Sep 16 04:24:14.320369 systemd-tmpfiles[1676]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 16 04:24:14.326036 systemd-tmpfiles[1676]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 16 04:24:14.330212 systemd-tmpfiles[1676]: ACLs are not supported, ignoring. Sep 16 04:24:14.332210 systemd-tmpfiles[1676]: ACLs are not supported, ignoring. Sep 16 04:24:14.359947 systemd-tmpfiles[1676]: Detected autofs mount point /boot during canonicalization of boot. Sep 16 04:24:14.360186 systemd-tmpfiles[1676]: Skipping /boot Sep 16 04:24:14.379808 systemd-udevd[1677]: Using default interface naming scheme 'v255'. Sep 16 04:24:14.407038 systemd-tmpfiles[1676]: Detected autofs mount point /boot during canonicalization of boot. Sep 16 04:24:14.408798 systemd-tmpfiles[1676]: Skipping /boot Sep 16 04:24:14.528575 zram_generator::config[1725]: No configuration found. Sep 16 04:24:15.030895 (udev-worker)[1710]: Network interface NamePolicy= disabled on kernel command line. Sep 16 04:24:15.297230 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 16 04:24:15.298301 systemd[1]: Reloading finished in 1016 ms. Sep 16 04:24:15.313946 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 16 04:24:15.339942 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 16 04:24:15.408774 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 16 04:24:15.418559 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 16 04:24:15.427980 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 16 04:24:15.439993 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 16 04:24:15.449014 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 16 04:24:15.530946 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 16 04:24:15.573281 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 16 04:24:15.581003 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 16 04:24:15.588232 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 16 04:24:15.599769 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 16 04:24:15.603097 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 16 04:24:15.603379 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 16 04:24:15.608851 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 16 04:24:15.628230 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 16 04:24:15.639187 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 16 04:24:15.642192 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Sep 16 04:24:15.642560 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 16 04:24:15.642957 systemd[1]: Reached target time-set.target - System Time Set. Sep 16 04:24:15.662623 systemd[1]: Finished ensure-sysext.service. Sep 16 04:24:15.697881 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 16 04:24:15.701662 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 16 04:24:15.705928 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 16 04:24:15.753856 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 16 04:24:15.779861 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 16 04:24:15.781730 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 16 04:24:15.809433 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 16 04:24:15.821422 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 16 04:24:15.825598 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 16 04:24:15.834744 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 16 04:24:15.850383 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 16 04:24:15.852081 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 16 04:24:15.857094 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 16 04:24:15.865413 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 16 04:24:15.866688 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 16 04:24:15.871465 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 16 04:24:15.894737 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 04:24:15.945514 augenrules[1907]: No rules Sep 16 04:24:15.947218 systemd[1]: audit-rules.service: Deactivated successfully. Sep 16 04:24:15.959401 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 16 04:24:16.124861 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. Sep 16 04:24:16.131635 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 16 04:24:16.169653 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:24:16.187669 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 16 04:24:16.195772 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 16 04:24:16.330206 systemd-networkd[1820]: lo: Link UP Sep 16 04:24:16.330761 systemd-networkd[1820]: lo: Gained carrier Sep 16 04:24:16.334215 systemd-networkd[1820]: Enumeration completed Sep 16 04:24:16.334668 systemd[1]: Started systemd-networkd.service - Network Configuration. 
Sep 16 04:24:16.336544 systemd-networkd[1820]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 16 04:24:16.336750 systemd-networkd[1820]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 16 04:24:16.344004 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 16 04:24:16.351716 systemd-networkd[1820]: eth0: Link UP Sep 16 04:24:16.352253 systemd-networkd[1820]: eth0: Gained carrier Sep 16 04:24:16.352298 systemd-networkd[1820]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 16 04:24:16.353128 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 16 04:24:16.362321 systemd-resolved[1821]: Positive Trust Anchors: Sep 16 04:24:16.362383 systemd-resolved[1821]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 16 04:24:16.362486 systemd-resolved[1821]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 16 04:24:16.374657 systemd-networkd[1820]: eth0: DHCPv4 address 172.31.31.172/20, gateway 172.31.16.1 acquired from 172.31.16.1 Sep 16 04:24:16.389339 systemd-resolved[1821]: Defaulting to hostname 'linux'. Sep 16 04:24:16.396722 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 16 04:24:16.400264 systemd[1]: Reached target network.target - Network. Sep 16 04:24:16.402948 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 16 04:24:16.406151 systemd[1]: Reached target sysinit.target - System Initialization. Sep 16 04:24:16.409020 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 16 04:24:16.412263 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 16 04:24:16.415933 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 16 04:24:16.418953 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 16 04:24:16.422155 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 16 04:24:16.425437 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 16 04:24:16.425520 systemd[1]: Reached target paths.target - Path Units. Sep 16 04:24:16.427827 systemd[1]: Reached target timers.target - Timer Units. Sep 16 04:24:16.432235 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 16 04:24:16.440030 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 16 04:24:16.447434 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). 
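A small sanity check on the DHCPv4 lease systemd-networkd logs for eth0 (172.31.31.172/20 with gateway 172.31.16.1, handed out by that same address, which on EC2 is the VPC subnet router):

    import ipaddress

    iface = ipaddress.ip_interface("172.31.31.172/20")
    gw = ipaddress.ip_address("172.31.16.1")

    print(iface.network)                # 172.31.16.0/20
    print(gw in iface.network)          # True: the gateway sits inside the /20
    print(iface.network.num_addresses)  # 4096 addresses in the block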
Sep 16 04:24:16.451185 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 16 04:24:16.454363 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 16 04:24:16.466554 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 16 04:24:16.469780 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 16 04:24:16.474484 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 16 04:24:16.478166 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 16 04:24:16.483443 systemd[1]: Reached target sockets.target - Socket Units. Sep 16 04:24:16.486692 systemd[1]: Reached target basic.target - Basic System. Sep 16 04:24:16.489282 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 16 04:24:16.489358 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 16 04:24:16.492610 systemd[1]: Starting containerd.service - containerd container runtime... Sep 16 04:24:16.500895 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 16 04:24:16.512915 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 16 04:24:16.526955 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 16 04:24:16.537052 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 16 04:24:16.547550 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 16 04:24:16.552101 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 16 04:24:16.556664 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 16 04:24:16.573203 systemd[1]: Started ntpd.service - Network Time Service. Sep 16 04:24:16.591920 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 16 04:24:16.602124 systemd[1]: Starting setup-oem.service - Setup OEM... Sep 16 04:24:16.618973 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 16 04:24:16.630844 jq[1960]: false Sep 16 04:24:16.631730 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 16 04:24:16.651013 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 16 04:24:16.655801 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 16 04:24:16.656846 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 16 04:24:16.666045 systemd[1]: Starting update-engine.service - Update Engine... Sep 16 04:24:16.684537 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 16 04:24:16.702090 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 16 04:24:16.707905 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 16 04:24:16.709637 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. 
Sep 16 04:24:16.744364 jq[1975]: true Sep 16 04:24:16.765544 extend-filesystems[1961]: Found /dev/nvme0n1p6 Sep 16 04:24:16.765544 extend-filesystems[1961]: Found /dev/nvme0n1p9 Sep 16 04:24:16.785961 extend-filesystems[1961]: Checking size of /dev/nvme0n1p9 Sep 16 04:24:16.801851 tar[1978]: linux-arm64/LICENSE Sep 16 04:24:16.801851 tar[1978]: linux-arm64/helm Sep 16 04:24:16.782809 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 16 04:24:16.808439 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 16 04:24:16.811640 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 16 04:24:16.873590 extend-filesystems[1961]: Resized partition /dev/nvme0n1p9 Sep 16 04:24:16.876351 systemd[1]: motdgen.service: Deactivated successfully. Sep 16 04:24:16.877948 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 16 04:24:16.901315 extend-filesystems[2004]: resize2fs 1.47.3 (8-Jul-2025) Sep 16 04:24:16.942602 kernel: EXT4-fs (nvme0n1p9): resizing filesystem from 553472 to 1489915 blocks Sep 16 04:24:16.956492 jq[1985]: true Sep 16 04:24:16.966053 dbus-daemon[1958]: [system] SELinux support is enabled Sep 16 04:24:16.966644 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 16 04:24:16.976679 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 16 04:24:16.976760 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 16 04:24:16.982996 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 16 04:24:16.983050 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 16 04:24:16.993767 ntpd[1963]: ntpd 4.2.8p18@1.4062-o Tue Sep 16 02:36:29 UTC 2025 (1): Starting Sep 16 04:24:17.006894 ntpd[1963]: 16 Sep 04:24:16 ntpd[1963]: ntpd 4.2.8p18@1.4062-o Tue Sep 16 02:36:29 UTC 2025 (1): Starting Sep 16 04:24:17.006894 ntpd[1963]: 16 Sep 04:24:16 ntpd[1963]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Sep 16 04:24:17.006894 ntpd[1963]: 16 Sep 04:24:16 ntpd[1963]: ---------------------------------------------------- Sep 16 04:24:17.006894 ntpd[1963]: 16 Sep 04:24:16 ntpd[1963]: ntp-4 is maintained by Network Time Foundation, Sep 16 04:24:17.006894 ntpd[1963]: 16 Sep 04:24:16 ntpd[1963]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Sep 16 04:24:17.006894 ntpd[1963]: 16 Sep 04:24:16 ntpd[1963]: corporation. 
Support and training for ntp-4 are Sep 16 04:24:17.006894 ntpd[1963]: 16 Sep 04:24:16 ntpd[1963]: available at https://www.nwtime.org/support Sep 16 04:24:17.006894 ntpd[1963]: 16 Sep 04:24:16 ntpd[1963]: ---------------------------------------------------- Sep 16 04:24:17.006894 ntpd[1963]: 16 Sep 04:24:16 ntpd[1963]: proto: precision = 0.096 usec (-23) Sep 16 04:24:17.006894 ntpd[1963]: 16 Sep 04:24:16 ntpd[1963]: basedate set to 2025-09-04 Sep 16 04:24:17.006894 ntpd[1963]: 16 Sep 04:24:16 ntpd[1963]: gps base set to 2025-09-07 (week 2383) Sep 16 04:24:17.006894 ntpd[1963]: 16 Sep 04:24:16 ntpd[1963]: Listen and drop on 0 v6wildcard [::]:123 Sep 16 04:24:17.006894 ntpd[1963]: 16 Sep 04:24:16 ntpd[1963]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Sep 16 04:24:17.006894 ntpd[1963]: 16 Sep 04:24:16 ntpd[1963]: Listen normally on 2 lo 127.0.0.1:123 Sep 16 04:24:17.006894 ntpd[1963]: 16 Sep 04:24:16 ntpd[1963]: Listen normally on 3 eth0 172.31.31.172:123 Sep 16 04:24:17.006894 ntpd[1963]: 16 Sep 04:24:16 ntpd[1963]: Listen normally on 4 lo [::1]:123 Sep 16 04:24:17.006894 ntpd[1963]: 16 Sep 04:24:16 ntpd[1963]: bind(21) AF_INET6 [fe80::461:b0ff:fe0b:91e9%2]:123 flags 0x811 failed: Cannot assign requested address Sep 16 04:24:17.006894 ntpd[1963]: 16 Sep 04:24:16 ntpd[1963]: unable to create socket on eth0 (5) for [fe80::461:b0ff:fe0b:91e9%2]:123 Sep 16 04:24:16.993900 ntpd[1963]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Sep 16 04:24:17.022805 update_engine[1973]: I20250916 04:24:17.009078 1973 main.cc:92] Flatcar Update Engine starting Sep 16 04:24:17.012222 systemd-coredump[2017]: Process 1963 (ntpd) of user 0 terminated abnormally with signal 11/SEGV, processing... Sep 16 04:24:16.993921 ntpd[1963]: ---------------------------------------------------- Sep 16 04:24:17.027269 (ntainerd)[2008]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 16 04:24:16.993939 ntpd[1963]: ntp-4 is maintained by Network Time Foundation, Sep 16 04:24:17.027536 systemd[1]: Created slice system-systemd\x2dcoredump.slice - Slice /system/systemd-coredump. Sep 16 04:24:16.993956 ntpd[1963]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Sep 16 04:24:17.039032 systemd[1]: Started systemd-coredump@0-2017-0.service - Process Core Dump (PID 2017/UID 0). Sep 16 04:24:16.993973 ntpd[1963]: corporation. Support and training for ntp-4 are Sep 16 04:24:17.050938 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Sep 16 04:24:16.993989 ntpd[1963]: available at https://www.nwtime.org/support Sep 16 04:24:16.994006 ntpd[1963]: ---------------------------------------------------- Sep 16 04:24:16.998935 ntpd[1963]: proto: precision = 0.096 usec (-23) Sep 16 04:24:17.062960 systemd[1]: Started update-engine.service - Update Engine. 
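ntpd (PID 1963) dies with SIGSEGV shortly after binding its sockets, and systemd-coredump picks the crash up via systemd-coredump@0-2017-0.service. Once the system is up, the stored dump can be examined with coredumpctl; a small wrapper, assuming coredumpctl is available on the booted image:

    import subprocess

    def show_coredump(match: str = "ntpd") -> None:
        # list captured crashes for the command name, then print the details
        # (signal, PID, timestamp, backtrace metadata) of the matching entries
        subprocess.run(["coredumpctl", "list", match], check=False)
        subprocess.run(["coredumpctl", "info", match], check=False)

    if __name__ == "__main__":
        show_coredump()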
Sep 16 04:24:17.000133 ntpd[1963]: basedate set to 2025-09-04 Sep 16 04:24:17.000170 ntpd[1963]: gps base set to 2025-09-07 (week 2383) Sep 16 04:24:17.000389 ntpd[1963]: Listen and drop on 0 v6wildcard [::]:123 Sep 16 04:24:17.000447 ntpd[1963]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Sep 16 04:24:17.000840 ntpd[1963]: Listen normally on 2 lo 127.0.0.1:123 Sep 16 04:24:17.000895 ntpd[1963]: Listen normally on 3 eth0 172.31.31.172:123 Sep 16 04:24:17.000942 ntpd[1963]: Listen normally on 4 lo [::1]:123 Sep 16 04:24:17.000996 ntpd[1963]: bind(21) AF_INET6 [fe80::461:b0ff:fe0b:91e9%2]:123 flags 0x811 failed: Cannot assign requested address Sep 16 04:24:17.001034 ntpd[1963]: unable to create socket on eth0 (5) for [fe80::461:b0ff:fe0b:91e9%2]:123 Sep 16 04:24:17.021368 dbus-daemon[1958]: [system] Activating via systemd: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.0' (uid=244 pid=1820 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Sep 16 04:24:17.080549 coreos-metadata[1957]: Sep 16 04:24:17.074 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Sep 16 04:24:17.081018 update_engine[1973]: I20250916 04:24:17.074388 1973 update_check_scheduler.cc:74] Next update check in 5m47s Sep 16 04:24:17.088441 coreos-metadata[1957]: Sep 16 04:24:17.087 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-id: Attempt #1 Sep 16 04:24:17.095042 coreos-metadata[1957]: Sep 16 04:24:17.092 INFO Fetch successful Sep 16 04:24:17.095042 coreos-metadata[1957]: Sep 16 04:24:17.092 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-type: Attempt #1 Sep 16 04:24:17.101604 coreos-metadata[1957]: Sep 16 04:24:17.097 INFO Fetch successful Sep 16 04:24:17.101604 coreos-metadata[1957]: Sep 16 04:24:17.097 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/local-ipv4: Attempt #1 Sep 16 04:24:17.106249 coreos-metadata[1957]: Sep 16 04:24:17.106 INFO Fetch successful Sep 16 04:24:17.106249 coreos-metadata[1957]: Sep 16 04:24:17.106 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-ipv4: Attempt #1 Sep 16 04:24:17.108596 coreos-metadata[1957]: Sep 16 04:24:17.108 INFO Fetch successful Sep 16 04:24:17.108596 coreos-metadata[1957]: Sep 16 04:24:17.108 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/ipv6: Attempt #1 Sep 16 04:24:17.113994 coreos-metadata[1957]: Sep 16 04:24:17.113 INFO Fetch failed with 404: resource not found Sep 16 04:24:17.113994 coreos-metadata[1957]: Sep 16 04:24:17.113 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone: Attempt #1 Sep 16 04:24:17.116848 coreos-metadata[1957]: Sep 16 04:24:17.116 INFO Fetch successful Sep 16 04:24:17.116848 coreos-metadata[1957]: Sep 16 04:24:17.116 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone-id: Attempt #1 Sep 16 04:24:17.121596 kernel: EXT4-fs (nvme0n1p9): resized filesystem to 1489915 Sep 16 04:24:17.135284 coreos-metadata[1957]: Sep 16 04:24:17.124 INFO Fetch successful Sep 16 04:24:17.135284 coreos-metadata[1957]: Sep 16 04:24:17.124 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/hostname: Attempt #1 Sep 16 04:24:17.135284 coreos-metadata[1957]: Sep 16 04:24:17.125 INFO Fetch successful Sep 16 04:24:17.135284 coreos-metadata[1957]: Sep 16 04:24:17.125 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-hostname: Attempt #1 Sep 16 04:24:17.135284 coreos-metadata[1957]: Sep 16 04:24:17.133 INFO Fetch 
successful Sep 16 04:24:17.135284 coreos-metadata[1957]: Sep 16 04:24:17.133 INFO Fetching http://169.254.169.254/2021-01-03/dynamic/instance-identity/document: Attempt #1 Sep 16 04:24:17.139406 coreos-metadata[1957]: Sep 16 04:24:17.135 INFO Fetch successful Sep 16 04:24:17.140538 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 16 04:24:17.141157 extend-filesystems[2004]: Filesystem at /dev/nvme0n1p9 is mounted on /; on-line resizing required Sep 16 04:24:17.141157 extend-filesystems[2004]: old_desc_blocks = 1, new_desc_blocks = 1 Sep 16 04:24:17.141157 extend-filesystems[2004]: The filesystem on /dev/nvme0n1p9 is now 1489915 (4k) blocks long. Sep 16 04:24:17.177425 extend-filesystems[1961]: Resized filesystem in /dev/nvme0n1p9 Sep 16 04:24:17.148957 systemd[1]: Finished setup-oem.service - Setup OEM. Sep 16 04:24:17.162043 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 16 04:24:17.165047 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 16 04:24:17.281957 bash[2056]: Updated "/home/core/.ssh/authorized_keys" Sep 16 04:24:17.320665 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 16 04:24:17.337939 systemd[1]: Starting sshkeys.service... Sep 16 04:24:17.459907 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 16 04:24:17.469516 systemd-logind[1971]: Watching system buttons on /dev/input/event0 (Power Button) Sep 16 04:24:17.469578 systemd-logind[1971]: Watching system buttons on /dev/input/event1 (Sleep Button) Sep 16 04:24:17.471465 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 16 04:24:17.475829 systemd-logind[1971]: New seat seat0. Sep 16 04:24:17.496216 systemd[1]: Started systemd-logind.service - User Login Management. Sep 16 04:24:17.514612 systemd-networkd[1820]: eth0: Gained IPv6LL Sep 16 04:24:17.525771 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 16 04:24:17.533326 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 16 04:24:17.540664 systemd[1]: Reached target network-online.target - Network is Online. Sep 16 04:24:17.566094 systemd[1]: Started amazon-ssm-agent.service - amazon-ssm-agent. Sep 16 04:24:17.574136 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:24:17.584116 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 16 04:24:17.588780 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 16 04:24:17.917599 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
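
The coreos-metadata entries above follow the standard EC2 IMDSv2 flow: a PUT to http://169.254.169.254/latest/api/token to obtain a session token, then GETs against the 2021-01-03 meta-data paths (including the 404 for meta-data/ipv6 on an instance without an IPv6 address). A minimal Go sketch of that flow under those assumptions; the token headers and paths are the standard IMDSv2 ones taken from the log, and this is illustrative, not the agent's own code:

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"strings"
	"time"
)

const imds = "http://169.254.169.254"

// fetchToken performs the IMDSv2 PUT the log shows as
// "Putting http://169.254.169.254/latest/api/token".
func fetchToken(c *http.Client) (string, error) {
	req, err := http.NewRequest(http.MethodPut, imds+"/latest/api/token", nil)
	if err != nil {
		return "", err
	}
	req.Header.Set("X-aws-ec2-metadata-token-ttl-seconds", "21600")
	resp, err := c.Do(req)
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	b, err := io.ReadAll(resp.Body)
	return strings.TrimSpace(string(b)), err
}

// fetchMeta reads one meta-data path, e.g. "meta-data/instance-id",
// matching the GETs logged above.
func fetchMeta(c *http.Client, token, path string) (string, error) {
	req, err := http.NewRequest(http.MethodGet, imds+"/2021-01-03/"+path, nil)
	if err != nil {
		return "", err
	}
	req.Header.Set("X-aws-ec2-metadata-token", token)
	resp, err := c.Do(req)
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		// e.g. the "Fetch failed with 404" seen for meta-data/ipv6
		return "", fmt.Errorf("%s: %s", path, resp.Status)
	}
	b, err := io.ReadAll(resp.Body)
	return strings.TrimSpace(string(b)), err
}

func main() {
	c := &http.Client{Timeout: 5 * time.Second}
	token, err := fetchToken(c)
	if err != nil {
		panic(err)
	}
	for _, p := range []string{"meta-data/instance-id", "meta-data/instance-type", "meta-data/local-ipv4"} {
		v, err := fetchMeta(c, token, p)
		fmt.Println(p, v, err)
	}
}
```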
Sep 16 04:24:18.052806 containerd[2008]: time="2025-09-16T04:24:18Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 16 04:24:18.063826 containerd[2008]: time="2025-09-16T04:24:18.061070017Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 16 04:24:18.105282 coreos-metadata[2086]: Sep 16 04:24:18.104 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Sep 16 04:24:18.106371 coreos-metadata[2086]: Sep 16 04:24:18.106 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys: Attempt #1 Sep 16 04:24:18.107591 coreos-metadata[2086]: Sep 16 04:24:18.107 INFO Fetch successful Sep 16 04:24:18.110022 coreos-metadata[2086]: Sep 16 04:24:18.109 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys/0/openssh-key: Attempt #1 Sep 16 04:24:18.114515 coreos-metadata[2086]: Sep 16 04:24:18.110 INFO Fetch successful Sep 16 04:24:18.122199 unknown[2086]: wrote ssh authorized keys file for user: core Sep 16 04:24:18.139785 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Sep 16 04:24:18.150817 dbus-daemon[1958]: [system] Successfully activated service 'org.freedesktop.hostname1' Sep 16 04:24:18.161324 dbus-daemon[1958]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.5' (uid=0 pid=2020 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Sep 16 04:24:18.176356 systemd[1]: Starting polkit.service - Authorization Manager... Sep 16 04:24:18.220074 amazon-ssm-agent[2103]: Initializing new seelog logger Sep 16 04:24:18.222654 containerd[2008]: time="2025-09-16T04:24:18.221914477Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="17.136µs" Sep 16 04:24:18.222654 containerd[2008]: time="2025-09-16T04:24:18.221970937Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 16 04:24:18.222654 containerd[2008]: time="2025-09-16T04:24:18.222008629Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 16 04:24:18.222654 containerd[2008]: time="2025-09-16T04:24:18.222345685Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 16 04:24:18.222654 containerd[2008]: time="2025-09-16T04:24:18.222383569Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 16 04:24:18.223535 containerd[2008]: time="2025-09-16T04:24:18.223308637Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 16 04:24:18.227057 containerd[2008]: time="2025-09-16T04:24:18.223761145Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 16 04:24:18.227057 containerd[2008]: time="2025-09-16T04:24:18.223807297Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 16 04:24:18.227057 containerd[2008]: time="2025-09-16T04:24:18.224294773Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: 
skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 16 04:24:18.227057 containerd[2008]: time="2025-09-16T04:24:18.224336605Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 16 04:24:18.227057 containerd[2008]: time="2025-09-16T04:24:18.224366317Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 16 04:24:18.227057 containerd[2008]: time="2025-09-16T04:24:18.224390977Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 16 04:24:18.227403 amazon-ssm-agent[2103]: New Seelog Logger Creation Complete Sep 16 04:24:18.227403 amazon-ssm-agent[2103]: 2025/09/16 04:24:18 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 16 04:24:18.227403 amazon-ssm-agent[2103]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 16 04:24:18.230943 amazon-ssm-agent[2103]: 2025/09/16 04:24:18 processing appconfig overrides Sep 16 04:24:18.230943 amazon-ssm-agent[2103]: 2025/09/16 04:24:18 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 16 04:24:18.230943 amazon-ssm-agent[2103]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 16 04:24:18.230943 amazon-ssm-agent[2103]: 2025/09/16 04:24:18 processing appconfig overrides Sep 16 04:24:18.232526 containerd[2008]: time="2025-09-16T04:24:18.232111634Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 16 04:24:18.234847 amazon-ssm-agent[2103]: 2025/09/16 04:24:18 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 16 04:24:18.234847 amazon-ssm-agent[2103]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 16 04:24:18.234847 amazon-ssm-agent[2103]: 2025/09/16 04:24:18 processing appconfig overrides Sep 16 04:24:18.235186 containerd[2008]: time="2025-09-16T04:24:18.235106582Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 16 04:24:18.237966 amazon-ssm-agent[2103]: 2025-09-16 04:24:18.2301 INFO Proxy environment variables: Sep 16 04:24:18.238086 containerd[2008]: time="2025-09-16T04:24:18.237434774Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 16 04:24:18.240891 containerd[2008]: time="2025-09-16T04:24:18.240441770Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 16 04:24:18.240891 containerd[2008]: time="2025-09-16T04:24:18.240630974Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 16 04:24:18.245717 systemd-coredump[2019]: Process 1963 (ntpd) of user 0 dumped core. Module libnss_usrfiles.so.2 without build-id. Module libgcc_s.so.1 without build-id. Module libc.so.6 without build-id. Module libcrypto.so.3 without build-id. Module libm.so.6 without build-id. Module libcap.so.2 without build-id. Module ntpd without build-id. 
Stack trace of thread 1963: #0 0x0000aaaace540b5c n/a (ntpd + 0x60b5c) #1 0x0000aaaace4efe60 n/a (ntpd + 0xfe60) #2 0x0000aaaace4f0240 n/a (ntpd + 0x10240) #3 0x0000aaaace4ebe14 n/a (ntpd + 0xbe14) #4 0x0000aaaace4ed3ec n/a (ntpd + 0xd3ec) #5 0x0000aaaace4f5a38 n/a (ntpd + 0x15a38) #6 0x0000aaaace4e738c n/a (ntpd + 0x738c) #7 0x0000ffffa67f2034 n/a (libc.so.6 + 0x22034) #8 0x0000ffffa67f2118 __libc_start_main (libc.so.6 + 0x22118) #9 0x0000aaaace4e73f0 n/a (ntpd + 0x73f0) ELF object binary architecture: AARCH64 Sep 16 04:24:18.252323 containerd[2008]: time="2025-09-16T04:24:18.244867394Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 16 04:24:18.273436 containerd[2008]: time="2025-09-16T04:24:18.252524678Z" level=info msg="metadata content store policy set" policy=shared Sep 16 04:24:18.265619 systemd[1]: ntpd.service: Main process exited, code=dumped, status=11/SEGV Sep 16 04:24:18.275071 amazon-ssm-agent[2103]: 2025/09/16 04:24:18 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 16 04:24:18.275071 amazon-ssm-agent[2103]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 16 04:24:18.275071 amazon-ssm-agent[2103]: 2025/09/16 04:24:18 processing appconfig overrides Sep 16 04:24:18.275840 containerd[2008]: time="2025-09-16T04:24:18.273798098Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 16 04:24:18.275840 containerd[2008]: time="2025-09-16T04:24:18.273961094Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 16 04:24:18.275840 containerd[2008]: time="2025-09-16T04:24:18.274171910Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 16 04:24:18.275840 containerd[2008]: time="2025-09-16T04:24:18.274216586Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 16 04:24:18.275840 containerd[2008]: time="2025-09-16T04:24:18.274277174Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 16 04:24:18.265928 systemd[1]: ntpd.service: Failed with result 'core-dump'. 
Sep 16 04:24:18.290263 containerd[2008]: time="2025-09-16T04:24:18.274308710Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 16 04:24:18.290263 containerd[2008]: time="2025-09-16T04:24:18.276728654Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 16 04:24:18.290263 containerd[2008]: time="2025-09-16T04:24:18.278352686Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 16 04:24:18.290263 containerd[2008]: time="2025-09-16T04:24:18.278547674Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 16 04:24:18.290263 containerd[2008]: time="2025-09-16T04:24:18.279104486Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 16 04:24:18.290263 containerd[2008]: time="2025-09-16T04:24:18.279142826Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 16 04:24:18.290263 containerd[2008]: time="2025-09-16T04:24:18.279187898Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 16 04:24:18.290263 containerd[2008]: time="2025-09-16T04:24:18.283634810Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 16 04:24:18.290263 containerd[2008]: time="2025-09-16T04:24:18.283737638Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 16 04:24:18.290263 containerd[2008]: time="2025-09-16T04:24:18.283824986Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 16 04:24:18.290263 containerd[2008]: time="2025-09-16T04:24:18.284760350Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 16 04:24:18.290263 containerd[2008]: time="2025-09-16T04:24:18.284871410Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 16 04:24:18.290263 containerd[2008]: time="2025-09-16T04:24:18.285562058Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 16 04:24:18.290263 containerd[2008]: time="2025-09-16T04:24:18.286128254Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 16 04:24:18.291748 update-ssh-keys[2168]: Updated "/home/core/.ssh/authorized_keys" Sep 16 04:24:18.303997 containerd[2008]: time="2025-09-16T04:24:18.292531694Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 16 04:24:18.303997 containerd[2008]: time="2025-09-16T04:24:18.292618730Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 16 04:24:18.303997 containerd[2008]: time="2025-09-16T04:24:18.292659230Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 16 04:24:18.303997 containerd[2008]: time="2025-09-16T04:24:18.292694198Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 16 04:24:18.303997 containerd[2008]: time="2025-09-16T04:24:18.293125094Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 16 04:24:18.303997 containerd[2008]: 
time="2025-09-16T04:24:18.293188094Z" level=info msg="Start snapshots syncer" Sep 16 04:24:18.303997 containerd[2008]: time="2025-09-16T04:24:18.293258414Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 16 04:24:18.304465 containerd[2008]: time="2025-09-16T04:24:18.293941574Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 16 04:24:18.304465 containerd[2008]: time="2025-09-16T04:24:18.294053918Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 16 04:24:18.311879 containerd[2008]: time="2025-09-16T04:24:18.294249710Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 16 04:24:18.305488 systemd[1]: systemd-coredump@0-2017-0.service: Deactivated successfully. Sep 16 04:24:18.319627 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). 
Sep 16 04:24:18.324884 containerd[2008]: time="2025-09-16T04:24:18.323790650Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 16 04:24:18.324884 containerd[2008]: time="2025-09-16T04:24:18.323890238Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 16 04:24:18.324884 containerd[2008]: time="2025-09-16T04:24:18.323939426Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 16 04:24:18.324884 containerd[2008]: time="2025-09-16T04:24:18.323972054Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 16 04:24:18.327574 containerd[2008]: time="2025-09-16T04:24:18.324004190Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 16 04:24:18.327574 containerd[2008]: time="2025-09-16T04:24:18.325305782Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 16 04:24:18.327574 containerd[2008]: time="2025-09-16T04:24:18.325348790Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 16 04:24:18.327574 containerd[2008]: time="2025-09-16T04:24:18.325409198Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 16 04:24:18.327574 containerd[2008]: time="2025-09-16T04:24:18.325441346Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 16 04:24:18.329395 containerd[2008]: time="2025-09-16T04:24:18.325482590Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 16 04:24:18.329395 containerd[2008]: time="2025-09-16T04:24:18.328077158Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 16 04:24:18.329395 containerd[2008]: time="2025-09-16T04:24:18.328246862Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 16 04:24:18.329395 containerd[2008]: time="2025-09-16T04:24:18.328334954Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 16 04:24:18.329395 containerd[2008]: time="2025-09-16T04:24:18.328397978Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 16 04:24:18.329395 containerd[2008]: time="2025-09-16T04:24:18.328426310Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 16 04:24:18.329395 containerd[2008]: time="2025-09-16T04:24:18.328455746Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 16 04:24:18.329395 containerd[2008]: time="2025-09-16T04:24:18.328543694Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 16 04:24:18.329395 containerd[2008]: time="2025-09-16T04:24:18.328752818Z" level=info msg="runtime interface created" Sep 16 04:24:18.329395 containerd[2008]: time="2025-09-16T04:24:18.328774814Z" level=info msg="created NRI interface" Sep 16 04:24:18.329395 containerd[2008]: time="2025-09-16T04:24:18.328798970Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 
Sep 16 04:24:18.329395 containerd[2008]: time="2025-09-16T04:24:18.328853570Z" level=info msg="Connect containerd service" Sep 16 04:24:18.329395 containerd[2008]: time="2025-09-16T04:24:18.328927910Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 16 04:24:18.343653 containerd[2008]: time="2025-09-16T04:24:18.343200734Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 16 04:24:18.336546 systemd[1]: Finished sshkeys.service. Sep 16 04:24:18.353465 amazon-ssm-agent[2103]: 2025-09-16 04:24:18.2302 INFO https_proxy: Sep 16 04:24:18.371447 systemd[1]: ntpd.service: Scheduled restart job, restart counter is at 1. Sep 16 04:24:18.417029 systemd[1]: Started ntpd.service - Network Time Service. Sep 16 04:24:18.480710 amazon-ssm-agent[2103]: 2025-09-16 04:24:18.2302 INFO http_proxy: Sep 16 04:24:18.568983 ntpd[2190]: ntpd 4.2.8p18@1.4062-o Tue Sep 16 02:36:29 UTC 2025 (1): Starting Sep 16 04:24:18.571103 ntpd[2190]: 16 Sep 04:24:18 ntpd[2190]: ntpd 4.2.8p18@1.4062-o Tue Sep 16 02:36:29 UTC 2025 (1): Starting Sep 16 04:24:18.571103 ntpd[2190]: 16 Sep 04:24:18 ntpd[2190]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Sep 16 04:24:18.571103 ntpd[2190]: 16 Sep 04:24:18 ntpd[2190]: ---------------------------------------------------- Sep 16 04:24:18.571103 ntpd[2190]: 16 Sep 04:24:18 ntpd[2190]: ntp-4 is maintained by Network Time Foundation, Sep 16 04:24:18.571103 ntpd[2190]: 16 Sep 04:24:18 ntpd[2190]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Sep 16 04:24:18.571103 ntpd[2190]: 16 Sep 04:24:18 ntpd[2190]: corporation. Support and training for ntp-4 are Sep 16 04:24:18.571103 ntpd[2190]: 16 Sep 04:24:18 ntpd[2190]: available at https://www.nwtime.org/support Sep 16 04:24:18.571103 ntpd[2190]: 16 Sep 04:24:18 ntpd[2190]: ---------------------------------------------------- Sep 16 04:24:18.569097 ntpd[2190]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Sep 16 04:24:18.569116 ntpd[2190]: ---------------------------------------------------- Sep 16 04:24:18.569134 ntpd[2190]: ntp-4 is maintained by Network Time Foundation, Sep 16 04:24:18.569150 ntpd[2190]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Sep 16 04:24:18.569166 ntpd[2190]: corporation. 
Support and training for ntp-4 are Sep 16 04:24:18.569183 ntpd[2190]: available at https://www.nwtime.org/support Sep 16 04:24:18.569199 ntpd[2190]: ---------------------------------------------------- Sep 16 04:24:18.574584 ntpd[2190]: proto: precision = 0.096 usec (-23) Sep 16 04:24:18.576797 ntpd[2190]: 16 Sep 04:24:18 ntpd[2190]: proto: precision = 0.096 usec (-23) Sep 16 04:24:18.576797 ntpd[2190]: 16 Sep 04:24:18 ntpd[2190]: basedate set to 2025-09-04 Sep 16 04:24:18.576797 ntpd[2190]: 16 Sep 04:24:18 ntpd[2190]: gps base set to 2025-09-07 (week 2383) Sep 16 04:24:18.576797 ntpd[2190]: 16 Sep 04:24:18 ntpd[2190]: Listen and drop on 0 v6wildcard [::]:123 Sep 16 04:24:18.576797 ntpd[2190]: 16 Sep 04:24:18 ntpd[2190]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Sep 16 04:24:18.576797 ntpd[2190]: 16 Sep 04:24:18 ntpd[2190]: Listen normally on 2 lo 127.0.0.1:123 Sep 16 04:24:18.576797 ntpd[2190]: 16 Sep 04:24:18 ntpd[2190]: Listen normally on 3 eth0 172.31.31.172:123 Sep 16 04:24:18.574925 ntpd[2190]: basedate set to 2025-09-04 Sep 16 04:24:18.574950 ntpd[2190]: gps base set to 2025-09-07 (week 2383) Sep 16 04:24:18.575100 ntpd[2190]: Listen and drop on 0 v6wildcard [::]:123 Sep 16 04:24:18.575146 ntpd[2190]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Sep 16 04:24:18.575433 ntpd[2190]: Listen normally on 2 lo 127.0.0.1:123 Sep 16 04:24:18.575478 ntpd[2190]: Listen normally on 3 eth0 172.31.31.172:123 Sep 16 04:24:18.579717 ntpd[2190]: Listen normally on 4 lo [::1]:123 Sep 16 04:24:18.581818 ntpd[2190]: 16 Sep 04:24:18 ntpd[2190]: Listen normally on 4 lo [::1]:123 Sep 16 04:24:18.581818 ntpd[2190]: 16 Sep 04:24:18 ntpd[2190]: Listen normally on 5 eth0 [fe80::461:b0ff:fe0b:91e9%2]:123 Sep 16 04:24:18.581818 ntpd[2190]: 16 Sep 04:24:18 ntpd[2190]: Listening on routing socket on fd #22 for interface updates Sep 16 04:24:18.581973 amazon-ssm-agent[2103]: 2025-09-16 04:24:18.2302 INFO no_proxy: Sep 16 04:24:18.579799 ntpd[2190]: Listen normally on 5 eth0 [fe80::461:b0ff:fe0b:91e9%2]:123 Sep 16 04:24:18.579851 ntpd[2190]: Listening on routing socket on fd #22 for interface updates Sep 16 04:24:18.609889 ntpd[2190]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 16 04:24:18.609971 ntpd[2190]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 16 04:24:18.610162 ntpd[2190]: 16 Sep 04:24:18 ntpd[2190]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 16 04:24:18.610162 ntpd[2190]: 16 Sep 04:24:18 ntpd[2190]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 16 04:24:18.683124 amazon-ssm-agent[2103]: 2025-09-16 04:24:18.2303 INFO Checking if agent identity type OnPrem can be assumed Sep 16 04:24:18.707069 containerd[2008]: time="2025-09-16T04:24:18.706911208Z" level=info msg="Start subscribing containerd event" Sep 16 04:24:18.709623 containerd[2008]: time="2025-09-16T04:24:18.707038036Z" level=info msg="Start recovering state" Sep 16 04:24:18.709623 containerd[2008]: time="2025-09-16T04:24:18.708822616Z" level=info msg="Start event monitor" Sep 16 04:24:18.709623 containerd[2008]: time="2025-09-16T04:24:18.708896560Z" level=info msg="Start cni network conf syncer for default" Sep 16 04:24:18.709623 containerd[2008]: time="2025-09-16T04:24:18.708924436Z" level=info msg="Start streaming server" Sep 16 04:24:18.709623 containerd[2008]: time="2025-09-16T04:24:18.708945652Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 16 04:24:18.709623 containerd[2008]: time="2025-09-16T04:24:18.708962968Z" level=info msg="runtime interface starting up..." 
Sep 16 04:24:18.709623 containerd[2008]: time="2025-09-16T04:24:18.708983080Z" level=info msg="starting plugins..." Sep 16 04:24:18.709623 containerd[2008]: time="2025-09-16T04:24:18.709016896Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 16 04:24:18.714466 containerd[2008]: time="2025-09-16T04:24:18.714348544Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 16 04:24:18.718481 containerd[2008]: time="2025-09-16T04:24:18.718024024Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 16 04:24:18.721033 containerd[2008]: time="2025-09-16T04:24:18.719808844Z" level=info msg="containerd successfully booted in 0.672817s" Sep 16 04:24:18.719945 systemd[1]: Started containerd.service - containerd container runtime. Sep 16 04:24:18.785118 amazon-ssm-agent[2103]: 2025-09-16 04:24:18.2304 INFO Checking if agent identity type EC2 can be assumed Sep 16 04:24:18.799154 locksmithd[2026]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 16 04:24:18.827817 polkitd[2169]: Started polkitd version 126 Sep 16 04:24:18.858668 polkitd[2169]: Loading rules from directory /etc/polkit-1/rules.d Sep 16 04:24:18.859284 polkitd[2169]: Loading rules from directory /run/polkit-1/rules.d Sep 16 04:24:18.859366 polkitd[2169]: Error opening rules directory: Error opening directory “/run/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Sep 16 04:24:18.862695 polkitd[2169]: Loading rules from directory /usr/local/share/polkit-1/rules.d Sep 16 04:24:18.862781 polkitd[2169]: Error opening rules directory: Error opening directory “/usr/local/share/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Sep 16 04:24:18.862871 polkitd[2169]: Loading rules from directory /usr/share/polkit-1/rules.d Sep 16 04:24:18.867441 polkitd[2169]: Finished loading, compiling and executing 2 rules Sep 16 04:24:18.869590 systemd[1]: Started polkit.service - Authorization Manager. Sep 16 04:24:18.877451 dbus-daemon[1958]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Sep 16 04:24:18.878872 polkitd[2169]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Sep 16 04:24:18.884675 amazon-ssm-agent[2103]: 2025-09-16 04:24:18.6911 INFO Agent will take identity from EC2 Sep 16 04:24:18.923767 systemd-resolved[1821]: System hostname changed to 'ip-172-31-31-172'. Sep 16 04:24:18.923771 systemd-hostnamed[2020]: Hostname set to (transient) Sep 16 04:24:18.986889 amazon-ssm-agent[2103]: 2025-09-16 04:24:18.6993 INFO [amazon-ssm-agent] amazon-ssm-agent - v3.3.0.0 Sep 16 04:24:19.087602 amazon-ssm-agent[2103]: 2025-09-16 04:24:18.6993 INFO [amazon-ssm-agent] OS: linux, Arch: arm64 Sep 16 04:24:19.185452 amazon-ssm-agent[2103]: 2025-09-16 04:24:18.6993 INFO [amazon-ssm-agent] Starting Core Agent Sep 16 04:24:19.245484 tar[1978]: linux-arm64/README.md Sep 16 04:24:19.285775 amazon-ssm-agent[2103]: 2025-09-16 04:24:18.6993 INFO [amazon-ssm-agent] Registrar detected. Attempting registration Sep 16 04:24:19.306634 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 16 04:24:19.341817 sshd_keygen[2003]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 16 04:24:19.385421 amazon-ssm-agent[2103]: 2025-09-16 04:24:18.6993 INFO [Registrar] Starting registrar module Sep 16 04:24:19.421021 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 16 04:24:19.433181 systemd[1]: Starting issuegen.service - Generate /run/issue... 
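
The containerd startup above ends with the daemon serving on /run/containerd/containerd.sock and registering the "k8s.io" namespace with NRI. A hedged sketch of talking to that socket with the containerd Go client; the socket path and namespace come from the log, everything else is illustrative, and note that import paths differ between the classic github.com/containerd/containerd client shown here and the newer /v2/client module:

```go
package main

import (
	"context"
	"fmt"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Address matches the "serving... address=/run/containerd/containerd.sock" entry above.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		panic(err)
	}
	defer client.Close()

	// "k8s.io" is the namespace the log shows being registered with NRI.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	ver, err := client.Version(ctx)
	if err != nil {
		panic(err)
	}
	// Per the log this daemon is v2.0.5 at revision fb4c30d4...
	fmt.Println("containerd", ver.Version, ver.Revision)
}
```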
Sep 16 04:24:19.442708 systemd[1]: Started sshd@0-172.31.31.172:22-147.75.109.163:47274.service - OpenSSH per-connection server daemon (147.75.109.163:47274). Sep 16 04:24:19.487220 amazon-ssm-agent[2103]: 2025-09-16 04:24:18.7102 INFO [EC2Identity] Checking disk for registration info Sep 16 04:24:19.495253 systemd[1]: issuegen.service: Deactivated successfully. Sep 16 04:24:19.498944 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 16 04:24:19.509316 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 16 04:24:19.571622 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 16 04:24:19.580578 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 16 04:24:19.587342 amazon-ssm-agent[2103]: 2025-09-16 04:24:18.7103 INFO [EC2Identity] No registration info found for ec2 instance, attempting registration Sep 16 04:24:19.595146 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 16 04:24:19.603981 systemd[1]: Reached target getty.target - Login Prompts. Sep 16 04:24:19.687859 amazon-ssm-agent[2103]: 2025-09-16 04:24:18.7103 INFO [EC2Identity] Generating registration keypair Sep 16 04:24:19.742011 amazon-ssm-agent[2103]: 2025/09/16 04:24:19 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 16 04:24:19.742011 amazon-ssm-agent[2103]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 16 04:24:19.742582 amazon-ssm-agent[2103]: 2025/09/16 04:24:19 processing appconfig overrides Sep 16 04:24:19.750119 sshd[2226]: Accepted publickey for core from 147.75.109.163 port 47274 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:24:19.756258 sshd-session[2226]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:24:19.777678 amazon-ssm-agent[2103]: 2025-09-16 04:24:19.6998 INFO [EC2Identity] Checking write access before registering Sep 16 04:24:19.777678 amazon-ssm-agent[2103]: 2025-09-16 04:24:19.7007 INFO [EC2Identity] Registering EC2 instance with Systems Manager Sep 16 04:24:19.777678 amazon-ssm-agent[2103]: 2025-09-16 04:24:19.7417 INFO [EC2Identity] EC2 registration was successful. Sep 16 04:24:19.777678 amazon-ssm-agent[2103]: 2025-09-16 04:24:19.7417 INFO [amazon-ssm-agent] Registration attempted. Resuming core agent startup. Sep 16 04:24:19.777678 amazon-ssm-agent[2103]: 2025-09-16 04:24:19.7418 INFO [CredentialRefresher] credentialRefresher has started Sep 16 04:24:19.777678 amazon-ssm-agent[2103]: 2025-09-16 04:24:19.7418 INFO [CredentialRefresher] Starting credentials refresher loop Sep 16 04:24:19.777678 amazon-ssm-agent[2103]: 2025-09-16 04:24:19.7770 INFO EC2RoleProvider Successfully connected with instance profile role credentials Sep 16 04:24:19.778762 amazon-ssm-agent[2103]: 2025-09-16 04:24:19.7774 INFO [CredentialRefresher] Credentials ready Sep 16 04:24:19.778323 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 16 04:24:19.783459 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 16 04:24:19.789539 amazon-ssm-agent[2103]: 2025-09-16 04:24:19.7777 INFO [CredentialRefresher] Next credential rotation will be in 29.999989507 minutes Sep 16 04:24:19.805474 systemd-logind[1971]: New session 1 of user core. Sep 16 04:24:19.833716 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 16 04:24:19.846049 systemd[1]: Starting user@500.service - User Manager for UID 500... 
Sep 16 04:24:19.869185 (systemd)[2238]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 16 04:24:19.875155 systemd-logind[1971]: New session c1 of user core. Sep 16 04:24:20.187438 systemd[2238]: Queued start job for default target default.target. Sep 16 04:24:20.198608 systemd[2238]: Created slice app.slice - User Application Slice. Sep 16 04:24:20.198845 systemd[2238]: Reached target paths.target - Paths. Sep 16 04:24:20.199043 systemd[2238]: Reached target timers.target - Timers. Sep 16 04:24:20.201648 systemd[2238]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 16 04:24:20.237982 systemd[2238]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 16 04:24:20.238233 systemd[2238]: Reached target sockets.target - Sockets. Sep 16 04:24:20.238332 systemd[2238]: Reached target basic.target - Basic System. Sep 16 04:24:20.238415 systemd[2238]: Reached target default.target - Main User Target. Sep 16 04:24:20.238532 systemd[2238]: Startup finished in 347ms. Sep 16 04:24:20.238913 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 16 04:24:20.253762 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 16 04:24:20.414972 systemd[1]: Started sshd@1-172.31.31.172:22-147.75.109.163:53376.service - OpenSSH per-connection server daemon (147.75.109.163:53376). Sep 16 04:24:20.626262 sshd[2249]: Accepted publickey for core from 147.75.109.163 port 53376 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:24:20.629422 sshd-session[2249]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:24:20.639604 systemd-logind[1971]: New session 2 of user core. Sep 16 04:24:20.648301 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 16 04:24:20.781806 sshd[2252]: Connection closed by 147.75.109.163 port 53376 Sep 16 04:24:20.781664 sshd-session[2249]: pam_unix(sshd:session): session closed for user core Sep 16 04:24:20.796003 systemd[1]: sshd@1-172.31.31.172:22-147.75.109.163:53376.service: Deactivated successfully. Sep 16 04:24:20.805141 systemd[1]: session-2.scope: Deactivated successfully. Sep 16 04:24:20.809872 systemd-logind[1971]: Session 2 logged out. Waiting for processes to exit. Sep 16 04:24:20.813975 amazon-ssm-agent[2103]: 2025-09-16 04:24:20.8124 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker is not running, starting worker process Sep 16 04:24:20.833476 systemd[1]: Started sshd@2-172.31.31.172:22-147.75.109.163:53382.service - OpenSSH per-connection server daemon (147.75.109.163:53382). Sep 16 04:24:20.840996 systemd-logind[1971]: Removed session 2. Sep 16 04:24:20.915437 amazon-ssm-agent[2103]: 2025-09-16 04:24:20.8168 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker (pid:2259) started Sep 16 04:24:21.016107 amazon-ssm-agent[2103]: 2025-09-16 04:24:20.8169 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] Monitor long running worker health every 60 seconds Sep 16 04:24:21.067341 sshd[2261]: Accepted publickey for core from 147.75.109.163 port 53382 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:24:21.070202 sshd-session[2261]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:24:21.082818 systemd-logind[1971]: New session 3 of user core. Sep 16 04:24:21.087810 systemd[1]: Started session-3.scope - Session 3 of User core. 
Sep 16 04:24:21.222730 sshd[2274]: Connection closed by 147.75.109.163 port 53382 Sep 16 04:24:21.223545 sshd-session[2261]: pam_unix(sshd:session): session closed for user core Sep 16 04:24:21.233315 systemd-logind[1971]: Session 3 logged out. Waiting for processes to exit. Sep 16 04:24:21.234075 systemd[1]: sshd@2-172.31.31.172:22-147.75.109.163:53382.service: Deactivated successfully. Sep 16 04:24:21.237795 systemd[1]: session-3.scope: Deactivated successfully. Sep 16 04:24:21.241648 systemd-logind[1971]: Removed session 3. Sep 16 04:24:21.557333 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:24:21.563736 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 16 04:24:21.570053 (kubelet)[2284]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 16 04:24:21.577627 systemd[1]: Startup finished in 3.736s (kernel) + 11.711s (initrd) + 11.244s (userspace) = 26.692s. Sep 16 04:24:23.132008 kubelet[2284]: E0916 04:24:23.131853 2284 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 16 04:24:23.137029 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 16 04:24:23.137348 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 16 04:24:23.138187 systemd[1]: kubelet.service: Consumed 1.529s CPU time, 259.7M memory peak. Sep 16 04:24:26.049088 systemd-resolved[1821]: Clock change detected. Flushing caches. Sep 16 04:24:31.751924 systemd[1]: Started sshd@3-172.31.31.172:22-147.75.109.163:58638.service - OpenSSH per-connection server daemon (147.75.109.163:58638). Sep 16 04:24:31.947676 sshd[2296]: Accepted publickey for core from 147.75.109.163 port 58638 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:24:31.950213 sshd-session[2296]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:24:31.958112 systemd-logind[1971]: New session 4 of user core. Sep 16 04:24:31.966988 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 16 04:24:32.090814 sshd[2299]: Connection closed by 147.75.109.163 port 58638 Sep 16 04:24:32.091887 sshd-session[2296]: pam_unix(sshd:session): session closed for user core Sep 16 04:24:32.099435 systemd[1]: sshd@3-172.31.31.172:22-147.75.109.163:58638.service: Deactivated successfully. Sep 16 04:24:32.102577 systemd[1]: session-4.scope: Deactivated successfully. Sep 16 04:24:32.105073 systemd-logind[1971]: Session 4 logged out. Waiting for processes to exit. Sep 16 04:24:32.107366 systemd-logind[1971]: Removed session 4. Sep 16 04:24:32.130774 systemd[1]: Started sshd@4-172.31.31.172:22-147.75.109.163:58642.service - OpenSSH per-connection server daemon (147.75.109.163:58642). Sep 16 04:24:32.334841 sshd[2305]: Accepted publickey for core from 147.75.109.163 port 58642 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:24:32.337423 sshd-session[2305]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:24:32.347442 systemd-logind[1971]: New session 5 of user core. Sep 16 04:24:32.357005 systemd[1]: Started session-5.scope - Session 5 of User core. 
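
The kubelet failures above are the unit starting before anything has written /var/lib/kubelet/config.yaml; on a kubeadm-provisioned node that file is normally generated by kubeadm init/join rather than by hand. As a hedged illustration only, the sketch below writes a minimal KubeletConfiguration to the path the error names; the apiVersion/kind header and cgroupDriver field are standard KubeletConfiguration fields (systemd matching the SystemdCgroup=true in the containerd CRI config dumped earlier), all other settings are left at defaults:

```go
package main

import "os"

// Illustrative only: on a kubeadm-managed node this file is produced by
// "kubeadm init"/"kubeadm join", not written by hand.
const kubeletConfig = `apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd
`

func main() {
	if err := os.MkdirAll("/var/lib/kubelet", 0o755); err != nil {
		panic(err)
	}
	// Path matches the one the kubelet error above fails to open.
	if err := os.WriteFile("/var/lib/kubelet/config.yaml", []byte(kubeletConfig), 0o644); err != nil {
		panic(err)
	}
}
```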
Sep 16 04:24:32.476318 sshd[2308]: Connection closed by 147.75.109.163 port 58642 Sep 16 04:24:32.477088 sshd-session[2305]: pam_unix(sshd:session): session closed for user core Sep 16 04:24:32.483547 systemd-logind[1971]: Session 5 logged out. Waiting for processes to exit. Sep 16 04:24:32.484309 systemd[1]: sshd@4-172.31.31.172:22-147.75.109.163:58642.service: Deactivated successfully. Sep 16 04:24:32.487479 systemd[1]: session-5.scope: Deactivated successfully. Sep 16 04:24:32.491061 systemd-logind[1971]: Removed session 5. Sep 16 04:24:32.510533 systemd[1]: Started sshd@5-172.31.31.172:22-147.75.109.163:58654.service - OpenSSH per-connection server daemon (147.75.109.163:58654). Sep 16 04:24:32.700028 sshd[2314]: Accepted publickey for core from 147.75.109.163 port 58654 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:24:32.703101 sshd-session[2314]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:24:32.711132 systemd-logind[1971]: New session 6 of user core. Sep 16 04:24:32.723045 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 16 04:24:32.846098 sshd[2317]: Connection closed by 147.75.109.163 port 58654 Sep 16 04:24:32.848105 sshd-session[2314]: pam_unix(sshd:session): session closed for user core Sep 16 04:24:32.854587 systemd-logind[1971]: Session 6 logged out. Waiting for processes to exit. Sep 16 04:24:32.856011 systemd[1]: sshd@5-172.31.31.172:22-147.75.109.163:58654.service: Deactivated successfully. Sep 16 04:24:32.858909 systemd[1]: session-6.scope: Deactivated successfully. Sep 16 04:24:32.862266 systemd-logind[1971]: Removed session 6. Sep 16 04:24:32.882252 systemd[1]: Started sshd@6-172.31.31.172:22-147.75.109.163:58670.service - OpenSSH per-connection server daemon (147.75.109.163:58670). Sep 16 04:24:33.075159 sshd[2323]: Accepted publickey for core from 147.75.109.163 port 58670 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:24:33.076725 sshd-session[2323]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:24:33.084441 systemd-logind[1971]: New session 7 of user core. Sep 16 04:24:33.099989 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 16 04:24:33.217467 sudo[2327]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 16 04:24:33.218144 sudo[2327]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 16 04:24:33.237622 sudo[2327]: pam_unix(sudo:session): session closed for user root Sep 16 04:24:33.260921 sshd[2326]: Connection closed by 147.75.109.163 port 58670 Sep 16 04:24:33.262108 sshd-session[2323]: pam_unix(sshd:session): session closed for user core Sep 16 04:24:33.269675 systemd[1]: sshd@6-172.31.31.172:22-147.75.109.163:58670.service: Deactivated successfully. Sep 16 04:24:33.274140 systemd[1]: session-7.scope: Deactivated successfully. Sep 16 04:24:33.275655 systemd-logind[1971]: Session 7 logged out. Waiting for processes to exit. Sep 16 04:24:33.278827 systemd-logind[1971]: Removed session 7. Sep 16 04:24:33.298319 systemd[1]: Started sshd@7-172.31.31.172:22-147.75.109.163:58678.service - OpenSSH per-connection server daemon (147.75.109.163:58678). 
Sep 16 04:24:33.508284 sshd[2333]: Accepted publickey for core from 147.75.109.163 port 58678 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:24:33.510711 sshd-session[2333]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:24:33.519828 systemd-logind[1971]: New session 8 of user core. Sep 16 04:24:33.529019 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 16 04:24:33.632817 sudo[2338]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 16 04:24:33.633430 sudo[2338]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 16 04:24:33.635078 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 16 04:24:33.640017 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:24:33.652034 sudo[2338]: pam_unix(sudo:session): session closed for user root Sep 16 04:24:33.661405 sudo[2337]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 16 04:24:33.662088 sudo[2337]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 16 04:24:33.686297 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 16 04:24:33.754609 augenrules[2363]: No rules Sep 16 04:24:33.757516 systemd[1]: audit-rules.service: Deactivated successfully. Sep 16 04:24:33.758842 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 16 04:24:33.762556 sudo[2337]: pam_unix(sudo:session): session closed for user root Sep 16 04:24:33.786847 sshd[2336]: Connection closed by 147.75.109.163 port 58678 Sep 16 04:24:33.788601 sshd-session[2333]: pam_unix(sshd:session): session closed for user core Sep 16 04:24:33.800259 systemd[1]: sshd@7-172.31.31.172:22-147.75.109.163:58678.service: Deactivated successfully. Sep 16 04:24:33.805357 systemd[1]: session-8.scope: Deactivated successfully. Sep 16 04:24:33.807940 systemd-logind[1971]: Session 8 logged out. Waiting for processes to exit. Sep 16 04:24:33.828361 systemd[1]: Started sshd@8-172.31.31.172:22-147.75.109.163:58686.service - OpenSSH per-connection server daemon (147.75.109.163:58686). Sep 16 04:24:33.831541 systemd-logind[1971]: Removed session 8. Sep 16 04:24:34.028943 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:24:34.036688 sshd[2372]: Accepted publickey for core from 147.75.109.163 port 58686 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:24:34.039616 sshd-session[2372]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:24:34.040257 (kubelet)[2379]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 16 04:24:34.051840 systemd-logind[1971]: New session 9 of user core. Sep 16 04:24:34.057038 systemd[1]: Started session-9.scope - Session 9 of User core. 
Sep 16 04:24:34.122661 kubelet[2379]: E0916 04:24:34.122603 2379 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 16 04:24:34.130366 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 16 04:24:34.130892 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 16 04:24:34.132009 systemd[1]: kubelet.service: Consumed 323ms CPU time, 105.8M memory peak. Sep 16 04:24:34.163095 sudo[2388]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 16 04:24:34.163705 sudo[2388]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 16 04:24:34.699377 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 16 04:24:34.722274 (dockerd)[2406]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 16 04:24:35.107497 dockerd[2406]: time="2025-09-16T04:24:35.107103588Z" level=info msg="Starting up" Sep 16 04:24:35.113719 dockerd[2406]: time="2025-09-16T04:24:35.113673277Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 16 04:24:35.133536 dockerd[2406]: time="2025-09-16T04:24:35.133470001Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 16 04:24:35.161295 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1145687557-merged.mount: Deactivated successfully. Sep 16 04:24:35.202473 dockerd[2406]: time="2025-09-16T04:24:35.202228501Z" level=info msg="Loading containers: start." Sep 16 04:24:35.219803 kernel: Initializing XFRM netlink socket Sep 16 04:24:35.552264 (udev-worker)[2427]: Network interface NamePolicy= disabled on kernel command line. Sep 16 04:24:35.630953 systemd-networkd[1820]: docker0: Link UP Sep 16 04:24:35.642287 dockerd[2406]: time="2025-09-16T04:24:35.642212307Z" level=info msg="Loading containers: done." Sep 16 04:24:35.666860 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1534718249-merged.mount: Deactivated successfully. Sep 16 04:24:35.675871 dockerd[2406]: time="2025-09-16T04:24:35.675809319Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 16 04:24:35.676101 dockerd[2406]: time="2025-09-16T04:24:35.675926523Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 16 04:24:35.676101 dockerd[2406]: time="2025-09-16T04:24:35.676076607Z" level=info msg="Initializing buildkit" Sep 16 04:24:35.727549 dockerd[2406]: time="2025-09-16T04:24:35.727490656Z" level=info msg="Completed buildkit initialization" Sep 16 04:24:35.744343 dockerd[2406]: time="2025-09-16T04:24:35.744267040Z" level=info msg="Daemon has completed initialization" Sep 16 04:24:35.744930 dockerd[2406]: time="2025-09-16T04:24:35.744353596Z" level=info msg="API listen on /run/docker.sock" Sep 16 04:24:35.744802 systemd[1]: Started docker.service - Docker Application Container Engine. 
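
dockerd above finishes initialization with "API listen on /run/docker.sock". One way to confirm the daemon is answering on that socket is the Docker Engine Go SDK; a small sketch, not part of the boot flow itself:

```go
package main

import (
	"context"
	"fmt"

	"github.com/docker/docker/client"
)

func main() {
	// FromEnv falls back to the default unix:///var/run/docker.sock,
	// the socket the daemon reports listening on above.
	cli, err := client.NewClientWithOpts(client.FromEnv, client.WithAPIVersionNegotiation())
	if err != nil {
		panic(err)
	}
	defer cli.Close()

	ping, err := cli.Ping(context.Background())
	if err != nil {
		panic(err)
	}
	fmt.Println("docker API version:", ping.APIVersion)
}
```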
Sep 16 04:24:36.934164 containerd[2008]: time="2025-09-16T04:24:36.934007370Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\"" Sep 16 04:24:37.600214 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3603920765.mount: Deactivated successfully. Sep 16 04:24:39.280118 containerd[2008]: time="2025-09-16T04:24:39.280051325Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:39.281888 containerd[2008]: time="2025-09-16T04:24:39.281832497Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.5: active requests=0, bytes read=27390228" Sep 16 04:24:39.283791 containerd[2008]: time="2025-09-16T04:24:39.283057409Z" level=info msg="ImageCreate event name:\"sha256:6a7fd297b49102b08dc3d8d4fd7f1538bcf21d3131eae8bf62ba26ce3283237f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:39.288641 containerd[2008]: time="2025-09-16T04:24:39.287662853Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:39.289830 containerd[2008]: time="2025-09-16T04:24:39.289773641Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.5\" with image id \"sha256:6a7fd297b49102b08dc3d8d4fd7f1538bcf21d3131eae8bf62ba26ce3283237f\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\", size \"27386827\" in 2.355509147s" Sep 16 04:24:39.289930 containerd[2008]: time="2025-09-16T04:24:39.289833245Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\" returns image reference \"sha256:6a7fd297b49102b08dc3d8d4fd7f1538bcf21d3131eae8bf62ba26ce3283237f\"" Sep 16 04:24:39.292604 containerd[2008]: time="2025-09-16T04:24:39.292563221Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\"" Sep 16 04:24:41.234487 containerd[2008]: time="2025-09-16T04:24:41.234404851Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:41.237306 containerd[2008]: time="2025-09-16T04:24:41.236880991Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.5: active requests=0, bytes read=23547917" Sep 16 04:24:41.238445 containerd[2008]: time="2025-09-16T04:24:41.238388755Z" level=info msg="ImageCreate event name:\"sha256:2dd4c25a937008b7b8a6cdca70d816403b5078b51550926721b7a7762139cd23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:41.243171 containerd[2008]: time="2025-09-16T04:24:41.243106063Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:41.245394 containerd[2008]: time="2025-09-16T04:24:41.245333887Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.5\" with image id \"sha256:2dd4c25a937008b7b8a6cdca70d816403b5078b51550926721b7a7762139cd23\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\", size \"25135832\" in 1.95249415s" Sep 16 
04:24:41.245521 containerd[2008]: time="2025-09-16T04:24:41.245391463Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\" returns image reference \"sha256:2dd4c25a937008b7b8a6cdca70d816403b5078b51550926721b7a7762139cd23\"" Sep 16 04:24:41.246294 containerd[2008]: time="2025-09-16T04:24:41.246228031Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\"" Sep 16 04:24:43.033842 containerd[2008]: time="2025-09-16T04:24:43.033734336Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:43.035643 containerd[2008]: time="2025-09-16T04:24:43.035570876Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.5: active requests=0, bytes read=18295977" Sep 16 04:24:43.038879 containerd[2008]: time="2025-09-16T04:24:43.038066912Z" level=info msg="ImageCreate event name:\"sha256:5e600beaed8620718e0650dd2721266869ce1d737488c004a869333273e6ec15\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:43.045067 containerd[2008]: time="2025-09-16T04:24:43.045004676Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:43.052229 containerd[2008]: time="2025-09-16T04:24:43.052169636Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.5\" with image id \"sha256:5e600beaed8620718e0650dd2721266869ce1d737488c004a869333273e6ec15\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\", size \"19883910\" in 1.805884101s" Sep 16 04:24:43.052430 containerd[2008]: time="2025-09-16T04:24:43.052401944Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\" returns image reference \"sha256:5e600beaed8620718e0650dd2721266869ce1d737488c004a869333273e6ec15\"" Sep 16 04:24:43.053206 containerd[2008]: time="2025-09-16T04:24:43.053120600Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\"" Sep 16 04:24:44.381425 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 16 04:24:44.386125 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:24:44.594662 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1572828716.mount: Deactivated successfully. Sep 16 04:24:44.767590 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:24:44.782276 (kubelet)[2701]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 16 04:24:44.863860 kubelet[2701]: E0916 04:24:44.863677 2701 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 16 04:24:44.872771 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 16 04:24:44.873479 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 16 04:24:44.875514 systemd[1]: kubelet.service: Consumed 308ms CPU time, 104.6M memory peak. 
Sep 16 04:24:45.352759 containerd[2008]: time="2025-09-16T04:24:45.350594375Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:45.359380 containerd[2008]: time="2025-09-16T04:24:45.359306519Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.5: active requests=0, bytes read=28240106" Sep 16 04:24:45.362479 containerd[2008]: time="2025-09-16T04:24:45.362410607Z" level=info msg="ImageCreate event name:\"sha256:021a8d45ab0c346664e47d95595ff5180ce90a22a681ea27904c65ae90788e70\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:45.369328 containerd[2008]: time="2025-09-16T04:24:45.369254099Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:45.372800 containerd[2008]: time="2025-09-16T04:24:45.371664395Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.5\" with image id \"sha256:021a8d45ab0c346664e47d95595ff5180ce90a22a681ea27904c65ae90788e70\", repo tag \"registry.k8s.io/kube-proxy:v1.33.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\", size \"28239125\" in 2.318482559s" Sep 16 04:24:45.372800 containerd[2008]: time="2025-09-16T04:24:45.371728487Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\" returns image reference \"sha256:021a8d45ab0c346664e47d95595ff5180ce90a22a681ea27904c65ae90788e70\"" Sep 16 04:24:45.374600 containerd[2008]: time="2025-09-16T04:24:45.374393027Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Sep 16 04:24:46.006060 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1159518718.mount: Deactivated successfully. 
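Each "Pulled image" entry carries both a byte count ("bytes read") and a wall-clock duration, so the effective pull rate can be read straight off the journal; kube-proxy above is 28,240,106 bytes in 2.318482559s, roughly 12 MB/s. A small standard-library sketch that does the arithmetic for the figures logged so far (values copied from the journal, nothing is fetched):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// (image, bytes read, pull duration) copied from the journal entries above.
	pulls := []struct {
		image string
		bytes int64
		dur   string
	}{
		{"kube-apiserver:v1.33.5", 27390228, "2.355509147s"},
		{"kube-controller-manager:v1.33.5", 23547917, "1.95249415s"},
		{"kube-scheduler:v1.33.5", 18295977, "1.805884101s"},
		{"kube-proxy:v1.33.5", 28240106, "2.318482559s"},
	}
	for _, p := range pulls {
		d, err := time.ParseDuration(p.dur)
		if err != nil {
			panic(err)
		}
		mibps := float64(p.bytes) / (1 << 20) / d.Seconds()
		fmt.Printf("%-35s %6.1f MiB/s\n", p.image, mibps)
	}
}
```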
Sep 16 04:24:47.180958 containerd[2008]: time="2025-09-16T04:24:47.180870096Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:47.184220 containerd[2008]: time="2025-09-16T04:24:47.183754440Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152117" Sep 16 04:24:47.186379 containerd[2008]: time="2025-09-16T04:24:47.186322332Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:47.192089 containerd[2008]: time="2025-09-16T04:24:47.192025837Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:47.194413 containerd[2008]: time="2025-09-16T04:24:47.194346409Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.819895506s" Sep 16 04:24:47.194413 containerd[2008]: time="2025-09-16T04:24:47.194407525Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\"" Sep 16 04:24:47.196087 containerd[2008]: time="2025-09-16T04:24:47.195951949Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 16 04:24:47.673813 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1711970263.mount: Deactivated successfully. 
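Because every successful pull is reported with the same "Pulled image ... size ... in <duration>" message (coredns above, pause and etcd below), the results are easy to scrape out of the journal for a quick report. A hedged standard-library sketch; the regular expression is written against the escaped-quote message format visible in these lines and may need adjusting for other containerd versions:

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches the containerd message seen above, e.g.:
//   Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" ... size \"19148915\" in 1.819895506s
var pulled = regexp.MustCompile(`Pulled image \\?"([^"\\]+)\\?".*size \\?"(\d+)\\?" in ([0-9.]+m?s)`)

func main() {
	// e.g. journalctl -u containerd --no-pager | go run scrape.go
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		if m := pulled.FindStringSubmatch(sc.Text()); m != nil {
			fmt.Printf("%-45s %10s bytes  %s\n", m[1], m[2], m[3])
		}
	}
}
```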
Sep 16 04:24:47.687652 containerd[2008]: time="2025-09-16T04:24:47.687571827Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 16 04:24:47.689476 containerd[2008]: time="2025-09-16T04:24:47.689409603Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703" Sep 16 04:24:47.692246 containerd[2008]: time="2025-09-16T04:24:47.692131167Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 16 04:24:47.696907 containerd[2008]: time="2025-09-16T04:24:47.696825891Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 16 04:24:47.699105 containerd[2008]: time="2025-09-16T04:24:47.698694147Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 502.440626ms" Sep 16 04:24:47.699105 containerd[2008]: time="2025-09-16T04:24:47.698789835Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Sep 16 04:24:47.699582 containerd[2008]: time="2025-09-16T04:24:47.699499239Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Sep 16 04:24:48.226502 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1983332509.mount: Deactivated successfully. Sep 16 04:24:49.409903 systemd[1]: systemd-hostnamed.service: Deactivated successfully. 
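Note that pause:3.10 is the only image above whose ImageCreate events carry the extra `io.cri-containerd.pinned: pinned` label, which keeps the sandbox image exempt from image garbage collection. A sketch that lists the images in the CRI namespace and reports which ones are pinned, again assuming the pre-2.0 containerd Go client module from the earlier example:

```go
package main

import (
	"context"
	"fmt"
	"log"

	containerd "github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	// ImageService exposes the image metadata store, including the labels
	// shown in the ImageCreate events above.
	imgs, err := client.ImageService().List(ctx)
	if err != nil {
		log.Fatal(err)
	}
	for _, img := range imgs {
		if img.Labels["io.cri-containerd.pinned"] == "pinned" {
			fmt.Println("pinned:", img.Name)
		}
	}
}
```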
Sep 16 04:24:50.697765 containerd[2008]: time="2025-09-16T04:24:50.696956586Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:50.700417 containerd[2008]: time="2025-09-16T04:24:50.700369122Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=69465857" Sep 16 04:24:50.702059 containerd[2008]: time="2025-09-16T04:24:50.702019902Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:50.709427 containerd[2008]: time="2025-09-16T04:24:50.709377642Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:50.711805 containerd[2008]: time="2025-09-16T04:24:50.711557298Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 3.011999655s" Sep 16 04:24:50.711805 containerd[2008]: time="2025-09-16T04:24:50.711614358Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\"" Sep 16 04:24:55.024833 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 16 04:24:55.030049 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:24:55.363051 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:24:55.376383 (kubelet)[2851]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 16 04:24:55.444352 kubelet[2851]: E0916 04:24:55.444272 2851 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 16 04:24:55.449369 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 16 04:24:55.449698 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 16 04:24:55.450347 systemd[1]: kubelet.service: Consumed 288ms CPU time, 106.3M memory peak. Sep 16 04:25:00.241828 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:25:00.242305 systemd[1]: kubelet.service: Consumed 288ms CPU time, 106.3M memory peak. Sep 16 04:25:00.248322 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:25:00.295651 systemd[1]: Reload requested from client PID 2865 ('systemctl') (unit session-9.scope)... Sep 16 04:25:00.295897 systemd[1]: Reloading... Sep 16 04:25:00.540843 zram_generator::config[2912]: No configuration found. Sep 16 04:25:01.060373 systemd[1]: Reloading finished in 763 ms. Sep 16 04:25:01.189552 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:25:01.198435 systemd[1]: kubelet.service: Deactivated successfully. 
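The repeated kubelet crash above ("failed to load Kubelet config file /var/lib/kubelet/config.yaml ... no such file or directory", restart counters 2 and 3) is the expected state of a node that has booted but not yet run `kubeadm init`/`kubeadm join`: kubeadm is what writes that file. A hedged sketch of a pre-flight check for it; the embedded YAML is a deliberately minimal, illustrative KubeletConfiguration (the systemd cgroup driver matches what the kubelet later reports receiving from the CRI runtime), not the file kubeadm actually generates:

```go
package main

import (
	"fmt"
	"os"
)

const kubeletConfigPath = "/var/lib/kubelet/config.yaml"

// Minimal illustrative KubeletConfiguration; kubeadm writes a much fuller one.
const minimalConfig = `apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd
`

func main() {
	if _, err := os.Stat(kubeletConfigPath); err == nil {
		fmt.Println("kubelet config already present:", kubeletConfigPath)
		return
	} else if !os.IsNotExist(err) {
		panic(err)
	}
	fmt.Println("kubelet config missing - this is what makes kubelet.service exit with status=1")
	// Writing a stand-in config by hand is only useful for experiments;
	// normally kubeadm init/join creates the real file.
	if err := os.WriteFile(kubeletConfigPath+".example", []byte(minimalConfig), 0o644); err != nil {
		panic(err)
	}
}
```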
Sep 16 04:25:01.199005 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:25:01.199111 systemd[1]: kubelet.service: Consumed 250ms CPU time, 95M memory peak. Sep 16 04:25:01.204331 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:25:01.827504 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:25:01.845569 (kubelet)[2974]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 16 04:25:01.937947 kubelet[2974]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 16 04:25:01.937947 kubelet[2974]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 16 04:25:01.937947 kubelet[2974]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 16 04:25:01.938495 kubelet[2974]: I0916 04:25:01.938099 2974 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 16 04:25:02.226853 kubelet[2974]: I0916 04:25:02.226632 2974 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 16 04:25:02.226853 kubelet[2974]: I0916 04:25:02.226674 2974 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 16 04:25:02.227787 kubelet[2974]: I0916 04:25:02.227664 2974 server.go:956] "Client rotation is on, will bootstrap in background" Sep 16 04:25:02.284191 kubelet[2974]: E0916 04:25:02.284118 2974 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://172.31.31.172:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 172.31.31.172:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Sep 16 04:25:02.284994 kubelet[2974]: I0916 04:25:02.284803 2974 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 16 04:25:02.297945 kubelet[2974]: I0916 04:25:02.297913 2974 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 16 04:25:02.304099 kubelet[2974]: I0916 04:25:02.303927 2974 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 16 04:25:02.304828 kubelet[2974]: I0916 04:25:02.304782 2974 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 16 04:25:02.305206 kubelet[2974]: I0916 04:25:02.304935 2974 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-31-172","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 16 04:25:02.305558 kubelet[2974]: I0916 04:25:02.305537 2974 topology_manager.go:138] "Creating topology manager with none policy" Sep 16 04:25:02.305656 kubelet[2974]: I0916 04:25:02.305639 2974 container_manager_linux.go:303] "Creating device plugin manager" Sep 16 04:25:02.307839 kubelet[2974]: I0916 04:25:02.307814 2974 state_mem.go:36] "Initialized new in-memory state store" Sep 16 04:25:02.314860 kubelet[2974]: I0916 04:25:02.314826 2974 kubelet.go:480] "Attempting to sync node with API server" Sep 16 04:25:02.315010 kubelet[2974]: I0916 04:25:02.314990 2974 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 16 04:25:02.317636 kubelet[2974]: I0916 04:25:02.317518 2974 kubelet.go:386] "Adding apiserver pod source" Sep 16 04:25:02.319862 kubelet[2974]: I0916 04:25:02.319836 2974 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 16 04:25:02.323075 kubelet[2974]: E0916 04:25:02.322286 2974 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://172.31.31.172:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-31-172&limit=500&resourceVersion=0\": dial tcp 172.31.31.172:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Sep 16 04:25:02.325871 kubelet[2974]: E0916 04:25:02.325785 2974 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://172.31.31.172:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.31.31.172:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.Service" Sep 16 04:25:02.326168 kubelet[2974]: I0916 04:25:02.326141 2974 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 16 04:25:02.327514 kubelet[2974]: I0916 04:25:02.327486 2974 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 16 04:25:02.327856 kubelet[2974]: W0916 04:25:02.327836 2974 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 16 04:25:02.339164 kubelet[2974]: I0916 04:25:02.339124 2974 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 16 04:25:02.339393 kubelet[2974]: I0916 04:25:02.339375 2974 server.go:1289] "Started kubelet" Sep 16 04:25:02.341077 kubelet[2974]: I0916 04:25:02.340999 2974 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 16 04:25:02.344767 kubelet[2974]: I0916 04:25:02.344181 2974 server.go:317] "Adding debug handlers to kubelet server" Sep 16 04:25:02.346268 kubelet[2974]: I0916 04:25:02.346182 2974 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 16 04:25:02.347057 kubelet[2974]: I0916 04:25:02.347024 2974 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 16 04:25:02.349590 kubelet[2974]: E0916 04:25:02.347425 2974 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.31.31.172:6443/api/v1/namespaces/default/events\": dial tcp 172.31.31.172:6443: connect: connection refused" event="&Event{ObjectMeta:{ip-172-31-31-172.1865a8afbf3e48a4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-172-31-31-172,UID:ip-172-31-31-172,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-172-31-31-172,},FirstTimestamp:2025-09-16 04:25:02.339319972 +0000 UTC m=+0.487093936,LastTimestamp:2025-09-16 04:25:02.339319972 +0000 UTC m=+0.487093936,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-172-31-31-172,}" Sep 16 04:25:02.355877 kubelet[2974]: I0916 04:25:02.355112 2974 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 16 04:25:02.357104 kubelet[2974]: I0916 04:25:02.356956 2974 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 16 04:25:02.364537 kubelet[2974]: E0916 04:25:02.362682 2974 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ip-172-31-31-172\" not found" Sep 16 04:25:02.364537 kubelet[2974]: I0916 04:25:02.362768 2974 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 16 04:25:02.364537 kubelet[2974]: I0916 04:25:02.363117 2974 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 16 04:25:02.364537 kubelet[2974]: I0916 04:25:02.363208 2974 reconciler.go:26] "Reconciler: start to sync state" Sep 16 04:25:02.365621 kubelet[2974]: E0916 04:25:02.365570 2974 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.31.172:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-31-172?timeout=10s\": dial tcp 
172.31.31.172:6443: connect: connection refused" interval="200ms" Sep 16 04:25:02.367132 kubelet[2974]: E0916 04:25:02.365856 2974 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://172.31.31.172:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.31.31.172:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Sep 16 04:25:02.367132 kubelet[2974]: I0916 04:25:02.366448 2974 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 16 04:25:02.367924 kubelet[2974]: I0916 04:25:02.366554 2974 factory.go:223] Registration of the systemd container factory successfully Sep 16 04:25:02.367924 kubelet[2974]: I0916 04:25:02.367631 2974 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 16 04:25:02.368880 kubelet[2974]: E0916 04:25:02.368842 2974 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 16 04:25:02.371112 kubelet[2974]: I0916 04:25:02.371078 2974 factory.go:223] Registration of the containerd container factory successfully Sep 16 04:25:02.376880 update_engine[1973]: I20250916 04:25:02.376806 1973 update_attempter.cc:509] Updating boot flags... Sep 16 04:25:02.426149 kubelet[2974]: I0916 04:25:02.426086 2974 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 16 04:25:02.426149 kubelet[2974]: I0916 04:25:02.426139 2974 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 16 04:25:02.426333 kubelet[2974]: I0916 04:25:02.426178 2974 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
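Every failure in this stretch (the certificate signing request, the Node and Service reflectors, the event post, the lease controller) is the same underlying condition: nothing is listening on 172.31.31.172:6443 yet because the kube-apiserver static pod has not started. The lease controller's retry interval doubles on each failure, 200ms here and then 400ms, 800ms, and 1.6s further down. A standard-library sketch that probes the endpoint with the same kind of doubling backoff (address taken from the log; the cap of six attempts is an arbitrary choice for the example):

```go
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	const apiserver = "172.31.31.172:6443" // advertise address seen in the journal
	interval := 200 * time.Millisecond     // starting interval, as logged
	for attempt := 1; attempt <= 6; attempt++ {
		conn, err := net.DialTimeout("tcp", apiserver, 2*time.Second)
		if err == nil {
			conn.Close()
			fmt.Println("apiserver reachable after", attempt, "attempt(s)")
			return
		}
		fmt.Printf("attempt %d: %v; retrying in %s\n", attempt, err, interval)
		time.Sleep(interval)
		interval *= 2 // 200ms -> 400ms -> 800ms -> 1.6s, matching the logged intervals
	}
	fmt.Println("giving up")
}
```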
Sep 16 04:25:02.426333 kubelet[2974]: I0916 04:25:02.426194 2974 kubelet.go:2436] "Starting kubelet main sync loop" Sep 16 04:25:02.426333 kubelet[2974]: E0916 04:25:02.426258 2974 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 16 04:25:02.435812 kubelet[2974]: E0916 04:25:02.435092 2974 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://172.31.31.172:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.31.31.172:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 16 04:25:02.438986 kubelet[2974]: I0916 04:25:02.438946 2974 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 16 04:25:02.439177 kubelet[2974]: I0916 04:25:02.439155 2974 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 16 04:25:02.439292 kubelet[2974]: I0916 04:25:02.439274 2974 state_mem.go:36] "Initialized new in-memory state store" Sep 16 04:25:02.449135 kubelet[2974]: I0916 04:25:02.448681 2974 policy_none.go:49] "None policy: Start" Sep 16 04:25:02.449135 kubelet[2974]: I0916 04:25:02.448724 2974 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 16 04:25:02.449135 kubelet[2974]: I0916 04:25:02.448774 2974 state_mem.go:35] "Initializing new in-memory state store" Sep 16 04:25:02.462890 kubelet[2974]: E0916 04:25:02.462849 2974 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ip-172-31-31-172\" not found" Sep 16 04:25:02.476503 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 16 04:25:02.494526 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 16 04:25:02.514108 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 16 04:25:02.521296 kubelet[2974]: E0916 04:25:02.520867 2974 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 16 04:25:02.521296 kubelet[2974]: I0916 04:25:02.521162 2974 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 16 04:25:02.521296 kubelet[2974]: I0916 04:25:02.521188 2974 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 16 04:25:02.524502 kubelet[2974]: I0916 04:25:02.524457 2974 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 16 04:25:02.530257 kubelet[2974]: E0916 04:25:02.530219 2974 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 16 04:25:02.531338 kubelet[2974]: E0916 04:25:02.531307 2974 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-172-31-31-172\" not found" Sep 16 04:25:02.568500 systemd[1]: Created slice kubepods-burstable-pod91cf6f26f4bcac3ae76982724b9f0b50.slice - libcontainer container kubepods-burstable-pod91cf6f26f4bcac3ae76982724b9f0b50.slice. 
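The eviction manager starting here enforces the hard thresholds embedded in the Container Manager nodeConfig logged above: memory.available < 100Mi, nodefs.available < 10%, nodefs.inodesFree < 5%, imagefs.available < 15%, imagefs.inodesFree < 5%. A sketch of how such a threshold table can be evaluated against observed signals; the struct and the sample observations are illustrative, not the kubelet's actual types:

```go
package main

import "fmt"

// threshold mirrors the shape of the HardEvictionThresholds entries in the
// nodeConfig logged above: either an absolute quantity or a percentage.
type threshold struct {
	signal   string
	quantity float64 // absolute value in bytes/inodes; 0 when percent is used
	percent  float64 // fraction of capacity; 0 when quantity is used
}

func breached(t threshold, available, capacity float64) bool {
	limit := t.quantity
	if t.percent > 0 {
		limit = t.percent * capacity
	}
	return available < limit
}

func main() {
	thresholds := []threshold{
		{signal: "memory.available", quantity: 100 << 20}, // 100Mi
		{signal: "nodefs.available", percent: 0.10},
		{signal: "imagefs.available", percent: 0.15},
	}
	// Illustrative observations: {available, capacity} in bytes.
	observed := map[string][2]float64{
		"memory.available":  {80 << 20, 4 << 30},
		"nodefs.available":  {30 << 30, 100 << 30},
		"imagefs.available": {10 << 30, 100 << 30},
	}
	for _, t := range thresholds {
		o := observed[t.signal]
		fmt.Printf("%-18s breached=%v\n", t.signal, breached(t, o[0], o[1]))
	}
}
```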
Sep 16 04:25:02.576507 kubelet[2974]: E0916 04:25:02.576440 2974 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.31.172:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-31-172?timeout=10s\": dial tcp 172.31.31.172:6443: connect: connection refused" interval="400ms" Sep 16 04:25:02.590103 kubelet[2974]: E0916 04:25:02.590061 2974 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-31-172\" not found" node="ip-172-31-31-172" Sep 16 04:25:02.605178 systemd[1]: Created slice kubepods-burstable-pod6dfc7f8979abb284c49019b31eb7ecef.slice - libcontainer container kubepods-burstable-pod6dfc7f8979abb284c49019b31eb7ecef.slice. Sep 16 04:25:02.613527 kubelet[2974]: E0916 04:25:02.612507 2974 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-31-172\" not found" node="ip-172-31-31-172" Sep 16 04:25:02.616598 systemd[1]: Created slice kubepods-burstable-pod00f8436d6ea89eccfb3042af20e3b635.slice - libcontainer container kubepods-burstable-pod00f8436d6ea89eccfb3042af20e3b635.slice. Sep 16 04:25:02.622820 kubelet[2974]: E0916 04:25:02.622183 2974 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-31-172\" not found" node="ip-172-31-31-172" Sep 16 04:25:02.625147 kubelet[2974]: I0916 04:25:02.625090 2974 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-31-172" Sep 16 04:25:02.627368 kubelet[2974]: E0916 04:25:02.627194 2974 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://172.31.31.172:6443/api/v1/nodes\": dial tcp 172.31.31.172:6443: connect: connection refused" node="ip-172-31-31-172" Sep 16 04:25:02.670603 kubelet[2974]: I0916 04:25:02.669911 2974 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6dfc7f8979abb284c49019b31eb7ecef-kubeconfig\") pod \"kube-controller-manager-ip-172-31-31-172\" (UID: \"6dfc7f8979abb284c49019b31eb7ecef\") " pod="kube-system/kube-controller-manager-ip-172-31-31-172" Sep 16 04:25:02.670603 kubelet[2974]: I0916 04:25:02.669978 2974 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/00f8436d6ea89eccfb3042af20e3b635-kubeconfig\") pod \"kube-scheduler-ip-172-31-31-172\" (UID: \"00f8436d6ea89eccfb3042af20e3b635\") " pod="kube-system/kube-scheduler-ip-172-31-31-172" Sep 16 04:25:02.670603 kubelet[2974]: I0916 04:25:02.670016 2974 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/91cf6f26f4bcac3ae76982724b9f0b50-ca-certs\") pod \"kube-apiserver-ip-172-31-31-172\" (UID: \"91cf6f26f4bcac3ae76982724b9f0b50\") " pod="kube-system/kube-apiserver-ip-172-31-31-172" Sep 16 04:25:02.670603 kubelet[2974]: I0916 04:25:02.670051 2974 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/91cf6f26f4bcac3ae76982724b9f0b50-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-31-172\" (UID: \"91cf6f26f4bcac3ae76982724b9f0b50\") " pod="kube-system/kube-apiserver-ip-172-31-31-172" Sep 16 04:25:02.670603 kubelet[2974]: I0916 04:25:02.670096 2974 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6dfc7f8979abb284c49019b31eb7ecef-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-31-172\" (UID: \"6dfc7f8979abb284c49019b31eb7ecef\") " pod="kube-system/kube-controller-manager-ip-172-31-31-172" Sep 16 04:25:02.671208 kubelet[2974]: I0916 04:25:02.670131 2974 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6dfc7f8979abb284c49019b31eb7ecef-k8s-certs\") pod \"kube-controller-manager-ip-172-31-31-172\" (UID: \"6dfc7f8979abb284c49019b31eb7ecef\") " pod="kube-system/kube-controller-manager-ip-172-31-31-172" Sep 16 04:25:02.671208 kubelet[2974]: I0916 04:25:02.670165 2974 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6dfc7f8979abb284c49019b31eb7ecef-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-31-172\" (UID: \"6dfc7f8979abb284c49019b31eb7ecef\") " pod="kube-system/kube-controller-manager-ip-172-31-31-172" Sep 16 04:25:02.671208 kubelet[2974]: I0916 04:25:02.670198 2974 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/91cf6f26f4bcac3ae76982724b9f0b50-k8s-certs\") pod \"kube-apiserver-ip-172-31-31-172\" (UID: \"91cf6f26f4bcac3ae76982724b9f0b50\") " pod="kube-system/kube-apiserver-ip-172-31-31-172" Sep 16 04:25:02.671208 kubelet[2974]: I0916 04:25:02.670250 2974 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6dfc7f8979abb284c49019b31eb7ecef-ca-certs\") pod \"kube-controller-manager-ip-172-31-31-172\" (UID: \"6dfc7f8979abb284c49019b31eb7ecef\") " pod="kube-system/kube-controller-manager-ip-172-31-31-172" Sep 16 04:25:02.831905 kubelet[2974]: I0916 04:25:02.831857 2974 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-31-172" Sep 16 04:25:02.834001 kubelet[2974]: E0916 04:25:02.833933 2974 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://172.31.31.172:6443/api/v1/nodes\": dial tcp 172.31.31.172:6443: connect: connection refused" node="ip-172-31-31-172" Sep 16 04:25:02.895101 containerd[2008]: time="2025-09-16T04:25:02.895037071Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-31-172,Uid:91cf6f26f4bcac3ae76982724b9f0b50,Namespace:kube-system,Attempt:0,}" Sep 16 04:25:02.915429 containerd[2008]: time="2025-09-16T04:25:02.915098491Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-31-172,Uid:6dfc7f8979abb284c49019b31eb7ecef,Namespace:kube-system,Attempt:0,}" Sep 16 04:25:02.924262 containerd[2008]: time="2025-09-16T04:25:02.924185419Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-31-172,Uid:00f8436d6ea89eccfb3042af20e3b635,Namespace:kube-system,Attempt:0,}" Sep 16 04:25:02.965027 containerd[2008]: time="2025-09-16T04:25:02.964256023Z" level=info msg="connecting to shim 53d4d15a941117e889e7918818e175c40f3a08c588cb320b7f0126e6259b82be" address="unix:///run/containerd/s/c08ef118a91a102620d0caba2ec38b26a7e3d010a2efabd0fbb024127c5a0074" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:25:02.978256 kubelet[2974]: E0916 
04:25:02.978166 2974 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.31.172:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-31-172?timeout=10s\": dial tcp 172.31.31.172:6443: connect: connection refused" interval="800ms" Sep 16 04:25:03.098173 systemd[1]: Started cri-containerd-53d4d15a941117e889e7918818e175c40f3a08c588cb320b7f0126e6259b82be.scope - libcontainer container 53d4d15a941117e889e7918818e175c40f3a08c588cb320b7f0126e6259b82be. Sep 16 04:25:03.129536 containerd[2008]: time="2025-09-16T04:25:03.128984320Z" level=info msg="connecting to shim 9b8882c53705d66e47c29253ba7e8401054a2d033a952d2ee5514b48dbcf97b3" address="unix:///run/containerd/s/a38f1828e3eeb593f29a18d1bc2607f5f301360f19dab1e88971fc9dc9e3e7ca" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:25:03.144813 kubelet[2974]: E0916 04:25:03.140369 2974 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://172.31.31.172:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.31.31.172:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Sep 16 04:25:03.154415 containerd[2008]: time="2025-09-16T04:25:03.154306936Z" level=info msg="connecting to shim fb35632220b685535bb51d3796431eb1a6803bb76a596774cfd7e6a1b369505e" address="unix:///run/containerd/s/9a0a14fe64a51a073fc86104af322c224b6f6fdc553e6164700458c674a19cb4" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:25:03.247432 kubelet[2974]: I0916 04:25:03.247382 2974 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-31-172" Sep 16 04:25:03.249002 kubelet[2974]: E0916 04:25:03.248922 2974 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://172.31.31.172:6443/api/v1/nodes\": dial tcp 172.31.31.172:6443: connect: connection refused" node="ip-172-31-31-172" Sep 16 04:25:03.271771 kubelet[2974]: E0916 04:25:03.271221 2974 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://172.31.31.172:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.31.31.172:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 16 04:25:03.371551 containerd[2008]: time="2025-09-16T04:25:03.371237573Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-31-172,Uid:91cf6f26f4bcac3ae76982724b9f0b50,Namespace:kube-system,Attempt:0,} returns sandbox id \"53d4d15a941117e889e7918818e175c40f3a08c588cb320b7f0126e6259b82be\"" Sep 16 04:25:03.372347 systemd[1]: Started cri-containerd-fb35632220b685535bb51d3796431eb1a6803bb76a596774cfd7e6a1b369505e.scope - libcontainer container fb35632220b685535bb51d3796431eb1a6803bb76a596774cfd7e6a1b369505e. Sep 16 04:25:03.387938 systemd[1]: Started cri-containerd-9b8882c53705d66e47c29253ba7e8401054a2d033a952d2ee5514b48dbcf97b3.scope - libcontainer container 9b8882c53705d66e47c29253ba7e8401054a2d033a952d2ee5514b48dbcf97b3. 
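The RunPodSandbox and "connecting to shim" traffic above is the kubelet exercising the CRI RuntimeService on containerd's socket for the three static control-plane pods. A hedged sketch of the same call shape using the published CRI protobuf client (`k8s.io/cri-api`); only the Version call is harmless to run against a live node, and the sandbox config shown is a bare-bones illustration rather than what the kubelet actually sends:

```go
package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	rt := runtimeapi.NewRuntimeServiceClient(conn)
	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	// Same endpoint the kubelet reaches via --container-runtime-endpoint.
	ver, err := rt.Version(ctx, &runtimeapi.VersionRequest{})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("runtime:", ver.RuntimeName, ver.RuntimeVersion)

	// Shape of the request behind the "RunPodSandbox for &PodSandboxMetadata{...}"
	// lines above; not executed here because the kubelet owns these sandboxes.
	_ = &runtimeapi.RunPodSandboxRequest{
		Config: &runtimeapi.PodSandboxConfig{
			Metadata: &runtimeapi.PodSandboxMetadata{
				Name:      "kube-apiserver-ip-172-31-31-172",
				Namespace: "kube-system",
				Uid:       "91cf6f26f4bcac3ae76982724b9f0b50",
				Attempt:   0,
			},
		},
	}
}
```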
Sep 16 04:25:03.396177 containerd[2008]: time="2025-09-16T04:25:03.396120881Z" level=info msg="CreateContainer within sandbox \"53d4d15a941117e889e7918818e175c40f3a08c588cb320b7f0126e6259b82be\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 16 04:25:03.418776 containerd[2008]: time="2025-09-16T04:25:03.418391717Z" level=info msg="Container b464983dcfc84d86ffdd53119092b392727c53d4d5bbd6fd5b2622015659c9c0: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:25:03.444256 containerd[2008]: time="2025-09-16T04:25:03.444187925Z" level=info msg="CreateContainer within sandbox \"53d4d15a941117e889e7918818e175c40f3a08c588cb320b7f0126e6259b82be\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"b464983dcfc84d86ffdd53119092b392727c53d4d5bbd6fd5b2622015659c9c0\"" Sep 16 04:25:03.447252 containerd[2008]: time="2025-09-16T04:25:03.447141761Z" level=info msg="StartContainer for \"b464983dcfc84d86ffdd53119092b392727c53d4d5bbd6fd5b2622015659c9c0\"" Sep 16 04:25:03.458493 containerd[2008]: time="2025-09-16T04:25:03.458419469Z" level=info msg="connecting to shim b464983dcfc84d86ffdd53119092b392727c53d4d5bbd6fd5b2622015659c9c0" address="unix:///run/containerd/s/c08ef118a91a102620d0caba2ec38b26a7e3d010a2efabd0fbb024127c5a0074" protocol=ttrpc version=3 Sep 16 04:25:03.511115 containerd[2008]: time="2025-09-16T04:25:03.511004586Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-31-172,Uid:00f8436d6ea89eccfb3042af20e3b635,Namespace:kube-system,Attempt:0,} returns sandbox id \"fb35632220b685535bb51d3796431eb1a6803bb76a596774cfd7e6a1b369505e\"" Sep 16 04:25:03.524286 containerd[2008]: time="2025-09-16T04:25:03.524232306Z" level=info msg="CreateContainer within sandbox \"fb35632220b685535bb51d3796431eb1a6803bb76a596774cfd7e6a1b369505e\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 16 04:25:03.539332 systemd[1]: Started cri-containerd-b464983dcfc84d86ffdd53119092b392727c53d4d5bbd6fd5b2622015659c9c0.scope - libcontainer container b464983dcfc84d86ffdd53119092b392727c53d4d5bbd6fd5b2622015659c9c0. 
Sep 16 04:25:03.543728 containerd[2008]: time="2025-09-16T04:25:03.542501754Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-31-172,Uid:6dfc7f8979abb284c49019b31eb7ecef,Namespace:kube-system,Attempt:0,} returns sandbox id \"9b8882c53705d66e47c29253ba7e8401054a2d033a952d2ee5514b48dbcf97b3\"" Sep 16 04:25:03.554466 containerd[2008]: time="2025-09-16T04:25:03.554412942Z" level=info msg="Container c44b00e7d7f4c1fbe2d32dce99ea8966a3d1fa289454fa74b81f6dafa81d91f9: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:25:03.556840 containerd[2008]: time="2025-09-16T04:25:03.556077906Z" level=info msg="CreateContainer within sandbox \"9b8882c53705d66e47c29253ba7e8401054a2d033a952d2ee5514b48dbcf97b3\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 16 04:25:03.581715 containerd[2008]: time="2025-09-16T04:25:03.581659746Z" level=info msg="CreateContainer within sandbox \"fb35632220b685535bb51d3796431eb1a6803bb76a596774cfd7e6a1b369505e\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"c44b00e7d7f4c1fbe2d32dce99ea8966a3d1fa289454fa74b81f6dafa81d91f9\"" Sep 16 04:25:03.583493 containerd[2008]: time="2025-09-16T04:25:03.583435734Z" level=info msg="StartContainer for \"c44b00e7d7f4c1fbe2d32dce99ea8966a3d1fa289454fa74b81f6dafa81d91f9\"" Sep 16 04:25:03.586516 containerd[2008]: time="2025-09-16T04:25:03.586440882Z" level=info msg="Container 8a6b4fd4732eb63bebe8c6c6da2edca8ccb147f38f6f9081c2b2a92124c0ef9e: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:25:03.587767 kubelet[2974]: E0916 04:25:03.587624 2974 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://172.31.31.172:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-31-172&limit=500&resourceVersion=0\": dial tcp 172.31.31.172:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Sep 16 04:25:03.589713 containerd[2008]: time="2025-09-16T04:25:03.589650522Z" level=info msg="connecting to shim c44b00e7d7f4c1fbe2d32dce99ea8966a3d1fa289454fa74b81f6dafa81d91f9" address="unix:///run/containerd/s/9a0a14fe64a51a073fc86104af322c224b6f6fdc553e6164700458c674a19cb4" protocol=ttrpc version=3 Sep 16 04:25:03.606684 containerd[2008]: time="2025-09-16T04:25:03.606613182Z" level=info msg="CreateContainer within sandbox \"9b8882c53705d66e47c29253ba7e8401054a2d033a952d2ee5514b48dbcf97b3\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"8a6b4fd4732eb63bebe8c6c6da2edca8ccb147f38f6f9081c2b2a92124c0ef9e\"" Sep 16 04:25:03.609929 containerd[2008]: time="2025-09-16T04:25:03.607868718Z" level=info msg="StartContainer for \"8a6b4fd4732eb63bebe8c6c6da2edca8ccb147f38f6f9081c2b2a92124c0ef9e\"" Sep 16 04:25:03.612067 containerd[2008]: time="2025-09-16T04:25:03.612000390Z" level=info msg="connecting to shim 8a6b4fd4732eb63bebe8c6c6da2edca8ccb147f38f6f9081c2b2a92124c0ef9e" address="unix:///run/containerd/s/a38f1828e3eeb593f29a18d1bc2607f5f301360f19dab1e88971fc9dc9e3e7ca" protocol=ttrpc version=3 Sep 16 04:25:03.644409 systemd[1]: Started cri-containerd-c44b00e7d7f4c1fbe2d32dce99ea8966a3d1fa289454fa74b81f6dafa81d91f9.scope - libcontainer container c44b00e7d7f4c1fbe2d32dce99ea8966a3d1fa289454fa74b81f6dafa81d91f9. 
Sep 16 04:25:03.647524 kubelet[2974]: E0916 04:25:03.647480 2974 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://172.31.31.172:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.31.31.172:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Sep 16 04:25:03.672624 systemd[1]: Started cri-containerd-8a6b4fd4732eb63bebe8c6c6da2edca8ccb147f38f6f9081c2b2a92124c0ef9e.scope - libcontainer container 8a6b4fd4732eb63bebe8c6c6da2edca8ccb147f38f6f9081c2b2a92124c0ef9e. Sep 16 04:25:03.690262 containerd[2008]: time="2025-09-16T04:25:03.690172410Z" level=info msg="StartContainer for \"b464983dcfc84d86ffdd53119092b392727c53d4d5bbd6fd5b2622015659c9c0\" returns successfully" Sep 16 04:25:03.779542 kubelet[2974]: E0916 04:25:03.779485 2974 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.31.172:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-31-172?timeout=10s\": dial tcp 172.31.31.172:6443: connect: connection refused" interval="1.6s" Sep 16 04:25:03.824199 containerd[2008]: time="2025-09-16T04:25:03.824150011Z" level=info msg="StartContainer for \"c44b00e7d7f4c1fbe2d32dce99ea8966a3d1fa289454fa74b81f6dafa81d91f9\" returns successfully" Sep 16 04:25:03.849625 containerd[2008]: time="2025-09-16T04:25:03.848513935Z" level=info msg="StartContainer for \"8a6b4fd4732eb63bebe8c6c6da2edca8ccb147f38f6f9081c2b2a92124c0ef9e\" returns successfully" Sep 16 04:25:04.052797 kubelet[2974]: I0916 04:25:04.052447 2974 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-31-172" Sep 16 04:25:04.469909 kubelet[2974]: E0916 04:25:04.468166 2974 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-31-172\" not found" node="ip-172-31-31-172" Sep 16 04:25:04.478766 kubelet[2974]: E0916 04:25:04.478478 2974 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-31-172\" not found" node="ip-172-31-31-172" Sep 16 04:25:04.486082 kubelet[2974]: E0916 04:25:04.486047 2974 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-31-172\" not found" node="ip-172-31-31-172" Sep 16 04:25:05.489291 kubelet[2974]: E0916 04:25:05.489202 2974 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-31-172\" not found" node="ip-172-31-31-172" Sep 16 04:25:05.491450 kubelet[2974]: E0916 04:25:05.491110 2974 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-31-172\" not found" node="ip-172-31-31-172" Sep 16 04:25:05.602883 kubelet[2974]: E0916 04:25:05.602847 2974 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-31-172\" not found" node="ip-172-31-31-172" Sep 16 04:25:06.490795 kubelet[2974]: E0916 04:25:06.490519 2974 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-31-172\" not found" node="ip-172-31-31-172" Sep 16 04:25:07.494887 kubelet[2974]: E0916 04:25:07.494824 2974 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-31-172\" not found" node="ip-172-31-31-172" Sep 16 
04:25:08.062972 kubelet[2974]: E0916 04:25:08.062873 2974 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-172-31-31-172\" not found" node="ip-172-31-31-172" Sep 16 04:25:08.083013 kubelet[2974]: I0916 04:25:08.082951 2974 kubelet_node_status.go:78] "Successfully registered node" node="ip-172-31-31-172" Sep 16 04:25:08.149727 kubelet[2974]: E0916 04:25:08.149298 2974 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ip-172-31-31-172.1865a8afbf3e48a4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-172-31-31-172,UID:ip-172-31-31-172,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-172-31-31-172,},FirstTimestamp:2025-09-16 04:25:02.339319972 +0000 UTC m=+0.487093936,LastTimestamp:2025-09-16 04:25:02.339319972 +0000 UTC m=+0.487093936,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-172-31-31-172,}" Sep 16 04:25:08.166401 kubelet[2974]: I0916 04:25:08.166357 2974 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-31-172" Sep 16 04:25:08.279931 kubelet[2974]: E0916 04:25:08.279876 2974 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ip-172-31-31-172\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ip-172-31-31-172" Sep 16 04:25:08.279931 kubelet[2974]: I0916 04:25:08.279926 2974 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ip-172-31-31-172" Sep 16 04:25:08.307094 kubelet[2974]: E0916 04:25:08.307032 2974 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ip-172-31-31-172\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ip-172-31-31-172" Sep 16 04:25:08.307094 kubelet[2974]: I0916 04:25:08.307086 2974 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ip-172-31-31-172" Sep 16 04:25:08.312707 kubelet[2974]: E0916 04:25:08.312652 2974 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ip-172-31-31-172\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ip-172-31-31-172" Sep 16 04:25:08.316262 kubelet[2974]: I0916 04:25:08.314936 2974 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ip-172-31-31-172" Sep 16 04:25:08.326857 kubelet[2974]: I0916 04:25:08.326817 2974 apiserver.go:52] "Watching apiserver" Sep 16 04:25:08.347674 kubelet[2974]: E0916 04:25:08.347106 2974 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ip-172-31-31-172\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ip-172-31-31-172" Sep 16 04:25:08.364968 kubelet[2974]: I0916 04:25:08.364895 2974 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 16 04:25:13.919871 systemd[1]: Reload requested from client PID 3440 ('systemctl') (unit session-9.scope)... Sep 16 04:25:13.919901 systemd[1]: Reloading... Sep 16 04:25:14.139923 zram_generator::config[3487]: No configuration found. 
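The "Failed creating a mirror pod ... no PriorityClass with name system-node-critical was found" errors right after node registration are transient: the API server creates the system-node-critical and system-cluster-critical PriorityClasses shortly after it comes up, and the kubelet retries the mirror pods afterwards (the later kubelet instance creates them without error). A hedged client-go sketch that checks for the class; the kubeconfig path is an assumption, with an admin kubeconfig such as /etc/kubernetes/admin.conf being typical on this kind of node:

```go
package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf")
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	pc, err := cs.SchedulingV1().PriorityClasses().Get(context.Background(),
		"system-node-critical", metav1.GetOptions{})
	if err != nil {
		log.Fatal("not there yet: ", err) // the state the kubelet saw above
	}
	fmt.Println("found", pc.Name, "value", pc.Value)
}
```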
Sep 16 04:25:14.640970 systemd[1]: Reloading finished in 720 ms. Sep 16 04:25:14.693472 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:25:14.712463 systemd[1]: kubelet.service: Deactivated successfully. Sep 16 04:25:14.714235 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:25:14.714335 systemd[1]: kubelet.service: Consumed 1.415s CPU time, 129.9M memory peak. Sep 16 04:25:14.719611 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:25:15.097951 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:25:15.117964 (kubelet)[3544]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 16 04:25:15.228240 kubelet[3544]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 16 04:25:15.228240 kubelet[3544]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 16 04:25:15.228240 kubelet[3544]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 16 04:25:15.228240 kubelet[3544]: I0916 04:25:15.228102 3544 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 16 04:25:15.249674 kubelet[3544]: I0916 04:25:15.249602 3544 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 16 04:25:15.249674 kubelet[3544]: I0916 04:25:15.249652 3544 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 16 04:25:15.250117 kubelet[3544]: I0916 04:25:15.250086 3544 server.go:956] "Client rotation is on, will bootstrap in background" Sep 16 04:25:15.252682 kubelet[3544]: I0916 04:25:15.252616 3544 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Sep 16 04:25:15.257486 kubelet[3544]: I0916 04:25:15.257421 3544 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 16 04:25:15.270271 kubelet[3544]: I0916 04:25:15.270219 3544 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 16 04:25:15.278896 kubelet[3544]: I0916 04:25:15.278760 3544 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 16 04:25:15.279528 kubelet[3544]: I0916 04:25:15.279478 3544 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 16 04:25:15.285079 kubelet[3544]: I0916 04:25:15.279646 3544 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-31-172","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 16 04:25:15.285079 kubelet[3544]: I0916 04:25:15.283155 3544 topology_manager.go:138] "Creating topology manager with none policy" Sep 16 04:25:15.285079 kubelet[3544]: I0916 04:25:15.283182 3544 container_manager_linux.go:303] "Creating device plugin manager" Sep 16 04:25:15.285079 kubelet[3544]: I0916 04:25:15.283277 3544 state_mem.go:36] "Initialized new in-memory state store" Sep 16 04:25:15.285079 kubelet[3544]: I0916 04:25:15.283549 3544 kubelet.go:480] "Attempting to sync node with API server" Sep 16 04:25:15.285525 kubelet[3544]: I0916 04:25:15.283572 3544 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 16 04:25:15.286909 kubelet[3544]: I0916 04:25:15.286879 3544 kubelet.go:386] "Adding apiserver pod source" Sep 16 04:25:15.287613 kubelet[3544]: I0916 04:25:15.287578 3544 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 16 04:25:15.306767 kubelet[3544]: I0916 04:25:15.305663 3544 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 16 04:25:15.306767 kubelet[3544]: I0916 04:25:15.306631 3544 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 16 04:25:15.313917 kubelet[3544]: I0916 04:25:15.313349 3544 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 16 04:25:15.313917 kubelet[3544]: I0916 04:25:15.313431 3544 server.go:1289] "Started kubelet" Sep 16 04:25:15.320037 kubelet[3544]: I0916 04:25:15.317538 3544 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 16 04:25:15.334172 kubelet[3544]: I0916 
04:25:15.334085 3544 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 16 04:25:15.343679 kubelet[3544]: I0916 04:25:15.343603 3544 server.go:317] "Adding debug handlers to kubelet server" Sep 16 04:25:15.345759 kubelet[3544]: I0916 04:25:15.345107 3544 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 16 04:25:15.348797 kubelet[3544]: I0916 04:25:15.348387 3544 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 16 04:25:15.362011 kubelet[3544]: I0916 04:25:15.361946 3544 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 16 04:25:15.373873 kubelet[3544]: I0916 04:25:15.373817 3544 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 16 04:25:15.374234 kubelet[3544]: I0916 04:25:15.374190 3544 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 16 04:25:15.377565 kubelet[3544]: I0916 04:25:15.376913 3544 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 16 04:25:15.378806 kubelet[3544]: I0916 04:25:15.377896 3544 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 16 04:25:15.378806 kubelet[3544]: I0916 04:25:15.377951 3544 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 16 04:25:15.378806 kubelet[3544]: I0916 04:25:15.377967 3544 kubelet.go:2436] "Starting kubelet main sync loop" Sep 16 04:25:15.378806 kubelet[3544]: E0916 04:25:15.378043 3544 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 16 04:25:15.378806 kubelet[3544]: I0916 04:25:15.378272 3544 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 16 04:25:15.383817 kubelet[3544]: I0916 04:25:15.383696 3544 reconciler.go:26] "Reconciler: start to sync state" Sep 16 04:25:15.402367 kubelet[3544]: I0916 04:25:15.402117 3544 factory.go:223] Registration of the systemd container factory successfully Sep 16 04:25:15.406393 kubelet[3544]: I0916 04:25:15.406322 3544 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 16 04:25:15.420517 kubelet[3544]: E0916 04:25:15.420474 3544 kubelet.go:1600] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 16 04:25:15.421551 kubelet[3544]: I0916 04:25:15.420485 3544 factory.go:223] Registration of the containerd container factory successfully Sep 16 04:25:15.481291 kubelet[3544]: E0916 04:25:15.481122 3544 kubelet.go:2460] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 16 04:25:15.609152 kubelet[3544]: I0916 04:25:15.609043 3544 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 16 04:25:15.610996 kubelet[3544]: I0916 04:25:15.609299 3544 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 16 04:25:15.610996 kubelet[3544]: I0916 04:25:15.609337 3544 state_mem.go:36] "Initialized new in-memory state store" Sep 16 04:25:15.610996 kubelet[3544]: I0916 04:25:15.609579 3544 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 16 04:25:15.610996 kubelet[3544]: I0916 04:25:15.609600 3544 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 16 04:25:15.610996 kubelet[3544]: I0916 04:25:15.609632 3544 policy_none.go:49] "None policy: Start" Sep 16 04:25:15.610996 kubelet[3544]: I0916 04:25:15.609660 3544 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 16 04:25:15.610996 kubelet[3544]: I0916 04:25:15.609684 3544 state_mem.go:35] "Initializing new in-memory state store" Sep 16 04:25:15.611813 kubelet[3544]: I0916 04:25:15.611784 3544 state_mem.go:75] "Updated machine memory state" Sep 16 04:25:15.632387 kubelet[3544]: E0916 04:25:15.631578 3544 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 16 04:25:15.632387 kubelet[3544]: I0916 04:25:15.631867 3544 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 16 04:25:15.632387 kubelet[3544]: I0916 04:25:15.631886 3544 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 16 04:25:15.641559 kubelet[3544]: I0916 04:25:15.641467 3544 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 16 04:25:15.643335 kubelet[3544]: E0916 04:25:15.643099 3544 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 16 04:25:15.682593 kubelet[3544]: I0916 04:25:15.682450 3544 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ip-172-31-31-172" Sep 16 04:25:15.683211 kubelet[3544]: I0916 04:25:15.683133 3544 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-31-172" Sep 16 04:25:15.685593 kubelet[3544]: I0916 04:25:15.684971 3544 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ip-172-31-31-172" Sep 16 04:25:15.777037 kubelet[3544]: I0916 04:25:15.776976 3544 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-31-172" Sep 16 04:25:15.787132 kubelet[3544]: I0916 04:25:15.787063 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6dfc7f8979abb284c49019b31eb7ecef-ca-certs\") pod \"kube-controller-manager-ip-172-31-31-172\" (UID: \"6dfc7f8979abb284c49019b31eb7ecef\") " pod="kube-system/kube-controller-manager-ip-172-31-31-172" Sep 16 04:25:15.787292 kubelet[3544]: I0916 04:25:15.787139 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6dfc7f8979abb284c49019b31eb7ecef-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-31-172\" (UID: \"6dfc7f8979abb284c49019b31eb7ecef\") " pod="kube-system/kube-controller-manager-ip-172-31-31-172" Sep 16 04:25:15.787292 kubelet[3544]: I0916 04:25:15.787186 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6dfc7f8979abb284c49019b31eb7ecef-kubeconfig\") pod \"kube-controller-manager-ip-172-31-31-172\" (UID: \"6dfc7f8979abb284c49019b31eb7ecef\") " pod="kube-system/kube-controller-manager-ip-172-31-31-172" Sep 16 04:25:15.787292 kubelet[3544]: I0916 04:25:15.787226 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6dfc7f8979abb284c49019b31eb7ecef-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-31-172\" (UID: \"6dfc7f8979abb284c49019b31eb7ecef\") " pod="kube-system/kube-controller-manager-ip-172-31-31-172" Sep 16 04:25:15.787292 kubelet[3544]: I0916 04:25:15.787266 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/91cf6f26f4bcac3ae76982724b9f0b50-ca-certs\") pod \"kube-apiserver-ip-172-31-31-172\" (UID: \"91cf6f26f4bcac3ae76982724b9f0b50\") " pod="kube-system/kube-apiserver-ip-172-31-31-172" Sep 16 04:25:15.787516 kubelet[3544]: I0916 04:25:15.787303 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/91cf6f26f4bcac3ae76982724b9f0b50-k8s-certs\") pod \"kube-apiserver-ip-172-31-31-172\" (UID: \"91cf6f26f4bcac3ae76982724b9f0b50\") " pod="kube-system/kube-apiserver-ip-172-31-31-172" Sep 16 04:25:15.787516 kubelet[3544]: I0916 04:25:15.787349 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/91cf6f26f4bcac3ae76982724b9f0b50-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-31-172\" (UID: 
\"91cf6f26f4bcac3ae76982724b9f0b50\") " pod="kube-system/kube-apiserver-ip-172-31-31-172" Sep 16 04:25:15.787516 kubelet[3544]: I0916 04:25:15.787382 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6dfc7f8979abb284c49019b31eb7ecef-k8s-certs\") pod \"kube-controller-manager-ip-172-31-31-172\" (UID: \"6dfc7f8979abb284c49019b31eb7ecef\") " pod="kube-system/kube-controller-manager-ip-172-31-31-172" Sep 16 04:25:15.787516 kubelet[3544]: I0916 04:25:15.787419 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/00f8436d6ea89eccfb3042af20e3b635-kubeconfig\") pod \"kube-scheduler-ip-172-31-31-172\" (UID: \"00f8436d6ea89eccfb3042af20e3b635\") " pod="kube-system/kube-scheduler-ip-172-31-31-172" Sep 16 04:25:15.810004 kubelet[3544]: I0916 04:25:15.809924 3544 kubelet_node_status.go:124] "Node was previously registered" node="ip-172-31-31-172" Sep 16 04:25:15.810150 kubelet[3544]: I0916 04:25:15.810095 3544 kubelet_node_status.go:78] "Successfully registered node" node="ip-172-31-31-172" Sep 16 04:25:15.810208 kubelet[3544]: I0916 04:25:15.810162 3544 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 16 04:25:15.810953 containerd[2008]: time="2025-09-16T04:25:15.810874675Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 16 04:25:15.812240 kubelet[3544]: I0916 04:25:15.811676 3544 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 16 04:25:16.299373 kubelet[3544]: I0916 04:25:16.299262 3544 apiserver.go:52] "Watching apiserver" Sep 16 04:25:16.348899 systemd[1]: Created slice kubepods-besteffort-podbfe13aae_134a_4513_a4ac_2c5fcf2ff66d.slice - libcontainer container kubepods-besteffort-podbfe13aae_134a_4513_a4ac_2c5fcf2ff66d.slice. 
Sep 16 04:25:16.379548 kubelet[3544]: I0916 04:25:16.379474 3544 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 16 04:25:16.391181 kubelet[3544]: I0916 04:25:16.390932 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bfe13aae-134a-4513-a4ac-2c5fcf2ff66d-lib-modules\") pod \"kube-proxy-zn64g\" (UID: \"bfe13aae-134a-4513-a4ac-2c5fcf2ff66d\") " pod="kube-system/kube-proxy-zn64g" Sep 16 04:25:16.391181 kubelet[3544]: I0916 04:25:16.391000 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnsxr\" (UniqueName: \"kubernetes.io/projected/bfe13aae-134a-4513-a4ac-2c5fcf2ff66d-kube-api-access-qnsxr\") pod \"kube-proxy-zn64g\" (UID: \"bfe13aae-134a-4513-a4ac-2c5fcf2ff66d\") " pod="kube-system/kube-proxy-zn64g" Sep 16 04:25:16.391181 kubelet[3544]: I0916 04:25:16.391047 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/bfe13aae-134a-4513-a4ac-2c5fcf2ff66d-kube-proxy\") pod \"kube-proxy-zn64g\" (UID: \"bfe13aae-134a-4513-a4ac-2c5fcf2ff66d\") " pod="kube-system/kube-proxy-zn64g" Sep 16 04:25:16.391181 kubelet[3544]: I0916 04:25:16.391083 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/bfe13aae-134a-4513-a4ac-2c5fcf2ff66d-xtables-lock\") pod \"kube-proxy-zn64g\" (UID: \"bfe13aae-134a-4513-a4ac-2c5fcf2ff66d\") " pod="kube-system/kube-proxy-zn64g" Sep 16 04:25:16.465164 kubelet[3544]: I0916 04:25:16.464654 3544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ip-172-31-31-172" podStartSLOduration=1.464631462 podStartE2EDuration="1.464631462s" podCreationTimestamp="2025-09-16 04:25:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:25:16.425933742 +0000 UTC m=+1.293577484" watchObservedRunningTime="2025-09-16 04:25:16.464631462 +0000 UTC m=+1.332275216" Sep 16 04:25:16.504456 kubelet[3544]: I0916 04:25:16.503938 3544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ip-172-31-31-172" podStartSLOduration=1.503914914 podStartE2EDuration="1.503914914s" podCreationTimestamp="2025-09-16 04:25:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:25:16.467149098 +0000 UTC m=+1.334792852" watchObservedRunningTime="2025-09-16 04:25:16.503914914 +0000 UTC m=+1.371558680" Sep 16 04:25:16.576068 kubelet[3544]: I0916 04:25:16.575772 3544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ip-172-31-31-172" podStartSLOduration=1.575729202 podStartE2EDuration="1.575729202s" podCreationTimestamp="2025-09-16 04:25:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:25:16.506062314 +0000 UTC m=+1.373706068" watchObservedRunningTime="2025-09-16 04:25:16.575729202 +0000 UTC m=+1.443372956" Sep 16 04:25:16.666177 containerd[2008]: time="2025-09-16T04:25:16.666129535Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-proxy-zn64g,Uid:bfe13aae-134a-4513-a4ac-2c5fcf2ff66d,Namespace:kube-system,Attempt:0,}" Sep 16 04:25:16.725296 containerd[2008]: time="2025-09-16T04:25:16.725214775Z" level=info msg="connecting to shim e6ce8ff5a3f7ded927393c29056388039b70a6c58c97d664d546d0d2445e92a8" address="unix:///run/containerd/s/b51daf8b5a3dc066eb9c0a36c0edce12f353468244cb1f59c24f085e7a4870cd" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:25:16.821321 systemd[1]: Started cri-containerd-e6ce8ff5a3f7ded927393c29056388039b70a6c58c97d664d546d0d2445e92a8.scope - libcontainer container e6ce8ff5a3f7ded927393c29056388039b70a6c58c97d664d546d0d2445e92a8. Sep 16 04:25:16.929851 containerd[2008]: time="2025-09-16T04:25:16.928076540Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-zn64g,Uid:bfe13aae-134a-4513-a4ac-2c5fcf2ff66d,Namespace:kube-system,Attempt:0,} returns sandbox id \"e6ce8ff5a3f7ded927393c29056388039b70a6c58c97d664d546d0d2445e92a8\"" Sep 16 04:25:16.945418 containerd[2008]: time="2025-09-16T04:25:16.945332684Z" level=info msg="CreateContainer within sandbox \"e6ce8ff5a3f7ded927393c29056388039b70a6c58c97d664d546d0d2445e92a8\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 16 04:25:16.974914 containerd[2008]: time="2025-09-16T04:25:16.971851808Z" level=info msg="Container bb25deaba3478bfed2ad1dd2770a5ffbdb7f992a61501460604c0d56f535b1fb: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:25:16.997718 containerd[2008]: time="2025-09-16T04:25:16.997641093Z" level=info msg="CreateContainer within sandbox \"e6ce8ff5a3f7ded927393c29056388039b70a6c58c97d664d546d0d2445e92a8\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"bb25deaba3478bfed2ad1dd2770a5ffbdb7f992a61501460604c0d56f535b1fb\"" Sep 16 04:25:17.001927 containerd[2008]: time="2025-09-16T04:25:17.001857689Z" level=info msg="StartContainer for \"bb25deaba3478bfed2ad1dd2770a5ffbdb7f992a61501460604c0d56f535b1fb\"" Sep 16 04:25:17.010219 containerd[2008]: time="2025-09-16T04:25:17.010142081Z" level=info msg="connecting to shim bb25deaba3478bfed2ad1dd2770a5ffbdb7f992a61501460604c0d56f535b1fb" address="unix:///run/containerd/s/b51daf8b5a3dc066eb9c0a36c0edce12f353468244cb1f59c24f085e7a4870cd" protocol=ttrpc version=3 Sep 16 04:25:17.060048 systemd[1]: Started cri-containerd-bb25deaba3478bfed2ad1dd2770a5ffbdb7f992a61501460604c0d56f535b1fb.scope - libcontainer container bb25deaba3478bfed2ad1dd2770a5ffbdb7f992a61501460604c0d56f535b1fb. Sep 16 04:25:17.195432 containerd[2008]: time="2025-09-16T04:25:17.195265218Z" level=info msg="StartContainer for \"bb25deaba3478bfed2ad1dd2770a5ffbdb7f992a61501460604c0d56f535b1fb\" returns successfully" Sep 16 04:25:17.242272 systemd[1]: Created slice kubepods-besteffort-pod8ba6fd8a_cbf5_4417_a4f1_4265afc68c4b.slice - libcontainer container kubepods-besteffort-pod8ba6fd8a_cbf5_4417_a4f1_4265afc68c4b.slice. 
Sep 16 04:25:17.297962 kubelet[3544]: I0916 04:25:17.297897 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5lgd\" (UniqueName: \"kubernetes.io/projected/8ba6fd8a-cbf5-4417-a4f1-4265afc68c4b-kube-api-access-c5lgd\") pod \"tigera-operator-755d956888-qrdps\" (UID: \"8ba6fd8a-cbf5-4417-a4f1-4265afc68c4b\") " pod="tigera-operator/tigera-operator-755d956888-qrdps" Sep 16 04:25:17.298130 kubelet[3544]: I0916 04:25:17.297978 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/8ba6fd8a-cbf5-4417-a4f1-4265afc68c4b-var-lib-calico\") pod \"tigera-operator-755d956888-qrdps\" (UID: \"8ba6fd8a-cbf5-4417-a4f1-4265afc68c4b\") " pod="tigera-operator/tigera-operator-755d956888-qrdps" Sep 16 04:25:17.557824 containerd[2008]: time="2025-09-16T04:25:17.557430151Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-qrdps,Uid:8ba6fd8a-cbf5-4417-a4f1-4265afc68c4b,Namespace:tigera-operator,Attempt:0,}" Sep 16 04:25:17.642558 containerd[2008]: time="2025-09-16T04:25:17.641644712Z" level=info msg="connecting to shim 5f5f4ca5b7d764ecf0fe27a9c6eca9d9e7de565bd3d6f2041e55dab911c11efa" address="unix:///run/containerd/s/af8d23c113e79a050feba4a79fb00c9f170848f6bcd9f57921b49369f76ab553" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:25:17.713053 systemd[1]: Started cri-containerd-5f5f4ca5b7d764ecf0fe27a9c6eca9d9e7de565bd3d6f2041e55dab911c11efa.scope - libcontainer container 5f5f4ca5b7d764ecf0fe27a9c6eca9d9e7de565bd3d6f2041e55dab911c11efa. Sep 16 04:25:17.816588 containerd[2008]: time="2025-09-16T04:25:17.815883669Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-qrdps,Uid:8ba6fd8a-cbf5-4417-a4f1-4265afc68c4b,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"5f5f4ca5b7d764ecf0fe27a9c6eca9d9e7de565bd3d6f2041e55dab911c11efa\"" Sep 16 04:25:17.821907 containerd[2008]: time="2025-09-16T04:25:17.821837853Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 16 04:25:19.428499 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount177951957.mount: Deactivated successfully. 
Sep 16 04:25:20.215762 containerd[2008]: time="2025-09-16T04:25:20.215690157Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:25:20.217584 containerd[2008]: time="2025-09-16T04:25:20.217531353Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365" Sep 16 04:25:20.217886 containerd[2008]: time="2025-09-16T04:25:20.217852509Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:25:20.221393 containerd[2008]: time="2025-09-16T04:25:20.221325633Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:25:20.223638 containerd[2008]: time="2025-09-16T04:25:20.223549449Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 2.401197024s" Sep 16 04:25:20.223638 containerd[2008]: time="2025-09-16T04:25:20.223632705Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\"" Sep 16 04:25:20.237957 containerd[2008]: time="2025-09-16T04:25:20.237883353Z" level=info msg="CreateContainer within sandbox \"5f5f4ca5b7d764ecf0fe27a9c6eca9d9e7de565bd3d6f2041e55dab911c11efa\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 16 04:25:20.249444 containerd[2008]: time="2025-09-16T04:25:20.248532429Z" level=info msg="Container 16b2e49ffe2c8037a6d13c5e9ed195e54cb6c4114ae85344985e0beb2608971c: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:25:20.261733 containerd[2008]: time="2025-09-16T04:25:20.261663225Z" level=info msg="CreateContainer within sandbox \"5f5f4ca5b7d764ecf0fe27a9c6eca9d9e7de565bd3d6f2041e55dab911c11efa\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"16b2e49ffe2c8037a6d13c5e9ed195e54cb6c4114ae85344985e0beb2608971c\"" Sep 16 04:25:20.262810 containerd[2008]: time="2025-09-16T04:25:20.262628889Z" level=info msg="StartContainer for \"16b2e49ffe2c8037a6d13c5e9ed195e54cb6c4114ae85344985e0beb2608971c\"" Sep 16 04:25:20.266148 containerd[2008]: time="2025-09-16T04:25:20.266086389Z" level=info msg="connecting to shim 16b2e49ffe2c8037a6d13c5e9ed195e54cb6c4114ae85344985e0beb2608971c" address="unix:///run/containerd/s/af8d23c113e79a050feba4a79fb00c9f170848f6bcd9f57921b49369f76ab553" protocol=ttrpc version=3 Sep 16 04:25:20.305171 systemd[1]: Started cri-containerd-16b2e49ffe2c8037a6d13c5e9ed195e54cb6c4114ae85344985e0beb2608971c.scope - libcontainer container 16b2e49ffe2c8037a6d13c5e9ed195e54cb6c4114ae85344985e0beb2608971c. 
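Editor's note: containerd reports pulling quay.io/tigera/operator:v1.38.6 "in 2.401197024s". That figure is containerd's own measurement, but it is consistent with the surrounding entries: adding it to the PullImage request logged at 04:25:17.821837853Z lands at roughly 04:25:20.2230, just before the "Pulled image" line is written at about 04:25:20.2236. A minimal cross-check using only the timestamp and duration quoted from the log:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// PullImage request time and reported pull duration, copied from the log.
	start, _ := time.Parse(time.RFC3339Nano, "2025-09-16T04:25:17.821837853Z")
	reported, _ := time.ParseDuration("2.401197024s")

	// Prints 2025-09-16T04:25:20.223034877Z, shortly before the
	// "Pulled image ... returns image reference" entries above.
	fmt.Println(start.Add(reported).Format(time.RFC3339Nano))
}
```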
Sep 16 04:25:20.364227 containerd[2008]: time="2025-09-16T04:25:20.364160409Z" level=info msg="StartContainer for \"16b2e49ffe2c8037a6d13c5e9ed195e54cb6c4114ae85344985e0beb2608971c\" returns successfully" Sep 16 04:25:20.510655 kubelet[3544]: I0916 04:25:20.510455 3544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-zn64g" podStartSLOduration=4.510433306 podStartE2EDuration="4.510433306s" podCreationTimestamp="2025-09-16 04:25:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:25:17.501903571 +0000 UTC m=+2.369547325" watchObservedRunningTime="2025-09-16 04:25:20.510433306 +0000 UTC m=+5.378077060" Sep 16 04:25:20.512101 kubelet[3544]: I0916 04:25:20.512011 3544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-qrdps" podStartSLOduration=1.105147282 podStartE2EDuration="3.511714258s" podCreationTimestamp="2025-09-16 04:25:17 +0000 UTC" firstStartedPulling="2025-09-16 04:25:17.821280561 +0000 UTC m=+2.688924303" lastFinishedPulling="2025-09-16 04:25:20.227847525 +0000 UTC m=+5.095491279" observedRunningTime="2025-09-16 04:25:20.511432654 +0000 UTC m=+5.379076420" watchObservedRunningTime="2025-09-16 04:25:20.511714258 +0000 UTC m=+5.379358000" Sep 16 04:25:28.756617 sudo[2388]: pam_unix(sudo:session): session closed for user root Sep 16 04:25:28.781759 sshd[2385]: Connection closed by 147.75.109.163 port 58686 Sep 16 04:25:28.782791 sshd-session[2372]: pam_unix(sshd:session): session closed for user core Sep 16 04:25:28.792438 systemd[1]: sshd@8-172.31.31.172:22-147.75.109.163:58686.service: Deactivated successfully. Sep 16 04:25:28.801822 systemd[1]: session-9.scope: Deactivated successfully. Sep 16 04:25:28.802220 systemd[1]: session-9.scope: Consumed 11.702s CPU time, 225.4M memory peak. Sep 16 04:25:28.805721 systemd-logind[1971]: Session 9 logged out. Waiting for processes to exit. Sep 16 04:25:28.813565 systemd-logind[1971]: Removed session 9. Sep 16 04:25:38.638732 systemd[1]: Created slice kubepods-besteffort-podd869ee5f_71be_4f24_a025_6c3d5238a8df.slice - libcontainer container kubepods-besteffort-podd869ee5f_71be_4f24_a025_6c3d5238a8df.slice. 
Sep 16 04:25:38.758554 kubelet[3544]: I0916 04:25:38.758342 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/d869ee5f-71be-4f24-a025-6c3d5238a8df-typha-certs\") pod \"calico-typha-766bdd7d76-n6bbl\" (UID: \"d869ee5f-71be-4f24-a025-6c3d5238a8df\") " pod="calico-system/calico-typha-766bdd7d76-n6bbl" Sep 16 04:25:38.758554 kubelet[3544]: I0916 04:25:38.758411 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d869ee5f-71be-4f24-a025-6c3d5238a8df-tigera-ca-bundle\") pod \"calico-typha-766bdd7d76-n6bbl\" (UID: \"d869ee5f-71be-4f24-a025-6c3d5238a8df\") " pod="calico-system/calico-typha-766bdd7d76-n6bbl" Sep 16 04:25:38.758554 kubelet[3544]: I0916 04:25:38.758449 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgj94\" (UniqueName: \"kubernetes.io/projected/d869ee5f-71be-4f24-a025-6c3d5238a8df-kube-api-access-tgj94\") pod \"calico-typha-766bdd7d76-n6bbl\" (UID: \"d869ee5f-71be-4f24-a025-6c3d5238a8df\") " pod="calico-system/calico-typha-766bdd7d76-n6bbl" Sep 16 04:25:38.956647 containerd[2008]: time="2025-09-16T04:25:38.955242042Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-766bdd7d76-n6bbl,Uid:d869ee5f-71be-4f24-a025-6c3d5238a8df,Namespace:calico-system,Attempt:0,}" Sep 16 04:25:39.012090 containerd[2008]: time="2025-09-16T04:25:39.011791010Z" level=info msg="connecting to shim 82b81e36294b7df24ae8fd21234fa4aafcf0b7f6094ecec7ea515e0602c4bb64" address="unix:///run/containerd/s/3485f500629606368281546da5accbd3b983883b24a902923d6ae1b86e1a20a5" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:25:39.071697 systemd[1]: Created slice kubepods-besteffort-poddf27e98a_82c2_440a_845f_5a3a9e1d1c62.slice - libcontainer container kubepods-besteffort-poddf27e98a_82c2_440a_845f_5a3a9e1d1c62.slice. Sep 16 04:25:39.122968 systemd[1]: Started cri-containerd-82b81e36294b7df24ae8fd21234fa4aafcf0b7f6094ecec7ea515e0602c4bb64.scope - libcontainer container 82b81e36294b7df24ae8fd21234fa4aafcf0b7f6094ecec7ea515e0602c4bb64. 
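Editor's note: the calico-node pod whose volumes are listed next mounts flexvol-driver-host, and once the kubelet probes the FlexVolume directory it repeatedly logs the nodeagent~uds failure seen below: the uds binary under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/ is not installed yet, the driver call therefore produces no output, and unmarshalling that empty output fails. The sketch below is not the kubelet's actual driver-call code; it only reproduces the two Go error strings that appear in the log, with "uds" standing in for the missing driver binary.

```go
package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

func main() {
	// Looking up a binary that is not on PATH returns exec.ErrNotFound, which
	// formats as the first error clause in the log:
	// `exec: "uds": executable file not found in $PATH`.
	if _, err := exec.LookPath("uds"); err != nil {
		fmt.Println("driver call failed:", err)
	}

	// Because the driver never ran, its output is empty, and unmarshalling an
	// empty byte slice reproduces the second error string:
	// "unexpected end of JSON input".
	var status map[string]interface{}
	if err := json.Unmarshal([]byte(""), &status); err != nil {
		fmt.Println("failed to unmarshal output:", err)
	}
}
```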
Sep 16 04:25:39.161011 kubelet[3544]: I0916 04:25:39.160939 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df27e98a-82c2-440a-845f-5a3a9e1d1c62-lib-modules\") pod \"calico-node-fn8lj\" (UID: \"df27e98a-82c2-440a-845f-5a3a9e1d1c62\") " pod="calico-system/calico-node-fn8lj" Sep 16 04:25:39.161011 kubelet[3544]: I0916 04:25:39.161017 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df27e98a-82c2-440a-845f-5a3a9e1d1c62-tigera-ca-bundle\") pod \"calico-node-fn8lj\" (UID: \"df27e98a-82c2-440a-845f-5a3a9e1d1c62\") " pod="calico-system/calico-node-fn8lj" Sep 16 04:25:39.161806 kubelet[3544]: I0916 04:25:39.161453 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/df27e98a-82c2-440a-845f-5a3a9e1d1c62-cni-log-dir\") pod \"calico-node-fn8lj\" (UID: \"df27e98a-82c2-440a-845f-5a3a9e1d1c62\") " pod="calico-system/calico-node-fn8lj" Sep 16 04:25:39.161953 kubelet[3544]: I0916 04:25:39.161886 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpzzw\" (UniqueName: \"kubernetes.io/projected/df27e98a-82c2-440a-845f-5a3a9e1d1c62-kube-api-access-lpzzw\") pod \"calico-node-fn8lj\" (UID: \"df27e98a-82c2-440a-845f-5a3a9e1d1c62\") " pod="calico-system/calico-node-fn8lj" Sep 16 04:25:39.162482 kubelet[3544]: I0916 04:25:39.162183 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/df27e98a-82c2-440a-845f-5a3a9e1d1c62-cni-net-dir\") pod \"calico-node-fn8lj\" (UID: \"df27e98a-82c2-440a-845f-5a3a9e1d1c62\") " pod="calico-system/calico-node-fn8lj" Sep 16 04:25:39.162835 kubelet[3544]: I0916 04:25:39.162626 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/df27e98a-82c2-440a-845f-5a3a9e1d1c62-var-lib-calico\") pod \"calico-node-fn8lj\" (UID: \"df27e98a-82c2-440a-845f-5a3a9e1d1c62\") " pod="calico-system/calico-node-fn8lj" Sep 16 04:25:39.163016 kubelet[3544]: I0916 04:25:39.162971 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/df27e98a-82c2-440a-845f-5a3a9e1d1c62-node-certs\") pod \"calico-node-fn8lj\" (UID: \"df27e98a-82c2-440a-845f-5a3a9e1d1c62\") " pod="calico-system/calico-node-fn8lj" Sep 16 04:25:39.163571 kubelet[3544]: I0916 04:25:39.163366 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/df27e98a-82c2-440a-845f-5a3a9e1d1c62-xtables-lock\") pod \"calico-node-fn8lj\" (UID: \"df27e98a-82c2-440a-845f-5a3a9e1d1c62\") " pod="calico-system/calico-node-fn8lj" Sep 16 04:25:39.163943 kubelet[3544]: I0916 04:25:39.163732 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/df27e98a-82c2-440a-845f-5a3a9e1d1c62-policysync\") pod \"calico-node-fn8lj\" (UID: \"df27e98a-82c2-440a-845f-5a3a9e1d1c62\") " pod="calico-system/calico-node-fn8lj" Sep 16 04:25:39.164316 kubelet[3544]: I0916 04:25:39.164096 3544 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/df27e98a-82c2-440a-845f-5a3a9e1d1c62-cni-bin-dir\") pod \"calico-node-fn8lj\" (UID: \"df27e98a-82c2-440a-845f-5a3a9e1d1c62\") " pod="calico-system/calico-node-fn8lj" Sep 16 04:25:39.164897 kubelet[3544]: I0916 04:25:39.164670 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/df27e98a-82c2-440a-845f-5a3a9e1d1c62-flexvol-driver-host\") pod \"calico-node-fn8lj\" (UID: \"df27e98a-82c2-440a-845f-5a3a9e1d1c62\") " pod="calico-system/calico-node-fn8lj" Sep 16 04:25:39.165098 kubelet[3544]: I0916 04:25:39.165052 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/df27e98a-82c2-440a-845f-5a3a9e1d1c62-var-run-calico\") pod \"calico-node-fn8lj\" (UID: \"df27e98a-82c2-440a-845f-5a3a9e1d1c62\") " pod="calico-system/calico-node-fn8lj" Sep 16 04:25:39.275219 kubelet[3544]: E0916 04:25:39.274610 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.275219 kubelet[3544]: W0916 04:25:39.274644 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.275219 kubelet[3544]: E0916 04:25:39.274678 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.277056 kubelet[3544]: E0916 04:25:39.277018 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.277221 kubelet[3544]: W0916 04:25:39.277195 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.277814 kubelet[3544]: E0916 04:25:39.277364 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.282396 kubelet[3544]: E0916 04:25:39.282358 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.282594 kubelet[3544]: W0916 04:25:39.282567 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.282715 kubelet[3544]: E0916 04:25:39.282691 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:39.286812 kubelet[3544]: E0916 04:25:39.285927 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.286812 kubelet[3544]: W0916 04:25:39.285963 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.286812 kubelet[3544]: E0916 04:25:39.285994 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.289779 kubelet[3544]: E0916 04:25:39.289293 3544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s2zqb" podUID="faaa3987-239e-436b-9a5d-37bf1f542a64" Sep 16 04:25:39.294794 kubelet[3544]: E0916 04:25:39.293423 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.294794 kubelet[3544]: W0916 04:25:39.293458 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.294794 kubelet[3544]: E0916 04:25:39.293492 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.295515 kubelet[3544]: E0916 04:25:39.295483 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.297105 kubelet[3544]: W0916 04:25:39.296179 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.297105 kubelet[3544]: E0916 04:25:39.296231 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.301110 kubelet[3544]: E0916 04:25:39.300549 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.301110 kubelet[3544]: W0916 04:25:39.300594 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.301110 kubelet[3544]: E0916 04:25:39.300628 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:39.302697 kubelet[3544]: E0916 04:25:39.302458 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.303134 kubelet[3544]: W0916 04:25:39.302889 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.303134 kubelet[3544]: E0916 04:25:39.302937 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.304914 kubelet[3544]: E0916 04:25:39.304421 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.304914 kubelet[3544]: W0916 04:25:39.304451 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.304914 kubelet[3544]: E0916 04:25:39.304481 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.305682 kubelet[3544]: E0916 04:25:39.305442 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.305682 kubelet[3544]: W0916 04:25:39.305473 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.305682 kubelet[3544]: E0916 04:25:39.305504 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.306157 kubelet[3544]: E0916 04:25:39.306134 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.306293 kubelet[3544]: W0916 04:25:39.306264 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.306409 kubelet[3544]: E0916 04:25:39.306386 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.307802 kubelet[3544]: E0916 04:25:39.306913 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.307802 kubelet[3544]: W0916 04:25:39.306941 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.307802 kubelet[3544]: E0916 04:25:39.306970 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:39.308266 kubelet[3544]: E0916 04:25:39.308239 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.308401 kubelet[3544]: W0916 04:25:39.308376 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.308511 kubelet[3544]: E0916 04:25:39.308488 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.310179 kubelet[3544]: E0916 04:25:39.309933 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.310179 kubelet[3544]: W0916 04:25:39.309985 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.310179 kubelet[3544]: E0916 04:25:39.310016 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.310646 kubelet[3544]: E0916 04:25:39.310622 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.310997 kubelet[3544]: W0916 04:25:39.310790 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.310997 kubelet[3544]: E0916 04:25:39.310825 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.311340 kubelet[3544]: E0916 04:25:39.311316 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.311463 kubelet[3544]: W0916 04:25:39.311439 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.311879 kubelet[3544]: E0916 04:25:39.311845 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.313535 kubelet[3544]: E0916 04:25:39.312849 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.313535 kubelet[3544]: W0916 04:25:39.312881 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.313535 kubelet[3544]: E0916 04:25:39.312914 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:39.314098 kubelet[3544]: E0916 04:25:39.314069 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.314869 kubelet[3544]: W0916 04:25:39.314825 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.315038 kubelet[3544]: E0916 04:25:39.315010 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.315558 kubelet[3544]: E0916 04:25:39.315527 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.316801 kubelet[3544]: W0916 04:25:39.315722 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.316801 kubelet[3544]: E0916 04:25:39.315799 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.317290 kubelet[3544]: E0916 04:25:39.317258 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.317430 kubelet[3544]: W0916 04:25:39.317404 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.317548 kubelet[3544]: E0916 04:25:39.317525 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.318076 kubelet[3544]: E0916 04:25:39.318047 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.318804 kubelet[3544]: W0916 04:25:39.318227 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.318804 kubelet[3544]: E0916 04:25:39.318263 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.321424 kubelet[3544]: E0916 04:25:39.321368 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.321424 kubelet[3544]: W0916 04:25:39.321411 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.321628 kubelet[3544]: E0916 04:25:39.321444 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:39.325220 kubelet[3544]: E0916 04:25:39.325166 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.325220 kubelet[3544]: W0916 04:25:39.325207 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.325449 kubelet[3544]: E0916 04:25:39.325240 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.326248 kubelet[3544]: E0916 04:25:39.326207 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.326356 kubelet[3544]: W0916 04:25:39.326242 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.326356 kubelet[3544]: E0916 04:25:39.326314 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.327633 kubelet[3544]: E0916 04:25:39.327586 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.327633 kubelet[3544]: W0916 04:25:39.327623 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.329258 kubelet[3544]: E0916 04:25:39.327654 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.329451 kubelet[3544]: E0916 04:25:39.329422 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.329564 kubelet[3544]: W0916 04:25:39.329539 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.329671 kubelet[3544]: E0916 04:25:39.329648 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.330376 kubelet[3544]: E0916 04:25:39.330169 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.330376 kubelet[3544]: W0916 04:25:39.330195 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.330376 kubelet[3544]: E0916 04:25:39.330219 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:39.330836 kubelet[3544]: E0916 04:25:39.330784 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.331150 kubelet[3544]: W0916 04:25:39.330936 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.331150 kubelet[3544]: E0916 04:25:39.330969 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.331912 kubelet[3544]: E0916 04:25:39.331883 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.332085 kubelet[3544]: W0916 04:25:39.332058 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.332201 kubelet[3544]: E0916 04:25:39.332178 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.338169 kubelet[3544]: E0916 04:25:39.337864 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.338169 kubelet[3544]: W0916 04:25:39.337904 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.338169 kubelet[3544]: E0916 04:25:39.337938 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.339974 kubelet[3544]: E0916 04:25:39.339935 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.340191 kubelet[3544]: W0916 04:25:39.340161 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.340304 kubelet[3544]: E0916 04:25:39.340280 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.341007 kubelet[3544]: E0916 04:25:39.340974 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.342777 kubelet[3544]: W0916 04:25:39.341159 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.342777 kubelet[3544]: E0916 04:25:39.341197 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:39.343926 kubelet[3544]: E0916 04:25:39.343889 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.344309 kubelet[3544]: W0916 04:25:39.344091 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.344309 kubelet[3544]: E0916 04:25:39.344132 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.345630 kubelet[3544]: E0916 04:25:39.345590 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.346103 kubelet[3544]: W0916 04:25:39.345850 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.346103 kubelet[3544]: E0916 04:25:39.345888 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.348963 kubelet[3544]: E0916 04:25:39.348921 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.349178 kubelet[3544]: W0916 04:25:39.349148 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.349804 kubelet[3544]: E0916 04:25:39.349302 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.350382 kubelet[3544]: E0916 04:25:39.350352 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.350562 kubelet[3544]: W0916 04:25:39.350496 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.350562 kubelet[3544]: E0916 04:25:39.350534 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.351799 kubelet[3544]: E0916 04:25:39.351093 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.352305 kubelet[3544]: W0916 04:25:39.352036 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.352305 kubelet[3544]: E0916 04:25:39.352086 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:39.352696 kubelet[3544]: E0916 04:25:39.352671 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.352844 kubelet[3544]: W0916 04:25:39.352819 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.353050 kubelet[3544]: E0916 04:25:39.352955 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.354022 kubelet[3544]: E0916 04:25:39.353983 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.354789 kubelet[3544]: W0916 04:25:39.354207 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.354789 kubelet[3544]: E0916 04:25:39.354250 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.355313 kubelet[3544]: E0916 04:25:39.355282 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.355554 kubelet[3544]: W0916 04:25:39.355427 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.355554 kubelet[3544]: E0916 04:25:39.355466 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.357281 kubelet[3544]: E0916 04:25:39.357228 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.357609 kubelet[3544]: W0916 04:25:39.357472 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.357609 kubelet[3544]: E0916 04:25:39.357515 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.359078 kubelet[3544]: E0916 04:25:39.358992 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.359078 kubelet[3544]: W0916 04:25:39.359030 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.359490 kubelet[3544]: E0916 04:25:39.359294 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:39.359891 kubelet[3544]: E0916 04:25:39.359865 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.360068 kubelet[3544]: W0916 04:25:39.360026 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.360202 kubelet[3544]: E0916 04:25:39.360178 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.368401 kubelet[3544]: E0916 04:25:39.368330 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.368804 kubelet[3544]: W0916 04:25:39.368522 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.368804 kubelet[3544]: E0916 04:25:39.368556 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.368804 kubelet[3544]: I0916 04:25:39.368883 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/faaa3987-239e-436b-9a5d-37bf1f542a64-kubelet-dir\") pod \"csi-node-driver-s2zqb\" (UID: \"faaa3987-239e-436b-9a5d-37bf1f542a64\") " pod="calico-system/csi-node-driver-s2zqb" Sep 16 04:25:39.370316 kubelet[3544]: E0916 04:25:39.370264 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.370316 kubelet[3544]: W0916 04:25:39.370305 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.371279 kubelet[3544]: E0916 04:25:39.370339 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.372424 kubelet[3544]: E0916 04:25:39.372376 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.372637 kubelet[3544]: W0916 04:25:39.372567 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.372637 kubelet[3544]: E0916 04:25:39.372605 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:39.373480 kubelet[3544]: E0916 04:25:39.373328 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.373480 kubelet[3544]: W0916 04:25:39.373356 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.373480 kubelet[3544]: E0916 04:25:39.373378 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.374165 kubelet[3544]: I0916 04:25:39.373818 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/faaa3987-239e-436b-9a5d-37bf1f542a64-varrun\") pod \"csi-node-driver-s2zqb\" (UID: \"faaa3987-239e-436b-9a5d-37bf1f542a64\") " pod="calico-system/csi-node-driver-s2zqb" Sep 16 04:25:39.374868 kubelet[3544]: E0916 04:25:39.374601 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.375196 kubelet[3544]: W0916 04:25:39.375031 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.375928 kubelet[3544]: E0916 04:25:39.375854 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.376215 kubelet[3544]: I0916 04:25:39.376090 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/faaa3987-239e-436b-9a5d-37bf1f542a64-registration-dir\") pod \"csi-node-driver-s2zqb\" (UID: \"faaa3987-239e-436b-9a5d-37bf1f542a64\") " pod="calico-system/csi-node-driver-s2zqb" Sep 16 04:25:39.377035 kubelet[3544]: E0916 04:25:39.376802 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.377035 kubelet[3544]: W0916 04:25:39.376973 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.377035 kubelet[3544]: E0916 04:25:39.377006 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.380056 kubelet[3544]: E0916 04:25:39.379825 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.380056 kubelet[3544]: W0916 04:25:39.379858 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.380056 kubelet[3544]: E0916 04:25:39.379891 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:39.382909 kubelet[3544]: E0916 04:25:39.382870 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.383282 kubelet[3544]: W0916 04:25:39.383048 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.383282 kubelet[3544]: E0916 04:25:39.383087 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.383599 kubelet[3544]: E0916 04:25:39.383579 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.383706 kubelet[3544]: W0916 04:25:39.383684 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.383986 kubelet[3544]: E0916 04:25:39.383815 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.384308 kubelet[3544]: E0916 04:25:39.384285 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.384538 kubelet[3544]: W0916 04:25:39.384510 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.385339 kubelet[3544]: E0916 04:25:39.384792 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.385339 kubelet[3544]: I0916 04:25:39.384864 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l6hd\" (UniqueName: \"kubernetes.io/projected/faaa3987-239e-436b-9a5d-37bf1f542a64-kube-api-access-7l6hd\") pod \"csi-node-driver-s2zqb\" (UID: \"faaa3987-239e-436b-9a5d-37bf1f542a64\") " pod="calico-system/csi-node-driver-s2zqb" Sep 16 04:25:39.388178 kubelet[3544]: E0916 04:25:39.387250 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.388965 kubelet[3544]: W0916 04:25:39.388600 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.388965 kubelet[3544]: E0916 04:25:39.388652 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:39.388965 kubelet[3544]: I0916 04:25:39.388697 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/faaa3987-239e-436b-9a5d-37bf1f542a64-socket-dir\") pod \"csi-node-driver-s2zqb\" (UID: \"faaa3987-239e-436b-9a5d-37bf1f542a64\") " pod="calico-system/csi-node-driver-s2zqb" Sep 16 04:25:39.390694 kubelet[3544]: E0916 04:25:39.390482 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.390694 kubelet[3544]: W0916 04:25:39.390515 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.390694 kubelet[3544]: E0916 04:25:39.390545 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.395730 kubelet[3544]: E0916 04:25:39.394308 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.396774 kubelet[3544]: W0916 04:25:39.396200 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.399251 kubelet[3544]: E0916 04:25:39.398508 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.400380 kubelet[3544]: E0916 04:25:39.400318 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.400380 kubelet[3544]: W0916 04:25:39.400362 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.400546 kubelet[3544]: E0916 04:25:39.400402 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.403010 kubelet[3544]: E0916 04:25:39.402948 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.403182 kubelet[3544]: W0916 04:25:39.403049 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.403182 kubelet[3544]: E0916 04:25:39.403082 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:39.406208 containerd[2008]: time="2025-09-16T04:25:39.406144528Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-fn8lj,Uid:df27e98a-82c2-440a-845f-5a3a9e1d1c62,Namespace:calico-system,Attempt:0,}" Sep 16 04:25:39.465568 containerd[2008]: time="2025-09-16T04:25:39.465498040Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-766bdd7d76-n6bbl,Uid:d869ee5f-71be-4f24-a025-6c3d5238a8df,Namespace:calico-system,Attempt:0,} returns sandbox id \"82b81e36294b7df24ae8fd21234fa4aafcf0b7f6094ecec7ea515e0602c4bb64\"" Sep 16 04:25:39.470431 containerd[2008]: time="2025-09-16T04:25:39.468595612Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 16 04:25:39.489217 containerd[2008]: time="2025-09-16T04:25:39.489135016Z" level=info msg="connecting to shim e108066c7f64eb0af58066ac6f3a1a92ef0a47c036ad4225c18aebc017d119d1" address="unix:///run/containerd/s/d7d3fb393d27c594bf0192da39d757667c453a16c81d94a2a28da61015d62837" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:25:39.492700 kubelet[3544]: E0916 04:25:39.492605 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.492700 kubelet[3544]: W0916 04:25:39.492651 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.492700 kubelet[3544]: E0916 04:25:39.492684 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.496125 kubelet[3544]: E0916 04:25:39.496017 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.496503 kubelet[3544]: W0916 04:25:39.496143 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.496503 kubelet[3544]: E0916 04:25:39.496179 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.498662 kubelet[3544]: E0916 04:25:39.498534 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.499302 kubelet[3544]: W0916 04:25:39.499034 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.499302 kubelet[3544]: E0916 04:25:39.499082 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:39.500250 kubelet[3544]: E0916 04:25:39.500135 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.500250 kubelet[3544]: W0916 04:25:39.500169 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.500250 kubelet[3544]: E0916 04:25:39.500199 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.501567 kubelet[3544]: E0916 04:25:39.501307 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.501567 kubelet[3544]: W0916 04:25:39.501341 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.501567 kubelet[3544]: E0916 04:25:39.501373 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.503328 kubelet[3544]: E0916 04:25:39.503172 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.503601 kubelet[3544]: W0916 04:25:39.503485 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.503601 kubelet[3544]: E0916 04:25:39.503529 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.505148 kubelet[3544]: E0916 04:25:39.504968 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.505148 kubelet[3544]: W0916 04:25:39.505002 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.505148 kubelet[3544]: E0916 04:25:39.505033 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.508076 kubelet[3544]: E0916 04:25:39.507795 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.508076 kubelet[3544]: W0916 04:25:39.507830 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.508076 kubelet[3544]: E0916 04:25:39.507862 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:39.509354 kubelet[3544]: E0916 04:25:39.509237 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.509354 kubelet[3544]: W0916 04:25:39.509290 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.509354 kubelet[3544]: E0916 04:25:39.509320 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.510190 kubelet[3544]: E0916 04:25:39.510100 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.510190 kubelet[3544]: W0916 04:25:39.510130 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.510190 kubelet[3544]: E0916 04:25:39.510160 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.512063 kubelet[3544]: E0916 04:25:39.511929 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.512866 kubelet[3544]: W0916 04:25:39.512243 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.512866 kubelet[3544]: E0916 04:25:39.512287 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.516092 kubelet[3544]: E0916 04:25:39.514751 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.516092 kubelet[3544]: W0916 04:25:39.514815 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.516092 kubelet[3544]: E0916 04:25:39.514853 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.517155 kubelet[3544]: E0916 04:25:39.516864 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.517155 kubelet[3544]: W0916 04:25:39.516908 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.517155 kubelet[3544]: E0916 04:25:39.516941 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:39.519333 kubelet[3544]: E0916 04:25:39.518836 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.519333 kubelet[3544]: W0916 04:25:39.518895 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.519333 kubelet[3544]: E0916 04:25:39.518928 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.520586 kubelet[3544]: E0916 04:25:39.520121 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.520586 kubelet[3544]: W0916 04:25:39.520155 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.520586 kubelet[3544]: E0916 04:25:39.520186 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.521867 kubelet[3544]: E0916 04:25:39.521815 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.522069 kubelet[3544]: W0916 04:25:39.521997 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.522069 kubelet[3544]: E0916 04:25:39.522038 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.526070 kubelet[3544]: E0916 04:25:39.524503 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.526070 kubelet[3544]: W0916 04:25:39.524535 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.526070 kubelet[3544]: E0916 04:25:39.524572 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.526979 kubelet[3544]: E0916 04:25:39.526453 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.526979 kubelet[3544]: W0916 04:25:39.526510 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.526979 kubelet[3544]: E0916 04:25:39.526546 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:39.529005 kubelet[3544]: E0916 04:25:39.528974 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.532909 kubelet[3544]: W0916 04:25:39.532156 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.532909 kubelet[3544]: E0916 04:25:39.532213 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.534528 kubelet[3544]: E0916 04:25:39.533535 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.537473 kubelet[3544]: W0916 04:25:39.536215 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.537473 kubelet[3544]: E0916 04:25:39.536267 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.541767 kubelet[3544]: E0916 04:25:39.539469 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.541767 kubelet[3544]: W0916 04:25:39.539509 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.541767 kubelet[3544]: E0916 04:25:39.539543 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.544054 kubelet[3544]: E0916 04:25:39.542977 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.544054 kubelet[3544]: W0916 04:25:39.543019 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.544054 kubelet[3544]: E0916 04:25:39.543054 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.546088 kubelet[3544]: E0916 04:25:39.546035 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.546088 kubelet[3544]: W0916 04:25:39.546077 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.546472 kubelet[3544]: E0916 04:25:39.546111 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:39.548394 kubelet[3544]: E0916 04:25:39.548323 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.548394 kubelet[3544]: W0916 04:25:39.548386 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.548622 kubelet[3544]: E0916 04:25:39.548422 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.550753 kubelet[3544]: E0916 04:25:39.550686 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.551249 kubelet[3544]: W0916 04:25:39.551064 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.551390 kubelet[3544]: E0916 04:25:39.551347 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.603045 systemd[1]: Started cri-containerd-e108066c7f64eb0af58066ac6f3a1a92ef0a47c036ad4225c18aebc017d119d1.scope - libcontainer container e108066c7f64eb0af58066ac6f3a1a92ef0a47c036ad4225c18aebc017d119d1. Sep 16 04:25:39.604134 kubelet[3544]: E0916 04:25:39.604073 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:39.604134 kubelet[3544]: W0916 04:25:39.604101 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:39.604264 kubelet[3544]: E0916 04:25:39.604132 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:39.729928 containerd[2008]: time="2025-09-16T04:25:39.729699953Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-fn8lj,Uid:df27e98a-82c2-440a-845f-5a3a9e1d1c62,Namespace:calico-system,Attempt:0,} returns sandbox id \"e108066c7f64eb0af58066ac6f3a1a92ef0a47c036ad4225c18aebc017d119d1\"" Sep 16 04:25:40.676385 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3598476700.mount: Deactivated successfully. 
Sep 16 04:25:41.383051 kubelet[3544]: E0916 04:25:41.381122 3544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s2zqb" podUID="faaa3987-239e-436b-9a5d-37bf1f542a64" Sep 16 04:25:41.762138 containerd[2008]: time="2025-09-16T04:25:41.761832956Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:25:41.765861 containerd[2008]: time="2025-09-16T04:25:41.765795776Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775" Sep 16 04:25:41.766570 containerd[2008]: time="2025-09-16T04:25:41.766511192Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:25:41.774014 containerd[2008]: time="2025-09-16T04:25:41.773943032Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:25:41.777842 containerd[2008]: time="2025-09-16T04:25:41.777722876Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 2.307370632s" Sep 16 04:25:41.777842 containerd[2008]: time="2025-09-16T04:25:41.777835928Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\"" Sep 16 04:25:41.783101 containerd[2008]: time="2025-09-16T04:25:41.783031652Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 16 04:25:41.827276 containerd[2008]: time="2025-09-16T04:25:41.827204612Z" level=info msg="CreateContainer within sandbox \"82b81e36294b7df24ae8fd21234fa4aafcf0b7f6094ecec7ea515e0602c4bb64\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 16 04:25:41.841004 containerd[2008]: time="2025-09-16T04:25:41.840934532Z" level=info msg="Container d113c90f738778108402255502fb1eca06d6537bb6c7515bd780951910d93fdd: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:25:41.863310 containerd[2008]: time="2025-09-16T04:25:41.863216444Z" level=info msg="CreateContainer within sandbox \"82b81e36294b7df24ae8fd21234fa4aafcf0b7f6094ecec7ea515e0602c4bb64\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"d113c90f738778108402255502fb1eca06d6537bb6c7515bd780951910d93fdd\"" Sep 16 04:25:41.864976 containerd[2008]: time="2025-09-16T04:25:41.864916112Z" level=info msg="StartContainer for \"d113c90f738778108402255502fb1eca06d6537bb6c7515bd780951910d93fdd\"" Sep 16 04:25:41.867506 containerd[2008]: time="2025-09-16T04:25:41.867434588Z" level=info msg="connecting to shim d113c90f738778108402255502fb1eca06d6537bb6c7515bd780951910d93fdd" address="unix:///run/containerd/s/3485f500629606368281546da5accbd3b983883b24a902923d6ae1b86e1a20a5" protocol=ttrpc version=3 Sep 16 04:25:41.938345 systemd[1]: Started 
cri-containerd-d113c90f738778108402255502fb1eca06d6537bb6c7515bd780951910d93fdd.scope - libcontainer container d113c90f738778108402255502fb1eca06d6537bb6c7515bd780951910d93fdd. Sep 16 04:25:42.063653 containerd[2008]: time="2025-09-16T04:25:42.063577037Z" level=info msg="StartContainer for \"d113c90f738778108402255502fb1eca06d6537bb6c7515bd780951910d93fdd\" returns successfully" Sep 16 04:25:42.691591 kubelet[3544]: E0916 04:25:42.691541 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.691591 kubelet[3544]: W0916 04:25:42.691579 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.692317 kubelet[3544]: E0916 04:25:42.691634 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.693052 kubelet[3544]: E0916 04:25:42.692972 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.693164 kubelet[3544]: W0916 04:25:42.693033 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.693164 kubelet[3544]: E0916 04:25:42.693135 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.693753 kubelet[3544]: E0916 04:25:42.693670 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.693753 kubelet[3544]: W0916 04:25:42.693726 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.693753 kubelet[3544]: E0916 04:25:42.693794 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.693753 kubelet[3544]: E0916 04:25:42.694188 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.693753 kubelet[3544]: W0916 04:25:42.694230 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.693753 kubelet[3544]: E0916 04:25:42.694252 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:42.693753 kubelet[3544]: E0916 04:25:42.694603 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.693753 kubelet[3544]: W0916 04:25:42.694621 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.693753 kubelet[3544]: E0916 04:25:42.694669 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.696263 kubelet[3544]: E0916 04:25:42.696211 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.696263 kubelet[3544]: W0916 04:25:42.696251 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.696420 kubelet[3544]: E0916 04:25:42.696285 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.697479 kubelet[3544]: E0916 04:25:42.697417 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.698891 kubelet[3544]: W0916 04:25:42.697559 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.698891 kubelet[3544]: E0916 04:25:42.697594 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.699172 kubelet[3544]: E0916 04:25:42.699112 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.699172 kubelet[3544]: W0916 04:25:42.699153 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.699280 kubelet[3544]: E0916 04:25:42.699186 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.700340 kubelet[3544]: E0916 04:25:42.700166 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.700340 kubelet[3544]: W0916 04:25:42.700330 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.700537 kubelet[3544]: E0916 04:25:42.700365 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:42.701986 kubelet[3544]: E0916 04:25:42.701928 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.701986 kubelet[3544]: W0916 04:25:42.701972 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.702156 kubelet[3544]: E0916 04:25:42.702006 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.703143 kubelet[3544]: E0916 04:25:42.703066 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.703143 kubelet[3544]: W0916 04:25:42.703134 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.703968 kubelet[3544]: E0916 04:25:42.703167 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.703968 kubelet[3544]: E0916 04:25:42.703560 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.703968 kubelet[3544]: W0916 04:25:42.703583 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.703968 kubelet[3544]: E0916 04:25:42.703607 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.704196 kubelet[3544]: E0916 04:25:42.704127 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.704196 kubelet[3544]: W0916 04:25:42.704149 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.704196 kubelet[3544]: E0916 04:25:42.704172 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.705503 kubelet[3544]: E0916 04:25:42.705449 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.705503 kubelet[3544]: W0916 04:25:42.705490 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.705668 kubelet[3544]: E0916 04:25:42.705523 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:42.707113 kubelet[3544]: E0916 04:25:42.707057 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.707239 kubelet[3544]: W0916 04:25:42.707130 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.707239 kubelet[3544]: E0916 04:25:42.707165 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.747772 kubelet[3544]: E0916 04:25:42.746975 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.747772 kubelet[3544]: W0916 04:25:42.747019 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.747772 kubelet[3544]: E0916 04:25:42.747076 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.747772 kubelet[3544]: E0916 04:25:42.747609 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.747772 kubelet[3544]: W0916 04:25:42.747630 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.747772 kubelet[3544]: E0916 04:25:42.747652 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.750788 kubelet[3544]: E0916 04:25:42.749358 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.750788 kubelet[3544]: W0916 04:25:42.749397 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.750788 kubelet[3544]: E0916 04:25:42.749430 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.752262 kubelet[3544]: E0916 04:25:42.752211 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.752262 kubelet[3544]: W0916 04:25:42.752249 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.752496 kubelet[3544]: E0916 04:25:42.752281 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:42.752685 kubelet[3544]: E0916 04:25:42.752651 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.752685 kubelet[3544]: W0916 04:25:42.752679 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.752926 kubelet[3544]: E0916 04:25:42.752703 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.753963 kubelet[3544]: E0916 04:25:42.753912 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.753963 kubelet[3544]: W0916 04:25:42.753953 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.754982 kubelet[3544]: E0916 04:25:42.753987 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.754982 kubelet[3544]: E0916 04:25:42.754411 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.754982 kubelet[3544]: W0916 04:25:42.754434 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.754982 kubelet[3544]: E0916 04:25:42.754459 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.755399 kubelet[3544]: E0916 04:25:42.755359 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.755649 kubelet[3544]: W0916 04:25:42.755472 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.755838 kubelet[3544]: E0916 04:25:42.755654 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.756484 kubelet[3544]: E0916 04:25:42.756433 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.756608 kubelet[3544]: W0916 04:25:42.756510 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.756608 kubelet[3544]: E0916 04:25:42.756544 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:42.758123 kubelet[3544]: E0916 04:25:42.758070 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.758123 kubelet[3544]: W0916 04:25:42.758110 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.758708 kubelet[3544]: E0916 04:25:42.758145 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.758708 kubelet[3544]: E0916 04:25:42.758529 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.758708 kubelet[3544]: W0916 04:25:42.758549 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.758708 kubelet[3544]: E0916 04:25:42.758572 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.759832 kubelet[3544]: E0916 04:25:42.758949 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.759832 kubelet[3544]: W0916 04:25:42.758978 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.759832 kubelet[3544]: E0916 04:25:42.759003 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.760222 kubelet[3544]: E0916 04:25:42.760022 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.760222 kubelet[3544]: W0916 04:25:42.760050 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.760222 kubelet[3544]: E0916 04:25:42.760080 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.760505 kubelet[3544]: E0916 04:25:42.760461 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.760505 kubelet[3544]: W0916 04:25:42.760499 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.760628 kubelet[3544]: E0916 04:25:42.760522 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:42.762300 kubelet[3544]: E0916 04:25:42.762248 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.762300 kubelet[3544]: W0916 04:25:42.762288 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.762489 kubelet[3544]: E0916 04:25:42.762322 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.763120 kubelet[3544]: E0916 04:25:42.763073 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.763120 kubelet[3544]: W0916 04:25:42.763111 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.763373 kubelet[3544]: E0916 04:25:42.763144 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.763581 kubelet[3544]: E0916 04:25:42.763546 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.763581 kubelet[3544]: W0916 04:25:42.763576 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.763685 kubelet[3544]: E0916 04:25:42.763602 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:25:42.765166 kubelet[3544]: E0916 04:25:42.765115 3544 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:25:42.765166 kubelet[3544]: W0916 04:25:42.765155 3544 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:25:42.765358 kubelet[3544]: E0916 04:25:42.765188 3544 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:25:43.378646 kubelet[3544]: E0916 04:25:43.378473 3544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s2zqb" podUID="faaa3987-239e-436b-9a5d-37bf1f542a64" Sep 16 04:25:43.390711 containerd[2008]: time="2025-09-16T04:25:43.390646172Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:25:43.392527 containerd[2008]: time="2025-09-16T04:25:43.392473052Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814" Sep 16 04:25:43.395780 containerd[2008]: time="2025-09-16T04:25:43.395365376Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:25:43.400575 containerd[2008]: time="2025-09-16T04:25:43.400511852Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:25:43.402012 containerd[2008]: time="2025-09-16T04:25:43.401965688Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.61870984s" Sep 16 04:25:43.402196 containerd[2008]: time="2025-09-16T04:25:43.402165704Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 16 04:25:43.413806 containerd[2008]: time="2025-09-16T04:25:43.413635616Z" level=info msg="CreateContainer within sandbox \"e108066c7f64eb0af58066ac6f3a1a92ef0a47c036ad4225c18aebc017d119d1\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 16 04:25:43.435305 containerd[2008]: time="2025-09-16T04:25:43.435231236Z" level=info msg="Container d459b4ec64e408a3879dc1a395bbfba70dafda10f7ea034e426298eadb744c7b: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:25:43.459071 containerd[2008]: time="2025-09-16T04:25:43.459003404Z" level=info msg="CreateContainer within sandbox \"e108066c7f64eb0af58066ac6f3a1a92ef0a47c036ad4225c18aebc017d119d1\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"d459b4ec64e408a3879dc1a395bbfba70dafda10f7ea034e426298eadb744c7b\"" Sep 16 04:25:43.460280 containerd[2008]: time="2025-09-16T04:25:43.460175132Z" level=info msg="StartContainer for \"d459b4ec64e408a3879dc1a395bbfba70dafda10f7ea034e426298eadb744c7b\"" Sep 16 04:25:43.463761 containerd[2008]: time="2025-09-16T04:25:43.463678736Z" level=info msg="connecting to shim d459b4ec64e408a3879dc1a395bbfba70dafda10f7ea034e426298eadb744c7b" address="unix:///run/containerd/s/d7d3fb393d27c594bf0192da39d757667c453a16c81d94a2a28da61015d62837" protocol=ttrpc version=3 Sep 16 04:25:43.506042 systemd[1]: Started 
cri-containerd-d459b4ec64e408a3879dc1a395bbfba70dafda10f7ea034e426298eadb744c7b.scope - libcontainer container d459b4ec64e408a3879dc1a395bbfba70dafda10f7ea034e426298eadb744c7b. Sep 16 04:25:43.608254 containerd[2008]: time="2025-09-16T04:25:43.608147421Z" level=info msg="StartContainer for \"d459b4ec64e408a3879dc1a395bbfba70dafda10f7ea034e426298eadb744c7b\" returns successfully" Sep 16 04:25:43.619840 kubelet[3544]: I0916 04:25:43.619568 3544 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:25:43.631576 systemd[1]: cri-containerd-d459b4ec64e408a3879dc1a395bbfba70dafda10f7ea034e426298eadb744c7b.scope: Deactivated successfully. Sep 16 04:25:43.641600 containerd[2008]: time="2025-09-16T04:25:43.641527713Z" level=info msg="received exit event container_id:\"d459b4ec64e408a3879dc1a395bbfba70dafda10f7ea034e426298eadb744c7b\" id:\"d459b4ec64e408a3879dc1a395bbfba70dafda10f7ea034e426298eadb744c7b\" pid:4244 exited_at:{seconds:1757996743 nanos:640715001}" Sep 16 04:25:43.642639 containerd[2008]: time="2025-09-16T04:25:43.642576381Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d459b4ec64e408a3879dc1a395bbfba70dafda10f7ea034e426298eadb744c7b\" id:\"d459b4ec64e408a3879dc1a395bbfba70dafda10f7ea034e426298eadb744c7b\" pid:4244 exited_at:{seconds:1757996743 nanos:640715001}" Sep 16 04:25:43.664374 kubelet[3544]: I0916 04:25:43.664263 3544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-766bdd7d76-n6bbl" podStartSLOduration=3.350233385 podStartE2EDuration="5.664236273s" podCreationTimestamp="2025-09-16 04:25:38 +0000 UTC" firstStartedPulling="2025-09-16 04:25:39.46810108 +0000 UTC m=+24.335744834" lastFinishedPulling="2025-09-16 04:25:41.782103884 +0000 UTC m=+26.649747722" observedRunningTime="2025-09-16 04:25:42.698634656 +0000 UTC m=+27.566278398" watchObservedRunningTime="2025-09-16 04:25:43.664236273 +0000 UTC m=+28.531880027" Sep 16 04:25:43.699106 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d459b4ec64e408a3879dc1a395bbfba70dafda10f7ea034e426298eadb744c7b-rootfs.mount: Deactivated successfully. 
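The driver-call failures repeated above come from kubelet's FlexVolume plugin probe: it shells out to /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the argument init, the executable is not present on the node, so stdout is empty and unmarshalling "" yields "unexpected end of JSON input". For orientation only, a FlexVolume driver is expected to answer init with a JSON status object on stdout; below is a minimal Go sketch of that call convention (a hypothetical stand-in, not the nodeagent~uds driver that the directory name refers to).

    package main

    import (
        "encoding/json"
        "fmt"
        "os"
    )

    // driverStatus mirrors the JSON object a FlexVolume driver prints on
    // stdout for every call; an empty stdout is what produces the
    // "unexpected end of JSON input" errors in the kubelet entries above.
    type driverStatus struct {
        Status       string          `json:"status"`
        Message      string          `json:"message,omitempty"`
        Capabilities map[string]bool `json:"capabilities,omitempty"`
    }

    func main() {
        if len(os.Args) > 1 && os.Args[1] == "init" {
            out, _ := json.Marshal(driverStatus{
                Status:       "Success",
                Capabilities: map[string]bool{"attach": false},
            })
            fmt.Println(string(out))
            return
        }
        out, _ := json.Marshal(driverStatus{
            Status:  "Not supported",
            Message: "only init is implemented in this sketch",
        })
        fmt.Println(string(out))
    }

Until some executable at that path answers init this way, the prober will keep logging the same triplet of messages on every plugin rescan, as seen above.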
Sep 16 04:25:44.630987 containerd[2008]: time="2025-09-16T04:25:44.630731134Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 16 04:25:45.379767 kubelet[3544]: E0916 04:25:45.379364 3544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s2zqb" podUID="faaa3987-239e-436b-9a5d-37bf1f542a64" Sep 16 04:25:47.379150 kubelet[3544]: E0916 04:25:47.379097 3544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s2zqb" podUID="faaa3987-239e-436b-9a5d-37bf1f542a64" Sep 16 04:25:48.582377 containerd[2008]: time="2025-09-16T04:25:48.582311473Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:25:48.584067 containerd[2008]: time="2025-09-16T04:25:48.583686577Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 16 04:25:48.585150 containerd[2008]: time="2025-09-16T04:25:48.585094201Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:25:48.589716 containerd[2008]: time="2025-09-16T04:25:48.589662985Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:25:48.591010 containerd[2008]: time="2025-09-16T04:25:48.590954005Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 3.960117763s" Sep 16 04:25:48.591111 containerd[2008]: time="2025-09-16T04:25:48.591009001Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 16 04:25:48.597609 containerd[2008]: time="2025-09-16T04:25:48.597552782Z" level=info msg="CreateContainer within sandbox \"e108066c7f64eb0af58066ac6f3a1a92ef0a47c036ad4225c18aebc017d119d1\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 16 04:25:48.613067 containerd[2008]: time="2025-09-16T04:25:48.612995318Z" level=info msg="Container 066227b255ca9242589d164aef741df9f521b97faea870969420b003684c0668: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:25:48.635493 containerd[2008]: time="2025-09-16T04:25:48.635318714Z" level=info msg="CreateContainer within sandbox \"e108066c7f64eb0af58066ac6f3a1a92ef0a47c036ad4225c18aebc017d119d1\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"066227b255ca9242589d164aef741df9f521b97faea870969420b003684c0668\"" Sep 16 04:25:48.637326 containerd[2008]: time="2025-09-16T04:25:48.636221426Z" level=info msg="StartContainer for \"066227b255ca9242589d164aef741df9f521b97faea870969420b003684c0668\"" Sep 16 04:25:48.644083 
containerd[2008]: time="2025-09-16T04:25:48.643989206Z" level=info msg="connecting to shim 066227b255ca9242589d164aef741df9f521b97faea870969420b003684c0668" address="unix:///run/containerd/s/d7d3fb393d27c594bf0192da39d757667c453a16c81d94a2a28da61015d62837" protocol=ttrpc version=3 Sep 16 04:25:48.693390 systemd[1]: Started cri-containerd-066227b255ca9242589d164aef741df9f521b97faea870969420b003684c0668.scope - libcontainer container 066227b255ca9242589d164aef741df9f521b97faea870969420b003684c0668. Sep 16 04:25:48.770909 containerd[2008]: time="2025-09-16T04:25:48.770829326Z" level=info msg="StartContainer for \"066227b255ca9242589d164aef741df9f521b97faea870969420b003684c0668\" returns successfully" Sep 16 04:25:49.379791 kubelet[3544]: E0916 04:25:49.379351 3544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s2zqb" podUID="faaa3987-239e-436b-9a5d-37bf1f542a64" Sep 16 04:25:49.791998 containerd[2008]: time="2025-09-16T04:25:49.791804199Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 16 04:25:49.795973 systemd[1]: cri-containerd-066227b255ca9242589d164aef741df9f521b97faea870969420b003684c0668.scope: Deactivated successfully. Sep 16 04:25:49.797236 systemd[1]: cri-containerd-066227b255ca9242589d164aef741df9f521b97faea870969420b003684c0668.scope: Consumed 928ms CPU time, 191.6M memory peak, 165.8M written to disk. Sep 16 04:25:49.802426 containerd[2008]: time="2025-09-16T04:25:49.802240780Z" level=info msg="received exit event container_id:\"066227b255ca9242589d164aef741df9f521b97faea870969420b003684c0668\" id:\"066227b255ca9242589d164aef741df9f521b97faea870969420b003684c0668\" pid:4307 exited_at:{seconds:1757996749 nanos:801865492}" Sep 16 04:25:49.802609 containerd[2008]: time="2025-09-16T04:25:49.802454128Z" level=info msg="TaskExit event in podsandbox handler container_id:\"066227b255ca9242589d164aef741df9f521b97faea870969420b003684c0668\" id:\"066227b255ca9242589d164aef741df9f521b97faea870969420b003684c0668\" pid:4307 exited_at:{seconds:1757996749 nanos:801865492}" Sep 16 04:25:49.844133 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-066227b255ca9242589d164aef741df9f521b97faea870969420b003684c0668-rootfs.mount: Deactivated successfully. Sep 16 04:25:49.849766 kubelet[3544]: I0916 04:25:49.849689 3544 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 16 04:25:50.001018 systemd[1]: Created slice kubepods-besteffort-poddb8ec7a3_7a10_45dd_be69_940376ffb5af.slice - libcontainer container kubepods-besteffort-poddb8ec7a3_7a10_45dd_be69_940376ffb5af.slice. Sep 16 04:25:50.075242 systemd[1]: Created slice kubepods-besteffort-pod23547634_dd5f_4479_bfc2_8034339f6a17.slice - libcontainer container kubepods-besteffort-pod23547634_dd5f_4479_bfc2_8034339f6a17.slice. 
Sep 16 04:25:50.117359 kubelet[3544]: I0916 04:25:50.117076 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/23547634-dd5f-4479-bfc2-8034339f6a17-calico-apiserver-certs\") pod \"calico-apiserver-65cb54b74c-f9hzv\" (UID: \"23547634-dd5f-4479-bfc2-8034339f6a17\") " pod="calico-apiserver/calico-apiserver-65cb54b74c-f9hzv" Sep 16 04:25:50.117359 kubelet[3544]: I0916 04:25:50.117144 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db8ec7a3-7a10-45dd-be69-940376ffb5af-tigera-ca-bundle\") pod \"calico-kube-controllers-8586679c6f-w2bsg\" (UID: \"db8ec7a3-7a10-45dd-be69-940376ffb5af\") " pod="calico-system/calico-kube-controllers-8586679c6f-w2bsg" Sep 16 04:25:50.117359 kubelet[3544]: I0916 04:25:50.117201 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgckq\" (UniqueName: \"kubernetes.io/projected/db8ec7a3-7a10-45dd-be69-940376ffb5af-kube-api-access-wgckq\") pod \"calico-kube-controllers-8586679c6f-w2bsg\" (UID: \"db8ec7a3-7a10-45dd-be69-940376ffb5af\") " pod="calico-system/calico-kube-controllers-8586679c6f-w2bsg" Sep 16 04:25:50.117359 kubelet[3544]: I0916 04:25:50.117253 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znm5g\" (UniqueName: \"kubernetes.io/projected/23547634-dd5f-4479-bfc2-8034339f6a17-kube-api-access-znm5g\") pod \"calico-apiserver-65cb54b74c-f9hzv\" (UID: \"23547634-dd5f-4479-bfc2-8034339f6a17\") " pod="calico-apiserver/calico-apiserver-65cb54b74c-f9hzv" Sep 16 04:25:50.218802 kubelet[3544]: I0916 04:25:50.218403 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-959dh\" (UniqueName: \"kubernetes.io/projected/4961fb5b-738f-4ada-b144-df52d29e76d0-kube-api-access-959dh\") pod \"coredns-674b8bbfcf-8vlcr\" (UID: \"4961fb5b-738f-4ada-b144-df52d29e76d0\") " pod="kube-system/coredns-674b8bbfcf-8vlcr" Sep 16 04:25:50.218802 kubelet[3544]: I0916 04:25:50.218566 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4961fb5b-738f-4ada-b144-df52d29e76d0-config-volume\") pod \"coredns-674b8bbfcf-8vlcr\" (UID: \"4961fb5b-738f-4ada-b144-df52d29e76d0\") " pod="kube-system/coredns-674b8bbfcf-8vlcr" Sep 16 04:25:50.255071 systemd[1]: Created slice kubepods-burstable-pod4961fb5b_738f_4ada_b144_df52d29e76d0.slice - libcontainer container kubepods-burstable-pod4961fb5b_738f_4ada_b144_df52d29e76d0.slice. 
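The UniqueName values in the VerifyControllerAttachedVolume entries above follow one visible pattern: the volume plugin name, then the owning pod UID and the volume name joined by a dash. A tiny helper reproducing that observed format (an inference from these log lines, not kubelet's own helper):

    package main

    import "fmt"

    // uniqueVolumeName reproduces the pattern visible in the UniqueName
    // strings above: <plugin>/<podUID>-<volumeName>.
    func uniqueVolumeName(plugin, podUID, volume string) string {
        return fmt.Sprintf("%s/%s-%s", plugin, podUID, volume)
    }

    func main() {
        fmt.Println(uniqueVolumeName(
            "kubernetes.io/secret",
            "23547634-dd5f-4479-bfc2-8034339f6a17",
            "calico-apiserver-certs",
        ))
        // -> kubernetes.io/secret/23547634-dd5f-4479-bfc2-8034339f6a17-calico-apiserver-certs
    }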
Sep 16 04:25:50.319924 kubelet[3544]: I0916 04:25:50.319594 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bjcm\" (UniqueName: \"kubernetes.io/projected/26b1e38d-50b3-4013-876d-01494e2edccc-kube-api-access-2bjcm\") pod \"calico-apiserver-65cb54b74c-8d9jm\" (UID: \"26b1e38d-50b3-4013-876d-01494e2edccc\") " pod="calico-apiserver/calico-apiserver-65cb54b74c-8d9jm" Sep 16 04:25:50.323684 kubelet[3544]: I0916 04:25:50.323609 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/26b1e38d-50b3-4013-876d-01494e2edccc-calico-apiserver-certs\") pod \"calico-apiserver-65cb54b74c-8d9jm\" (UID: \"26b1e38d-50b3-4013-876d-01494e2edccc\") " pod="calico-apiserver/calico-apiserver-65cb54b74c-8d9jm" Sep 16 04:25:50.335504 systemd[1]: Created slice kubepods-besteffort-pod26b1e38d_50b3_4013_876d_01494e2edccc.slice - libcontainer container kubepods-besteffort-pod26b1e38d_50b3_4013_876d_01494e2edccc.slice. Sep 16 04:25:50.342538 containerd[2008]: time="2025-09-16T04:25:50.341980154Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8586679c6f-w2bsg,Uid:db8ec7a3-7a10-45dd-be69-940376ffb5af,Namespace:calico-system,Attempt:0,}" Sep 16 04:25:50.360808 systemd[1]: Created slice kubepods-burstable-poddcc0175b_736d_4e26_b89e_e1c0712792a7.slice - libcontainer container kubepods-burstable-poddcc0175b_736d_4e26_b89e_e1c0712792a7.slice. Sep 16 04:25:50.414300 systemd[1]: Created slice kubepods-besteffort-pod5fcf5cfa_5557_4c8d_99c2_4c60e125a17e.slice - libcontainer container kubepods-besteffort-pod5fcf5cfa_5557_4c8d_99c2_4c60e125a17e.slice. Sep 16 04:25:50.427059 kubelet[3544]: I0916 04:25:50.424356 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh5vm\" (UniqueName: \"kubernetes.io/projected/dcc0175b-736d-4e26-b89e-e1c0712792a7-kube-api-access-fh5vm\") pod \"coredns-674b8bbfcf-dbnkd\" (UID: \"dcc0175b-736d-4e26-b89e-e1c0712792a7\") " pod="kube-system/coredns-674b8bbfcf-dbnkd" Sep 16 04:25:50.427059 kubelet[3544]: I0916 04:25:50.424448 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dcc0175b-736d-4e26-b89e-e1c0712792a7-config-volume\") pod \"coredns-674b8bbfcf-dbnkd\" (UID: \"dcc0175b-736d-4e26-b89e-e1c0712792a7\") " pod="kube-system/coredns-674b8bbfcf-dbnkd" Sep 16 04:25:50.455806 containerd[2008]: time="2025-09-16T04:25:50.454850955Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-65cb54b74c-f9hzv,Uid:23547634-dd5f-4479-bfc2-8034339f6a17,Namespace:calico-apiserver,Attempt:0,}" Sep 16 04:25:50.479563 systemd[1]: Created slice kubepods-besteffort-pod4ab014bc_cb62_46ec_af11_d77cb6a14351.slice - libcontainer container kubepods-besteffort-pod4ab014bc_cb62_46ec_af11_d77cb6a14351.slice. 
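Each RunPodSandbox message above, and every failure that follows, prints the same four CRI metadata fields: Name, Uid, Namespace and Attempt. A minimal Go mirror of that shape, handy when grepping or parsing these lines (a sketch of the printed fields only, not the generated CRI protobuf type):

    package main

    import "fmt"

    // podSandboxMetadata mirrors the four fields containerd prints in the
    // RunPodSandbox messages above; Attempt counts sandbox (re)creations
    // kubelet has requested for the pod.
    type podSandboxMetadata struct {
        Name      string
        UID       string
        Namespace string
        Attempt   uint32
    }

    func main() {
        m := podSandboxMetadata{
            Name:      "calico-kube-controllers-8586679c6f-w2bsg",
            UID:       "db8ec7a3-7a10-45dd-be69-940376ffb5af",
            Namespace: "calico-system",
            Attempt:   0,
        }
        fmt.Printf("%+v\n", m)
    }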
Sep 16 04:25:50.527465 kubelet[3544]: I0916 04:25:50.527381 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ab014bc-cb62-46ec-af11-d77cb6a14351-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-hlwrv\" (UID: \"4ab014bc-cb62-46ec-af11-d77cb6a14351\") " pod="calico-system/goldmane-54d579b49d-hlwrv" Sep 16 04:25:50.527645 kubelet[3544]: I0916 04:25:50.527490 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srgdl\" (UniqueName: \"kubernetes.io/projected/4ab014bc-cb62-46ec-af11-d77cb6a14351-kube-api-access-srgdl\") pod \"goldmane-54d579b49d-hlwrv\" (UID: \"4ab014bc-cb62-46ec-af11-d77cb6a14351\") " pod="calico-system/goldmane-54d579b49d-hlwrv" Sep 16 04:25:50.527645 kubelet[3544]: I0916 04:25:50.527592 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5fcf5cfa-5557-4c8d-99c2-4c60e125a17e-whisker-backend-key-pair\") pod \"whisker-5d668b4c74-ntvhw\" (UID: \"5fcf5cfa-5557-4c8d-99c2-4c60e125a17e\") " pod="calico-system/whisker-5d668b4c74-ntvhw" Sep 16 04:25:50.527780 kubelet[3544]: I0916 04:25:50.527677 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fcf5cfa-5557-4c8d-99c2-4c60e125a17e-whisker-ca-bundle\") pod \"whisker-5d668b4c74-ntvhw\" (UID: \"5fcf5cfa-5557-4c8d-99c2-4c60e125a17e\") " pod="calico-system/whisker-5d668b4c74-ntvhw" Sep 16 04:25:50.528884 kubelet[3544]: I0916 04:25:50.527728 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ab014bc-cb62-46ec-af11-d77cb6a14351-config\") pod \"goldmane-54d579b49d-hlwrv\" (UID: \"4ab014bc-cb62-46ec-af11-d77cb6a14351\") " pod="calico-system/goldmane-54d579b49d-hlwrv" Sep 16 04:25:50.529167 kubelet[3544]: I0916 04:25:50.529135 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbnxn\" (UniqueName: \"kubernetes.io/projected/5fcf5cfa-5557-4c8d-99c2-4c60e125a17e-kube-api-access-zbnxn\") pod \"whisker-5d668b4c74-ntvhw\" (UID: \"5fcf5cfa-5557-4c8d-99c2-4c60e125a17e\") " pod="calico-system/whisker-5d668b4c74-ntvhw" Sep 16 04:25:50.530416 kubelet[3544]: I0916 04:25:50.530179 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/4ab014bc-cb62-46ec-af11-d77cb6a14351-goldmane-key-pair\") pod \"goldmane-54d579b49d-hlwrv\" (UID: \"4ab014bc-cb62-46ec-af11-d77cb6a14351\") " pod="calico-system/goldmane-54d579b49d-hlwrv" Sep 16 04:25:50.589079 containerd[2008]: time="2025-09-16T04:25:50.588234087Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-8vlcr,Uid:4961fb5b-738f-4ada-b144-df52d29e76d0,Namespace:kube-system,Attempt:0,}" Sep 16 04:25:50.668450 containerd[2008]: time="2025-09-16T04:25:50.667969972Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-65cb54b74c-8d9jm,Uid:26b1e38d-50b3-4013-876d-01494e2edccc,Namespace:calico-apiserver,Attempt:0,}" Sep 16 04:25:50.668894 containerd[2008]: time="2025-09-16T04:25:50.668615860Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-674b8bbfcf-dbnkd,Uid:dcc0175b-736d-4e26-b89e-e1c0712792a7,Namespace:kube-system,Attempt:0,}" Sep 16 04:25:50.705960 containerd[2008]: time="2025-09-16T04:25:50.705730516Z" level=error msg="Failed to destroy network for sandbox \"b65a5ae1b3c84cab4ae472434e68d5414a6350a384b3e96f49ab7339ab35b6b6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:50.712129 containerd[2008]: time="2025-09-16T04:25:50.712029976Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8586679c6f-w2bsg,Uid:db8ec7a3-7a10-45dd-be69-940376ffb5af,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b65a5ae1b3c84cab4ae472434e68d5414a6350a384b3e96f49ab7339ab35b6b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:50.712877 kubelet[3544]: E0916 04:25:50.712313 3544 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b65a5ae1b3c84cab4ae472434e68d5414a6350a384b3e96f49ab7339ab35b6b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:50.712877 kubelet[3544]: E0916 04:25:50.712456 3544 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b65a5ae1b3c84cab4ae472434e68d5414a6350a384b3e96f49ab7339ab35b6b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8586679c6f-w2bsg" Sep 16 04:25:50.712877 kubelet[3544]: E0916 04:25:50.712493 3544 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b65a5ae1b3c84cab4ae472434e68d5414a6350a384b3e96f49ab7339ab35b6b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8586679c6f-w2bsg" Sep 16 04:25:50.713095 kubelet[3544]: E0916 04:25:50.712725 3544 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-8586679c6f-w2bsg_calico-system(db8ec7a3-7a10-45dd-be69-940376ffb5af)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-8586679c6f-w2bsg_calico-system(db8ec7a3-7a10-45dd-be69-940376ffb5af)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b65a5ae1b3c84cab4ae472434e68d5414a6350a384b3e96f49ab7339ab35b6b6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8586679c6f-w2bsg" podUID="db8ec7a3-7a10-45dd-be69-940376ffb5af" Sep 16 04:25:50.718235 containerd[2008]: time="2025-09-16T04:25:50.718138468Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 16 04:25:50.743134 containerd[2008]: time="2025-09-16T04:25:50.743058784Z" level=error msg="Failed to destroy network for sandbox \"edb038c1e462c9fb066c4279748ccdd1d8ec5f409a446ced1f193316d4256818\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:50.747187 containerd[2008]: time="2025-09-16T04:25:50.746826952Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-65cb54b74c-f9hzv,Uid:23547634-dd5f-4479-bfc2-8034339f6a17,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"edb038c1e462c9fb066c4279748ccdd1d8ec5f409a446ced1f193316d4256818\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:50.747371 kubelet[3544]: E0916 04:25:50.747227 3544 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"edb038c1e462c9fb066c4279748ccdd1d8ec5f409a446ced1f193316d4256818\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:50.747371 kubelet[3544]: E0916 04:25:50.747305 3544 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"edb038c1e462c9fb066c4279748ccdd1d8ec5f409a446ced1f193316d4256818\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-65cb54b74c-f9hzv" Sep 16 04:25:50.747371 kubelet[3544]: E0916 04:25:50.747340 3544 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"edb038c1e462c9fb066c4279748ccdd1d8ec5f409a446ced1f193316d4256818\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-65cb54b74c-f9hzv" Sep 16 04:25:50.747590 kubelet[3544]: E0916 04:25:50.747417 3544 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-65cb54b74c-f9hzv_calico-apiserver(23547634-dd5f-4479-bfc2-8034339f6a17)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-65cb54b74c-f9hzv_calico-apiserver(23547634-dd5f-4479-bfc2-8034339f6a17)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"edb038c1e462c9fb066c4279748ccdd1d8ec5f409a446ced1f193316d4256818\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-65cb54b74c-f9hzv" podUID="23547634-dd5f-4479-bfc2-8034339f6a17" Sep 16 04:25:50.764983 containerd[2008]: time="2025-09-16T04:25:50.763000924Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5d668b4c74-ntvhw,Uid:5fcf5cfa-5557-4c8d-99c2-4c60e125a17e,Namespace:calico-system,Attempt:0,}" 
Sep 16 04:25:50.807920 containerd[2008]: time="2025-09-16T04:25:50.807857261Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-hlwrv,Uid:4ab014bc-cb62-46ec-af11-d77cb6a14351,Namespace:calico-system,Attempt:0,}" Sep 16 04:25:50.985813 containerd[2008]: time="2025-09-16T04:25:50.984069953Z" level=error msg="Failed to destroy network for sandbox \"059a9cd9988163974aa549ec3fd51950fe18f0fd069ac48e18476d0bfa87530c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:50.990172 systemd[1]: run-netns-cni\x2d426b3c0a\x2d1c22\x2dd59a\x2d0b5d\x2d28cc34dbb156.mount: Deactivated successfully. Sep 16 04:25:50.996154 containerd[2008]: time="2025-09-16T04:25:50.996077429Z" level=error msg="Failed to destroy network for sandbox \"376d7555638115d6d52907c69330f696830b5bdf7d3c6cdb87b61383e17ff329\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:50.998028 containerd[2008]: time="2025-09-16T04:25:50.997934693Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-65cb54b74c-8d9jm,Uid:26b1e38d-50b3-4013-876d-01494e2edccc,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"059a9cd9988163974aa549ec3fd51950fe18f0fd069ac48e18476d0bfa87530c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:50.999620 kubelet[3544]: E0916 04:25:50.999259 3544 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"059a9cd9988163974aa549ec3fd51950fe18f0fd069ac48e18476d0bfa87530c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:51.001238 kubelet[3544]: E0916 04:25:51.000956 3544 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"059a9cd9988163974aa549ec3fd51950fe18f0fd069ac48e18476d0bfa87530c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-65cb54b74c-8d9jm" Sep 16 04:25:51.001238 kubelet[3544]: E0916 04:25:51.001014 3544 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"059a9cd9988163974aa549ec3fd51950fe18f0fd069ac48e18476d0bfa87530c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-65cb54b74c-8d9jm" Sep 16 04:25:51.001238 kubelet[3544]: E0916 04:25:51.001108 3544 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-65cb54b74c-8d9jm_calico-apiserver(26b1e38d-50b3-4013-876d-01494e2edccc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-65cb54b74c-8d9jm_calico-apiserver(26b1e38d-50b3-4013-876d-01494e2edccc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"059a9cd9988163974aa549ec3fd51950fe18f0fd069ac48e18476d0bfa87530c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-65cb54b74c-8d9jm" podUID="26b1e38d-50b3-4013-876d-01494e2edccc" Sep 16 04:25:51.004362 kubelet[3544]: E0916 04:25:51.003178 3544 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"376d7555638115d6d52907c69330f696830b5bdf7d3c6cdb87b61383e17ff329\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:51.004362 kubelet[3544]: E0916 04:25:51.003269 3544 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"376d7555638115d6d52907c69330f696830b5bdf7d3c6cdb87b61383e17ff329\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-8vlcr" Sep 16 04:25:51.004518 containerd[2008]: time="2025-09-16T04:25:51.002874097Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-8vlcr,Uid:4961fb5b-738f-4ada-b144-df52d29e76d0,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"376d7555638115d6d52907c69330f696830b5bdf7d3c6cdb87b61383e17ff329\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:51.002430 systemd[1]: run-netns-cni\x2d66931e76\x2d829a\x2d5d00\x2d886e\x2d3eb01e562f6f.mount: Deactivated successfully. 
Sep 16 04:25:51.006241 kubelet[3544]: E0916 04:25:51.005778 3544 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"376d7555638115d6d52907c69330f696830b5bdf7d3c6cdb87b61383e17ff329\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-8vlcr" Sep 16 04:25:51.006241 kubelet[3544]: E0916 04:25:51.005971 3544 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-8vlcr_kube-system(4961fb5b-738f-4ada-b144-df52d29e76d0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-8vlcr_kube-system(4961fb5b-738f-4ada-b144-df52d29e76d0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"376d7555638115d6d52907c69330f696830b5bdf7d3c6cdb87b61383e17ff329\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-8vlcr" podUID="4961fb5b-738f-4ada-b144-df52d29e76d0" Sep 16 04:25:51.040531 containerd[2008]: time="2025-09-16T04:25:51.040463282Z" level=error msg="Failed to destroy network for sandbox \"168e78d29e48fab74466edd5508859d37e5c0f5129a5ec11e9599da17d3196bd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:51.046949 containerd[2008]: time="2025-09-16T04:25:51.046764146Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dbnkd,Uid:dcc0175b-736d-4e26-b89e-e1c0712792a7,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"168e78d29e48fab74466edd5508859d37e5c0f5129a5ec11e9599da17d3196bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:51.047209 kubelet[3544]: E0916 04:25:51.047147 3544 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"168e78d29e48fab74466edd5508859d37e5c0f5129a5ec11e9599da17d3196bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:51.047322 kubelet[3544]: E0916 04:25:51.047238 3544 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"168e78d29e48fab74466edd5508859d37e5c0f5129a5ec11e9599da17d3196bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-dbnkd" Sep 16 04:25:51.047322 kubelet[3544]: E0916 04:25:51.047276 3544 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"168e78d29e48fab74466edd5508859d37e5c0f5129a5ec11e9599da17d3196bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-dbnkd" Sep 16 04:25:51.047429 kubelet[3544]: E0916 04:25:51.047354 3544 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-dbnkd_kube-system(dcc0175b-736d-4e26-b89e-e1c0712792a7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-dbnkd_kube-system(dcc0175b-736d-4e26-b89e-e1c0712792a7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"168e78d29e48fab74466edd5508859d37e5c0f5129a5ec11e9599da17d3196bd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-dbnkd" podUID="dcc0175b-736d-4e26-b89e-e1c0712792a7" Sep 16 04:25:51.047946 systemd[1]: run-netns-cni\x2dd59e43fd\x2d8ec6\x2d5999\x2dc982\x2dee305a23bdb8.mount: Deactivated successfully. Sep 16 04:25:51.062211 containerd[2008]: time="2025-09-16T04:25:51.061817450Z" level=error msg="Failed to destroy network for sandbox \"8784e64ae5a537ebf456675300f36e9320c7b8bbd594da73b9a94934736caf1e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:51.064938 containerd[2008]: time="2025-09-16T04:25:51.064869926Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5d668b4c74-ntvhw,Uid:5fcf5cfa-5557-4c8d-99c2-4c60e125a17e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8784e64ae5a537ebf456675300f36e9320c7b8bbd594da73b9a94934736caf1e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:51.066035 kubelet[3544]: E0916 04:25:51.065891 3544 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8784e64ae5a537ebf456675300f36e9320c7b8bbd594da73b9a94934736caf1e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:51.066785 systemd[1]: run-netns-cni\x2dd4ee1c03\x2dd246\x2d8270\x2dfccf\x2d7f4cddb162fd.mount: Deactivated successfully. 
Sep 16 04:25:51.067594 kubelet[3544]: E0916 04:25:51.067522 3544 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8784e64ae5a537ebf456675300f36e9320c7b8bbd594da73b9a94934736caf1e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5d668b4c74-ntvhw" Sep 16 04:25:51.070182 kubelet[3544]: E0916 04:25:51.067610 3544 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8784e64ae5a537ebf456675300f36e9320c7b8bbd594da73b9a94934736caf1e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5d668b4c74-ntvhw" Sep 16 04:25:51.070182 kubelet[3544]: E0916 04:25:51.067724 3544 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5d668b4c74-ntvhw_calico-system(5fcf5cfa-5557-4c8d-99c2-4c60e125a17e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5d668b4c74-ntvhw_calico-system(5fcf5cfa-5557-4c8d-99c2-4c60e125a17e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8784e64ae5a537ebf456675300f36e9320c7b8bbd594da73b9a94934736caf1e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5d668b4c74-ntvhw" podUID="5fcf5cfa-5557-4c8d-99c2-4c60e125a17e" Sep 16 04:25:51.097955 containerd[2008]: time="2025-09-16T04:25:51.097897550Z" level=error msg="Failed to destroy network for sandbox \"046516d57194f9aca13afc0ca953bc39d5c7cd72103970c6e5b21bbcc7ab1ffb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:51.099705 containerd[2008]: time="2025-09-16T04:25:51.099553058Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-hlwrv,Uid:4ab014bc-cb62-46ec-af11-d77cb6a14351,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"046516d57194f9aca13afc0ca953bc39d5c7cd72103970c6e5b21bbcc7ab1ffb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:51.100162 kubelet[3544]: E0916 04:25:51.100082 3544 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"046516d57194f9aca13afc0ca953bc39d5c7cd72103970c6e5b21bbcc7ab1ffb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:51.100269 kubelet[3544]: E0916 04:25:51.100179 3544 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"046516d57194f9aca13afc0ca953bc39d5c7cd72103970c6e5b21bbcc7ab1ffb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-hlwrv" Sep 16 04:25:51.100269 kubelet[3544]: E0916 04:25:51.100228 3544 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"046516d57194f9aca13afc0ca953bc39d5c7cd72103970c6e5b21bbcc7ab1ffb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-hlwrv" Sep 16 04:25:51.100381 kubelet[3544]: E0916 04:25:51.100309 3544 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-hlwrv_calico-system(4ab014bc-cb62-46ec-af11-d77cb6a14351)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-hlwrv_calico-system(4ab014bc-cb62-46ec-af11-d77cb6a14351)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"046516d57194f9aca13afc0ca953bc39d5c7cd72103970c6e5b21bbcc7ab1ffb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-hlwrv" podUID="4ab014bc-cb62-46ec-af11-d77cb6a14351" Sep 16 04:25:51.391934 systemd[1]: Created slice kubepods-besteffort-podfaaa3987_239e_436b_9a5d_37bf1f542a64.slice - libcontainer container kubepods-besteffort-podfaaa3987_239e_436b_9a5d_37bf1f542a64.slice. Sep 16 04:25:51.396625 containerd[2008]: time="2025-09-16T04:25:51.396553071Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s2zqb,Uid:faaa3987-239e-436b-9a5d-37bf1f542a64,Namespace:calico-system,Attempt:0,}" Sep 16 04:25:51.496323 containerd[2008]: time="2025-09-16T04:25:51.496171948Z" level=error msg="Failed to destroy network for sandbox \"09fa48b7af34ab7faaeadd2f21979ccf3e7383a3e61c8e53518c7a4dc8d8670c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:51.498054 containerd[2008]: time="2025-09-16T04:25:51.497894464Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s2zqb,Uid:faaa3987-239e-436b-9a5d-37bf1f542a64,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"09fa48b7af34ab7faaeadd2f21979ccf3e7383a3e61c8e53518c7a4dc8d8670c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:51.498371 kubelet[3544]: E0916 04:25:51.498316 3544 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"09fa48b7af34ab7faaeadd2f21979ccf3e7383a3e61c8e53518c7a4dc8d8670c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:25:51.500220 kubelet[3544]: E0916 04:25:51.498418 3544 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"09fa48b7af34ab7faaeadd2f21979ccf3e7383a3e61c8e53518c7a4dc8d8670c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-s2zqb" Sep 16 04:25:51.500220 kubelet[3544]: E0916 04:25:51.498476 3544 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"09fa48b7af34ab7faaeadd2f21979ccf3e7383a3e61c8e53518c7a4dc8d8670c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-s2zqb" Sep 16 04:25:51.500220 kubelet[3544]: E0916 04:25:51.498608 3544 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-s2zqb_calico-system(faaa3987-239e-436b-9a5d-37bf1f542a64)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-s2zqb_calico-system(faaa3987-239e-436b-9a5d-37bf1f542a64)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"09fa48b7af34ab7faaeadd2f21979ccf3e7383a3e61c8e53518c7a4dc8d8670c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-s2zqb" podUID="faaa3987-239e-436b-9a5d-37bf1f542a64" Sep 16 04:25:51.840871 systemd[1]: run-netns-cni\x2df91acee5\x2d59d4\x2dde8d\x2dde90\x2de88dd53a194d.mount: Deactivated successfully. Sep 16 04:25:58.687042 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3411650058.mount: Deactivated successfully. 
Sep 16 04:25:58.753320 containerd[2008]: time="2025-09-16T04:25:58.753231132Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:25:58.755314 containerd[2008]: time="2025-09-16T04:25:58.755243352Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 16 04:25:58.757980 containerd[2008]: time="2025-09-16T04:25:58.757835724Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:25:58.762209 containerd[2008]: time="2025-09-16T04:25:58.762114492Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:25:58.764190 containerd[2008]: time="2025-09-16T04:25:58.763710480Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 8.04549886s" Sep 16 04:25:58.764190 containerd[2008]: time="2025-09-16T04:25:58.764179128Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 16 04:25:58.799376 containerd[2008]: time="2025-09-16T04:25:58.798987192Z" level=info msg="CreateContainer within sandbox \"e108066c7f64eb0af58066ac6f3a1a92ef0a47c036ad4225c18aebc017d119d1\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 16 04:25:58.823595 containerd[2008]: time="2025-09-16T04:25:58.823375872Z" level=info msg="Container 4c2dd1ffe07ff502bbf2bca1537755ce4be947d9ad40591d742928f2e2da2cd6: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:25:58.833206 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount930549959.mount: Deactivated successfully. Sep 16 04:25:58.850158 containerd[2008]: time="2025-09-16T04:25:58.850082928Z" level=info msg="CreateContainer within sandbox \"e108066c7f64eb0af58066ac6f3a1a92ef0a47c036ad4225c18aebc017d119d1\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"4c2dd1ffe07ff502bbf2bca1537755ce4be947d9ad40591d742928f2e2da2cd6\"" Sep 16 04:25:58.853779 containerd[2008]: time="2025-09-16T04:25:58.851973984Z" level=info msg="StartContainer for \"4c2dd1ffe07ff502bbf2bca1537755ce4be947d9ad40591d742928f2e2da2cd6\"" Sep 16 04:25:58.856445 containerd[2008]: time="2025-09-16T04:25:58.856395312Z" level=info msg="connecting to shim 4c2dd1ffe07ff502bbf2bca1537755ce4be947d9ad40591d742928f2e2da2cd6" address="unix:///run/containerd/s/d7d3fb393d27c594bf0192da39d757667c453a16c81d94a2a28da61015d62837" protocol=ttrpc version=3 Sep 16 04:25:58.890078 systemd[1]: Started cri-containerd-4c2dd1ffe07ff502bbf2bca1537755ce4be947d9ad40591d742928f2e2da2cd6.scope - libcontainer container 4c2dd1ffe07ff502bbf2bca1537755ce4be947d9ad40591d742928f2e2da2cd6. Sep 16 04:25:58.983214 containerd[2008]: time="2025-09-16T04:25:58.982831177Z" level=info msg="StartContainer for \"4c2dd1ffe07ff502bbf2bca1537755ce4be947d9ad40591d742928f2e2da2cd6\" returns successfully" Sep 16 04:25:59.247364 kernel: wireguard: WireGuard 1.0.0 loaded. 
See www.wireguard.com for information. Sep 16 04:25:59.247511 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 16 04:25:59.608508 kubelet[3544]: I0916 04:25:59.608437 3544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5fcf5cfa-5557-4c8d-99c2-4c60e125a17e-whisker-backend-key-pair\") pod \"5fcf5cfa-5557-4c8d-99c2-4c60e125a17e\" (UID: \"5fcf5cfa-5557-4c8d-99c2-4c60e125a17e\") " Sep 16 04:25:59.609167 kubelet[3544]: I0916 04:25:59.608558 3544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fcf5cfa-5557-4c8d-99c2-4c60e125a17e-whisker-ca-bundle\") pod \"5fcf5cfa-5557-4c8d-99c2-4c60e125a17e\" (UID: \"5fcf5cfa-5557-4c8d-99c2-4c60e125a17e\") " Sep 16 04:25:59.609167 kubelet[3544]: I0916 04:25:59.608611 3544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbnxn\" (UniqueName: \"kubernetes.io/projected/5fcf5cfa-5557-4c8d-99c2-4c60e125a17e-kube-api-access-zbnxn\") pod \"5fcf5cfa-5557-4c8d-99c2-4c60e125a17e\" (UID: \"5fcf5cfa-5557-4c8d-99c2-4c60e125a17e\") " Sep 16 04:25:59.616885 kubelet[3544]: I0916 04:25:59.616530 3544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fcf5cfa-5557-4c8d-99c2-4c60e125a17e-kube-api-access-zbnxn" (OuterVolumeSpecName: "kube-api-access-zbnxn") pod "5fcf5cfa-5557-4c8d-99c2-4c60e125a17e" (UID: "5fcf5cfa-5557-4c8d-99c2-4c60e125a17e"). InnerVolumeSpecName "kube-api-access-zbnxn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 16 04:25:59.617464 kubelet[3544]: I0916 04:25:59.617411 3544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fcf5cfa-5557-4c8d-99c2-4c60e125a17e-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "5fcf5cfa-5557-4c8d-99c2-4c60e125a17e" (UID: "5fcf5cfa-5557-4c8d-99c2-4c60e125a17e"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 16 04:25:59.618020 kubelet[3544]: I0916 04:25:59.617942 3544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fcf5cfa-5557-4c8d-99c2-4c60e125a17e-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "5fcf5cfa-5557-4c8d-99c2-4c60e125a17e" (UID: "5fcf5cfa-5557-4c8d-99c2-4c60e125a17e"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 16 04:25:59.695301 systemd[1]: var-lib-kubelet-pods-5fcf5cfa\x2d5557\x2d4c8d\x2d99c2\x2d4c60e125a17e-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dzbnxn.mount: Deactivated successfully. Sep 16 04:25:59.695589 systemd[1]: var-lib-kubelet-pods-5fcf5cfa\x2d5557\x2d4c8d\x2d99c2\x2d4c60e125a17e-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
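The mount units being torn down above are systemd-escaped paths: "-" stands for "/", while a literal "-" or "~" inside a path component is hex-escaped as \x2d or \x7e. A small decoder for that convention, handling only the escapes that actually appear in this log (systemd-escape(1) is the authoritative tool):

    package main

    import (
        "fmt"
        "strconv"
        "strings"
    )

    // unescapeUnitPath reverses the systemd unit-name escaping seen in the
    // mount units above: "-" becomes "/", and "\xNN" becomes the byte it
    // encodes (\x2d -> "-", \x7e -> "~").
    func unescapeUnitPath(unit string) string {
        var b strings.Builder
        b.WriteString("/")
        for i := 0; i < len(unit); i++ {
            switch {
            case unit[i] == '-':
                b.WriteString("/")
            case unit[i] == '\\' && i+3 < len(unit) && unit[i+1] == 'x':
                if n, err := strconv.ParseUint(unit[i+2:i+4], 16, 8); err == nil {
                    b.WriteByte(byte(n))
                    i += 3
                    continue
                }
                b.WriteByte(unit[i])
            default:
                b.WriteByte(unit[i])
            }
        }
        return b.String()
    }

    func main() {
        unit := `var-lib-kubelet-pods-5fcf5cfa\x2d5557\x2d4c8d\x2d99c2\x2d4c60e125a17e-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dzbnxn`
        fmt.Println(unescapeUnitPath(unit))
        // /var/lib/kubelet/pods/5fcf5cfa-5557-4c8d-99c2-4c60e125a17e/volumes/kubernetes.io~projected/kube-api-access-zbnxn
    }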
Sep 16 04:25:59.709573 kubelet[3544]: I0916 04:25:59.709420 3544 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5fcf5cfa-5557-4c8d-99c2-4c60e125a17e-whisker-backend-key-pair\") on node \"ip-172-31-31-172\" DevicePath \"\"" Sep 16 04:25:59.709845 kubelet[3544]: I0916 04:25:59.709530 3544 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fcf5cfa-5557-4c8d-99c2-4c60e125a17e-whisker-ca-bundle\") on node \"ip-172-31-31-172\" DevicePath \"\"" Sep 16 04:25:59.709845 kubelet[3544]: I0916 04:25:59.709807 3544 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zbnxn\" (UniqueName: \"kubernetes.io/projected/5fcf5cfa-5557-4c8d-99c2-4c60e125a17e-kube-api-access-zbnxn\") on node \"ip-172-31-31-172\" DevicePath \"\"" Sep 16 04:25:59.790054 systemd[1]: Removed slice kubepods-besteffort-pod5fcf5cfa_5557_4c8d_99c2_4c60e125a17e.slice - libcontainer container kubepods-besteffort-pod5fcf5cfa_5557_4c8d_99c2_4c60e125a17e.slice. Sep 16 04:25:59.855765 kubelet[3544]: I0916 04:25:59.855429 3544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-fn8lj" podStartSLOduration=2.823260378 podStartE2EDuration="21.855398833s" podCreationTimestamp="2025-09-16 04:25:38 +0000 UTC" firstStartedPulling="2025-09-16 04:25:39.733085945 +0000 UTC m=+24.600729699" lastFinishedPulling="2025-09-16 04:25:58.7652244 +0000 UTC m=+43.632868154" observedRunningTime="2025-09-16 04:25:59.821850757 +0000 UTC m=+44.689494595" watchObservedRunningTime="2025-09-16 04:25:59.855398833 +0000 UTC m=+44.723042587" Sep 16 04:25:59.952861 kubelet[3544]: E0916 04:25:59.952550 3544 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"whisker-backend-key-pair\" is forbidden: User \"system:node:ip-172-31-31-172\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ip-172-31-31-172' and this object" logger="UnhandledError" reflector="object-\"calico-system\"/\"whisker-backend-key-pair\"" type="*v1.Secret" Sep 16 04:25:59.963857 systemd[1]: Created slice kubepods-besteffort-podfe5b551d_6e18_4bba_b116_0a6af6b9b977.slice - libcontainer container kubepods-besteffort-podfe5b551d_6e18_4bba_b116_0a6af6b9b977.slice. 
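The pod_startup_latency_tracker entry for calico-node-fn8lj above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A short check of the numbers, under that reading of the fields:

    package main

    import (
        "fmt"
        "time"
    )

    func mustParse(v string) time.Time {
        t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", v)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        // Timestamps copied from the pod_startup_latency_tracker entry above.
        created := mustParse("2025-09-16 04:25:38 +0000 UTC")
        firstPull := mustParse("2025-09-16 04:25:39.733085945 +0000 UTC")
        lastPull := mustParse("2025-09-16 04:25:58.7652244 +0000 UTC")
        watchObservedRunning := mustParse("2025-09-16 04:25:59.855398833 +0000 UTC")

        e2e := watchObservedRunning.Sub(created)
        slo := e2e - lastPull.Sub(firstPull)
        fmt.Println("podStartE2EDuration:", e2e) // 21.855398833s
        fmt.Println("podStartSLOduration:", slo) // 2.823260378s
    }

In other words, almost all of the 21.9s startup for this pod was spent pulling the 151MB calico/node image.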
Sep 16 04:26:00.114647 kubelet[3544]: I0916 04:26:00.114411 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/fe5b551d-6e18-4bba-b116-0a6af6b9b977-whisker-backend-key-pair\") pod \"whisker-58c66ff65c-dznkv\" (UID: \"fe5b551d-6e18-4bba-b116-0a6af6b9b977\") " pod="calico-system/whisker-58c66ff65c-dznkv" Sep 16 04:26:00.114647 kubelet[3544]: I0916 04:26:00.114497 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe5b551d-6e18-4bba-b116-0a6af6b9b977-whisker-ca-bundle\") pod \"whisker-58c66ff65c-dznkv\" (UID: \"fe5b551d-6e18-4bba-b116-0a6af6b9b977\") " pod="calico-system/whisker-58c66ff65c-dznkv" Sep 16 04:26:00.114647 kubelet[3544]: I0916 04:26:00.114546 3544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwczd\" (UniqueName: \"kubernetes.io/projected/fe5b551d-6e18-4bba-b116-0a6af6b9b977-kube-api-access-qwczd\") pod \"whisker-58c66ff65c-dznkv\" (UID: \"fe5b551d-6e18-4bba-b116-0a6af6b9b977\") " pod="calico-system/whisker-58c66ff65c-dznkv" Sep 16 04:26:00.514013 containerd[2008]: time="2025-09-16T04:26:00.513924313Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4c2dd1ffe07ff502bbf2bca1537755ce4be947d9ad40591d742928f2e2da2cd6\" id:\"80f4e535464b571e32cae206d2014b491e5fa20f10a1e351a4308b53105f55d6\" pid:4627 exit_status:1 exited_at:{seconds:1757996760 nanos:513182161}" Sep 16 04:26:00.707402 kubelet[3544]: I0916 04:26:00.707149 3544 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:26:01.090643 containerd[2008]: time="2025-09-16T04:26:01.090567648Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4c2dd1ffe07ff502bbf2bca1537755ce4be947d9ad40591d742928f2e2da2cd6\" id:\"b609bab45f165f9f1f9a800571980c6247a56ffcc36d94b476ed319648a867cb\" pid:4759 exit_status:1 exited_at:{seconds:1757996761 nanos:89616084}" Sep 16 04:26:01.386470 kubelet[3544]: I0916 04:26:01.386196 3544 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fcf5cfa-5557-4c8d-99c2-4c60e125a17e" path="/var/lib/kubelet/pods/5fcf5cfa-5557-4c8d-99c2-4c60e125a17e/volumes" Sep 16 04:26:01.451624 (udev-worker)[4600]: Network interface NamePolicy= disabled on kernel command line. Sep 16 04:26:01.464281 systemd-networkd[1820]: vxlan.calico: Link UP Sep 16 04:26:01.464295 systemd-networkd[1820]: vxlan.calico: Gained carrier Sep 16 04:26:01.472022 containerd[2008]: time="2025-09-16T04:26:01.471447097Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-58c66ff65c-dznkv,Uid:fe5b551d-6e18-4bba-b116-0a6af6b9b977,Namespace:calico-system,Attempt:0,}" Sep 16 04:26:01.533515 (udev-worker)[4601]: Network interface NamePolicy= disabled on kernel command line. Sep 16 04:26:01.938621 (udev-worker)[4827]: Network interface NamePolicy= disabled on kernel command line. 
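systemd-networkd reporting "Link UP" and "Gained carrier" for vxlan.calico means the VXLAN device that appeared after calico-node started is up and has carrier; the same state can be read back from sysfs without any netlink tooling. A small sketch, assuming the interface name taken from the lines above:

    package main

    import (
        "fmt"
        "os"
        "strings"
    )

    // Reads the state that systemd-networkd summarizes as "Link UP" and
    // "Gained carrier" for the Calico VXLAN device.
    func main() {
        iface := "vxlan.calico" // name taken from the systemd-networkd lines above
        for _, attr := range []string{"operstate", "carrier"} {
            path := "/sys/class/net/" + iface + "/" + attr
            b, err := os.ReadFile(path)
            if err != nil {
                fmt.Printf("%s: %v\n", path, err)
                continue
            }
            fmt.Printf("%s %s: %s\n", iface, attr, strings.TrimSpace(string(b)))
        }
    }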
Sep 16 04:26:01.943242 systemd-networkd[1820]: cali33369db9781: Link UP Sep 16 04:26:01.947192 systemd-networkd[1820]: cali33369db9781: Gained carrier Sep 16 04:26:01.985163 containerd[2008]: 2025-09-16 04:26:01.688 [INFO][4813] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--31--172-k8s-whisker--58c66ff65c--dznkv-eth0 whisker-58c66ff65c- calico-system fe5b551d-6e18-4bba-b116-0a6af6b9b977 934 0 2025-09-16 04:25:59 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:58c66ff65c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ip-172-31-31-172 whisker-58c66ff65c-dznkv eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali33369db9781 [] [] }} ContainerID="8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca" Namespace="calico-system" Pod="whisker-58c66ff65c-dznkv" WorkloadEndpoint="ip--172--31--31--172-k8s-whisker--58c66ff65c--dznkv-" Sep 16 04:26:01.985163 containerd[2008]: 2025-09-16 04:26:01.689 [INFO][4813] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca" Namespace="calico-system" Pod="whisker-58c66ff65c-dznkv" WorkloadEndpoint="ip--172--31--31--172-k8s-whisker--58c66ff65c--dznkv-eth0" Sep 16 04:26:01.985163 containerd[2008]: 2025-09-16 04:26:01.824 [INFO][4836] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca" HandleID="k8s-pod-network.8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca" Workload="ip--172--31--31--172-k8s-whisker--58c66ff65c--dznkv-eth0" Sep 16 04:26:01.988106 containerd[2008]: 2025-09-16 04:26:01.825 [INFO][4836] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca" HandleID="k8s-pod-network.8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca" Workload="ip--172--31--31--172-k8s-whisker--58c66ff65c--dznkv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000123aa0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-31-172", "pod":"whisker-58c66ff65c-dznkv", "timestamp":"2025-09-16 04:26:01.824954151 +0000 UTC"}, Hostname:"ip-172-31-31-172", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:26:01.988106 containerd[2008]: 2025-09-16 04:26:01.825 [INFO][4836] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:26:01.988106 containerd[2008]: 2025-09-16 04:26:01.825 [INFO][4836] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 04:26:01.988106 containerd[2008]: 2025-09-16 04:26:01.825 [INFO][4836] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-31-172' Sep 16 04:26:01.988106 containerd[2008]: 2025-09-16 04:26:01.855 [INFO][4836] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca" host="ip-172-31-31-172" Sep 16 04:26:01.988106 containerd[2008]: 2025-09-16 04:26:01.871 [INFO][4836] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-31-172" Sep 16 04:26:01.988106 containerd[2008]: 2025-09-16 04:26:01.883 [INFO][4836] ipam/ipam.go 511: Trying affinity for 192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:01.988106 containerd[2008]: 2025-09-16 04:26:01.888 [INFO][4836] ipam/ipam.go 158: Attempting to load block cidr=192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:01.988106 containerd[2008]: 2025-09-16 04:26:01.893 [INFO][4836] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:01.988564 containerd[2008]: 2025-09-16 04:26:01.893 [INFO][4836] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.121.0/26 handle="k8s-pod-network.8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca" host="ip-172-31-31-172" Sep 16 04:26:01.988564 containerd[2008]: 2025-09-16 04:26:01.897 [INFO][4836] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca Sep 16 04:26:01.988564 containerd[2008]: 2025-09-16 04:26:01.906 [INFO][4836] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.121.0/26 handle="k8s-pod-network.8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca" host="ip-172-31-31-172" Sep 16 04:26:01.988564 containerd[2008]: 2025-09-16 04:26:01.918 [INFO][4836] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.121.1/26] block=192.168.121.0/26 handle="k8s-pod-network.8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca" host="ip-172-31-31-172" Sep 16 04:26:01.988564 containerd[2008]: 2025-09-16 04:26:01.918 [INFO][4836] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.121.1/26] handle="k8s-pod-network.8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca" host="ip-172-31-31-172" Sep 16 04:26:01.988564 containerd[2008]: 2025-09-16 04:26:01.918 [INFO][4836] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 04:26:01.988564 containerd[2008]: 2025-09-16 04:26:01.918 [INFO][4836] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.121.1/26] IPv6=[] ContainerID="8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca" HandleID="k8s-pod-network.8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca" Workload="ip--172--31--31--172-k8s-whisker--58c66ff65c--dznkv-eth0" Sep 16 04:26:01.990905 containerd[2008]: 2025-09-16 04:26:01.930 [INFO][4813] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca" Namespace="calico-system" Pod="whisker-58c66ff65c-dznkv" WorkloadEndpoint="ip--172--31--31--172-k8s-whisker--58c66ff65c--dznkv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--172-k8s-whisker--58c66ff65c--dznkv-eth0", GenerateName:"whisker-58c66ff65c-", Namespace:"calico-system", SelfLink:"", UID:"fe5b551d-6e18-4bba-b116-0a6af6b9b977", ResourceVersion:"934", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 25, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"58c66ff65c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-172", ContainerID:"", Pod:"whisker-58c66ff65c-dznkv", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.121.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali33369db9781", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:26:01.990905 containerd[2008]: 2025-09-16 04:26:01.930 [INFO][4813] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.121.1/32] ContainerID="8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca" Namespace="calico-system" Pod="whisker-58c66ff65c-dznkv" WorkloadEndpoint="ip--172--31--31--172-k8s-whisker--58c66ff65c--dznkv-eth0" Sep 16 04:26:01.991972 containerd[2008]: 2025-09-16 04:26:01.931 [INFO][4813] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali33369db9781 ContainerID="8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca" Namespace="calico-system" Pod="whisker-58c66ff65c-dznkv" WorkloadEndpoint="ip--172--31--31--172-k8s-whisker--58c66ff65c--dznkv-eth0" Sep 16 04:26:01.991972 containerd[2008]: 2025-09-16 04:26:01.950 [INFO][4813] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca" Namespace="calico-system" Pod="whisker-58c66ff65c-dznkv" WorkloadEndpoint="ip--172--31--31--172-k8s-whisker--58c66ff65c--dznkv-eth0" Sep 16 04:26:01.992409 containerd[2008]: 2025-09-16 04:26:01.950 [INFO][4813] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca" Namespace="calico-system" Pod="whisker-58c66ff65c-dznkv" 
WorkloadEndpoint="ip--172--31--31--172-k8s-whisker--58c66ff65c--dznkv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--172-k8s-whisker--58c66ff65c--dznkv-eth0", GenerateName:"whisker-58c66ff65c-", Namespace:"calico-system", SelfLink:"", UID:"fe5b551d-6e18-4bba-b116-0a6af6b9b977", ResourceVersion:"934", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 25, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"58c66ff65c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-172", ContainerID:"8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca", Pod:"whisker-58c66ff65c-dznkv", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.121.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali33369db9781", MAC:"1a:10:85:45:fe:e2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:26:01.993323 containerd[2008]: 2025-09-16 04:26:01.976 [INFO][4813] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca" Namespace="calico-system" Pod="whisker-58c66ff65c-dznkv" WorkloadEndpoint="ip--172--31--31--172-k8s-whisker--58c66ff65c--dznkv-eth0" Sep 16 04:26:02.060562 containerd[2008]: time="2025-09-16T04:26:02.060483888Z" level=info msg="connecting to shim 8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca" address="unix:///run/containerd/s/cb0a71c1f31d843e676616859d4469e1bc17fbc7ffa1d0a52e1713ab6499c5d6" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:26:02.151201 systemd[1]: Started cri-containerd-8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca.scope - libcontainer container 8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca. 
Sep 16 04:26:02.333817 containerd[2008]: time="2025-09-16T04:26:02.333674954Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-58c66ff65c-dznkv,Uid:fe5b551d-6e18-4bba-b116-0a6af6b9b977,Namespace:calico-system,Attempt:0,} returns sandbox id \"8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca\"" Sep 16 04:26:02.339641 containerd[2008]: time="2025-09-16T04:26:02.339576818Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 16 04:26:02.377653 containerd[2008]: time="2025-09-16T04:26:02.377581190Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4c2dd1ffe07ff502bbf2bca1537755ce4be947d9ad40591d742928f2e2da2cd6\" id:\"a0129e7fd87e055edd75bd9d2c6d4a91a59416e70cd751a085a0cd72e28c2b5d\" pid:4856 exit_status:1 exited_at:{seconds:1757996762 nanos:376808246}" Sep 16 04:26:02.379986 containerd[2008]: time="2025-09-16T04:26:02.379903838Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dbnkd,Uid:dcc0175b-736d-4e26-b89e-e1c0712792a7,Namespace:kube-system,Attempt:0,}" Sep 16 04:26:02.381339 containerd[2008]: time="2025-09-16T04:26:02.380986850Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-65cb54b74c-8d9jm,Uid:26b1e38d-50b3-4013-876d-01494e2edccc,Namespace:calico-apiserver,Attempt:0,}" Sep 16 04:26:02.866202 systemd-networkd[1820]: cali47778079837: Link UP Sep 16 04:26:02.870193 systemd-networkd[1820]: cali47778079837: Gained carrier Sep 16 04:26:02.916636 containerd[2008]: 2025-09-16 04:26:02.634 [INFO][4969] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--31--172-k8s-coredns--674b8bbfcf--dbnkd-eth0 coredns-674b8bbfcf- kube-system dcc0175b-736d-4e26-b89e-e1c0712792a7 866 0 2025-09-16 04:25:16 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-31-172 coredns-674b8bbfcf-dbnkd eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali47778079837 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dbnkd" WorkloadEndpoint="ip--172--31--31--172-k8s-coredns--674b8bbfcf--dbnkd-" Sep 16 04:26:02.916636 containerd[2008]: 2025-09-16 04:26:02.635 [INFO][4969] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dbnkd" WorkloadEndpoint="ip--172--31--31--172-k8s-coredns--674b8bbfcf--dbnkd-eth0" Sep 16 04:26:02.916636 containerd[2008]: 2025-09-16 04:26:02.741 [INFO][5002] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1" HandleID="k8s-pod-network.81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1" Workload="ip--172--31--31--172-k8s-coredns--674b8bbfcf--dbnkd-eth0" Sep 16 04:26:02.916979 containerd[2008]: 2025-09-16 04:26:02.744 [INFO][5002] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1" HandleID="k8s-pod-network.81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1" Workload="ip--172--31--31--172-k8s-coredns--674b8bbfcf--dbnkd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0x4000365940), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-31-172", "pod":"coredns-674b8bbfcf-dbnkd", "timestamp":"2025-09-16 04:26:02.741359668 +0000 UTC"}, Hostname:"ip-172-31-31-172", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:26:02.916979 containerd[2008]: 2025-09-16 04:26:02.744 [INFO][5002] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:26:02.916979 containerd[2008]: 2025-09-16 04:26:02.744 [INFO][5002] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 04:26:02.916979 containerd[2008]: 2025-09-16 04:26:02.744 [INFO][5002] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-31-172' Sep 16 04:26:02.916979 containerd[2008]: 2025-09-16 04:26:02.792 [INFO][5002] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1" host="ip-172-31-31-172" Sep 16 04:26:02.916979 containerd[2008]: 2025-09-16 04:26:02.804 [INFO][5002] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-31-172" Sep 16 04:26:02.916979 containerd[2008]: 2025-09-16 04:26:02.814 [INFO][5002] ipam/ipam.go 511: Trying affinity for 192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:02.916979 containerd[2008]: 2025-09-16 04:26:02.817 [INFO][5002] ipam/ipam.go 158: Attempting to load block cidr=192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:02.916979 containerd[2008]: 2025-09-16 04:26:02.822 [INFO][5002] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:02.918722 containerd[2008]: 2025-09-16 04:26:02.822 [INFO][5002] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.121.0/26 handle="k8s-pod-network.81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1" host="ip-172-31-31-172" Sep 16 04:26:02.918722 containerd[2008]: 2025-09-16 04:26:02.825 [INFO][5002] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1 Sep 16 04:26:02.918722 containerd[2008]: 2025-09-16 04:26:02.836 [INFO][5002] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.121.0/26 handle="k8s-pod-network.81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1" host="ip-172-31-31-172" Sep 16 04:26:02.918722 containerd[2008]: 2025-09-16 04:26:02.849 [INFO][5002] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.121.2/26] block=192.168.121.0/26 handle="k8s-pod-network.81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1" host="ip-172-31-31-172" Sep 16 04:26:02.918722 containerd[2008]: 2025-09-16 04:26:02.850 [INFO][5002] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.121.2/26] handle="k8s-pod-network.81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1" host="ip-172-31-31-172" Sep 16 04:26:02.918722 containerd[2008]: 2025-09-16 04:26:02.850 [INFO][5002] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 04:26:02.918722 containerd[2008]: 2025-09-16 04:26:02.851 [INFO][5002] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.121.2/26] IPv6=[] ContainerID="81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1" HandleID="k8s-pod-network.81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1" Workload="ip--172--31--31--172-k8s-coredns--674b8bbfcf--dbnkd-eth0" Sep 16 04:26:02.919360 containerd[2008]: 2025-09-16 04:26:02.857 [INFO][4969] cni-plugin/k8s.go 418: Populated endpoint ContainerID="81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dbnkd" WorkloadEndpoint="ip--172--31--31--172-k8s-coredns--674b8bbfcf--dbnkd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--172-k8s-coredns--674b8bbfcf--dbnkd-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"dcc0175b-736d-4e26-b89e-e1c0712792a7", ResourceVersion:"866", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 25, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-172", ContainerID:"", Pod:"coredns-674b8bbfcf-dbnkd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.121.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali47778079837", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:26:02.919360 containerd[2008]: 2025-09-16 04:26:02.858 [INFO][4969] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.121.2/32] ContainerID="81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dbnkd" WorkloadEndpoint="ip--172--31--31--172-k8s-coredns--674b8bbfcf--dbnkd-eth0" Sep 16 04:26:02.919360 containerd[2008]: 2025-09-16 04:26:02.858 [INFO][4969] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali47778079837 ContainerID="81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dbnkd" WorkloadEndpoint="ip--172--31--31--172-k8s-coredns--674b8bbfcf--dbnkd-eth0" Sep 16 04:26:02.919360 containerd[2008]: 2025-09-16 04:26:02.871 [INFO][4969] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dbnkd" 
WorkloadEndpoint="ip--172--31--31--172-k8s-coredns--674b8bbfcf--dbnkd-eth0" Sep 16 04:26:02.919360 containerd[2008]: 2025-09-16 04:26:02.874 [INFO][4969] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dbnkd" WorkloadEndpoint="ip--172--31--31--172-k8s-coredns--674b8bbfcf--dbnkd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--172-k8s-coredns--674b8bbfcf--dbnkd-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"dcc0175b-736d-4e26-b89e-e1c0712792a7", ResourceVersion:"866", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 25, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-172", ContainerID:"81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1", Pod:"coredns-674b8bbfcf-dbnkd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.121.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali47778079837", MAC:"c6:fb:0d:92:53:d0", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:26:02.919360 containerd[2008]: 2025-09-16 04:26:02.911 [INFO][4969] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1" Namespace="kube-system" Pod="coredns-674b8bbfcf-dbnkd" WorkloadEndpoint="ip--172--31--31--172-k8s-coredns--674b8bbfcf--dbnkd-eth0" Sep 16 04:26:02.998698 containerd[2008]: time="2025-09-16T04:26:02.998642021Z" level=info msg="connecting to shim 81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1" address="unix:///run/containerd/s/de343060d76cb4f7efe86683e38a8f93c8171bf8d9d70f311e613179f126c766" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:26:03.020446 systemd-networkd[1820]: cali18c92766682: Link UP Sep 16 04:26:03.030258 systemd-networkd[1820]: cali18c92766682: Gained carrier Sep 16 04:26:03.091385 containerd[2008]: 2025-09-16 04:26:02.630 [INFO][4970] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--8d9jm-eth0 calico-apiserver-65cb54b74c- calico-apiserver 26b1e38d-50b3-4013-876d-01494e2edccc 865 0 2025-09-16 04:25:31 +0000 UTC map[apiserver:true 
app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:65cb54b74c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-31-172 calico-apiserver-65cb54b74c-8d9jm eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali18c92766682 [] [] }} ContainerID="5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc" Namespace="calico-apiserver" Pod="calico-apiserver-65cb54b74c-8d9jm" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--8d9jm-" Sep 16 04:26:03.091385 containerd[2008]: 2025-09-16 04:26:02.630 [INFO][4970] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc" Namespace="calico-apiserver" Pod="calico-apiserver-65cb54b74c-8d9jm" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--8d9jm-eth0" Sep 16 04:26:03.091385 containerd[2008]: 2025-09-16 04:26:02.767 [INFO][5000] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc" HandleID="k8s-pod-network.5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc" Workload="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--8d9jm-eth0" Sep 16 04:26:03.091385 containerd[2008]: 2025-09-16 04:26:02.774 [INFO][5000] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc" HandleID="k8s-pod-network.5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc" Workload="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--8d9jm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400038a9a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-31-172", "pod":"calico-apiserver-65cb54b74c-8d9jm", "timestamp":"2025-09-16 04:26:02.767476096 +0000 UTC"}, Hostname:"ip-172-31-31-172", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:26:03.091385 containerd[2008]: 2025-09-16 04:26:02.775 [INFO][5000] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:26:03.091385 containerd[2008]: 2025-09-16 04:26:02.851 [INFO][5000] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 04:26:03.091385 containerd[2008]: 2025-09-16 04:26:02.851 [INFO][5000] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-31-172' Sep 16 04:26:03.091385 containerd[2008]: 2025-09-16 04:26:02.902 [INFO][5000] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc" host="ip-172-31-31-172" Sep 16 04:26:03.091385 containerd[2008]: 2025-09-16 04:26:02.922 [INFO][5000] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-31-172" Sep 16 04:26:03.091385 containerd[2008]: 2025-09-16 04:26:02.935 [INFO][5000] ipam/ipam.go 511: Trying affinity for 192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:03.091385 containerd[2008]: 2025-09-16 04:26:02.939 [INFO][5000] ipam/ipam.go 158: Attempting to load block cidr=192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:03.091385 containerd[2008]: 2025-09-16 04:26:02.945 [INFO][5000] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:03.091385 containerd[2008]: 2025-09-16 04:26:02.945 [INFO][5000] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.121.0/26 handle="k8s-pod-network.5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc" host="ip-172-31-31-172" Sep 16 04:26:03.091385 containerd[2008]: 2025-09-16 04:26:02.954 [INFO][5000] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc Sep 16 04:26:03.091385 containerd[2008]: 2025-09-16 04:26:02.971 [INFO][5000] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.121.0/26 handle="k8s-pod-network.5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc" host="ip-172-31-31-172" Sep 16 04:26:03.091385 containerd[2008]: 2025-09-16 04:26:02.991 [INFO][5000] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.121.3/26] block=192.168.121.0/26 handle="k8s-pod-network.5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc" host="ip-172-31-31-172" Sep 16 04:26:03.091385 containerd[2008]: 2025-09-16 04:26:02.992 [INFO][5000] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.121.3/26] handle="k8s-pod-network.5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc" host="ip-172-31-31-172" Sep 16 04:26:03.091385 containerd[2008]: 2025-09-16 04:26:02.992 [INFO][5000] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
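
In the coredns endpoint dump a little further up, the WorkloadEndpointPort values are printed as Go hex literals (Port:0x35, Port:0x23c1), which obscures that they are the standard DNS port and CoreDNS's default metrics port. A trivial check, not part of the journal:

    # Port values as printed in the coredns WorkloadEndpointPort dump above.
    ports = {"dns": 0x35, "dns-tcp": 0x35, "metrics": 0x23c1}
    for name, port in ports.items():
        print(f"{name}: {port}")   # dns: 53, dns-tcp: 53, metrics: 9153
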
Sep 16 04:26:03.091385 containerd[2008]: 2025-09-16 04:26:02.992 [INFO][5000] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.121.3/26] IPv6=[] ContainerID="5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc" HandleID="k8s-pod-network.5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc" Workload="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--8d9jm-eth0" Sep 16 04:26:03.094591 containerd[2008]: 2025-09-16 04:26:03.005 [INFO][4970] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc" Namespace="calico-apiserver" Pod="calico-apiserver-65cb54b74c-8d9jm" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--8d9jm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--8d9jm-eth0", GenerateName:"calico-apiserver-65cb54b74c-", Namespace:"calico-apiserver", SelfLink:"", UID:"26b1e38d-50b3-4013-876d-01494e2edccc", ResourceVersion:"865", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"65cb54b74c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-172", ContainerID:"", Pod:"calico-apiserver-65cb54b74c-8d9jm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.121.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali18c92766682", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:26:03.094591 containerd[2008]: 2025-09-16 04:26:03.005 [INFO][4970] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.121.3/32] ContainerID="5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc" Namespace="calico-apiserver" Pod="calico-apiserver-65cb54b74c-8d9jm" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--8d9jm-eth0" Sep 16 04:26:03.094591 containerd[2008]: 2025-09-16 04:26:03.006 [INFO][4970] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali18c92766682 ContainerID="5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc" Namespace="calico-apiserver" Pod="calico-apiserver-65cb54b74c-8d9jm" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--8d9jm-eth0" Sep 16 04:26:03.094591 containerd[2008]: 2025-09-16 04:26:03.034 [INFO][4970] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc" Namespace="calico-apiserver" Pod="calico-apiserver-65cb54b74c-8d9jm" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--8d9jm-eth0" Sep 16 04:26:03.094591 containerd[2008]: 2025-09-16 04:26:03.040 [INFO][4970] cni-plugin/k8s.go 446: Added Mac, interface 
name, and active container ID to endpoint ContainerID="5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc" Namespace="calico-apiserver" Pod="calico-apiserver-65cb54b74c-8d9jm" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--8d9jm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--8d9jm-eth0", GenerateName:"calico-apiserver-65cb54b74c-", Namespace:"calico-apiserver", SelfLink:"", UID:"26b1e38d-50b3-4013-876d-01494e2edccc", ResourceVersion:"865", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"65cb54b74c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-172", ContainerID:"5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc", Pod:"calico-apiserver-65cb54b74c-8d9jm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.121.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali18c92766682", MAC:"e2:2e:06:c0:20:48", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:26:03.094591 containerd[2008]: 2025-09-16 04:26:03.064 [INFO][4970] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc" Namespace="calico-apiserver" Pod="calico-apiserver-65cb54b74c-8d9jm" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--8d9jm-eth0" Sep 16 04:26:03.105995 systemd[1]: Started cri-containerd-81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1.scope - libcontainer container 81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1. Sep 16 04:26:03.140981 systemd-networkd[1820]: vxlan.calico: Gained IPv6LL Sep 16 04:26:03.176515 containerd[2008]: time="2025-09-16T04:26:03.176412446Z" level=info msg="connecting to shim 5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc" address="unix:///run/containerd/s/473252df85f7cf366eb6bc641ac77284715c10b9d11eee42f83429ba2edbc84c" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:26:03.252079 systemd[1]: Started cri-containerd-5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc.scope - libcontainer container 5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc. 
Sep 16 04:26:03.274974 containerd[2008]: time="2025-09-16T04:26:03.274822310Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dbnkd,Uid:dcc0175b-736d-4e26-b89e-e1c0712792a7,Namespace:kube-system,Attempt:0,} returns sandbox id \"81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1\"" Sep 16 04:26:03.292975 containerd[2008]: time="2025-09-16T04:26:03.292905591Z" level=info msg="CreateContainer within sandbox \"81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 16 04:26:03.317082 containerd[2008]: time="2025-09-16T04:26:03.317016747Z" level=info msg="Container e8c36c8f56c99fc57eb4738d011f694f9ef2eec3e043823afc6c447097746a5a: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:26:03.339772 containerd[2008]: time="2025-09-16T04:26:03.339614619Z" level=info msg="CreateContainer within sandbox \"81d4abf5fcb1fb050e3ca10aefafecd6a966b29a6481050009e7b9099570a5b1\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e8c36c8f56c99fc57eb4738d011f694f9ef2eec3e043823afc6c447097746a5a\"" Sep 16 04:26:03.343276 containerd[2008]: time="2025-09-16T04:26:03.342864495Z" level=info msg="StartContainer for \"e8c36c8f56c99fc57eb4738d011f694f9ef2eec3e043823afc6c447097746a5a\"" Sep 16 04:26:03.346852 containerd[2008]: time="2025-09-16T04:26:03.346631235Z" level=info msg="connecting to shim e8c36c8f56c99fc57eb4738d011f694f9ef2eec3e043823afc6c447097746a5a" address="unix:///run/containerd/s/de343060d76cb4f7efe86683e38a8f93c8171bf8d9d70f311e613179f126c766" protocol=ttrpc version=3 Sep 16 04:26:03.365340 containerd[2008]: time="2025-09-16T04:26:03.365275575Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-65cb54b74c-8d9jm,Uid:26b1e38d-50b3-4013-876d-01494e2edccc,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc\"" Sep 16 04:26:03.381847 containerd[2008]: time="2025-09-16T04:26:03.381680295Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8586679c6f-w2bsg,Uid:db8ec7a3-7a10-45dd-be69-940376ffb5af,Namespace:calico-system,Attempt:0,}" Sep 16 04:26:03.443148 systemd[1]: Started cri-containerd-e8c36c8f56c99fc57eb4738d011f694f9ef2eec3e043823afc6c447097746a5a.scope - libcontainer container e8c36c8f56c99fc57eb4738d011f694f9ef2eec3e043823afc6c447097746a5a. 
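
The "connecting to shim" entries show how containerd ties application containers back to their pod sandbox: the coredns container (e8c36c8f…) dials the same unix:///run/containerd/s/… socket as its sandbox (81d4abf5…), while the whisker and calico-apiserver sandboxes each got sockets of their own. A small sketch, not part of the journal, grouping the events seen so far by socket (IDs and socket names abbreviated to their first 12 hex characters):

    from collections import defaultdict

    # "connecting to shim" events from this stretch of the journal.
    events = [
        ("8d25e3eae220", "cb0a71c1f31d"),  # whisker-58c66ff65c-dznkv sandbox
        ("81d4abf5fcb1", "de343060d76c"),  # coredns-674b8bbfcf-dbnkd sandbox
        ("5b931698bad6", "473252df85f7"),  # calico-apiserver-65cb54b74c-8d9jm sandbox
        ("e8c36c8f56c9", "de343060d76c"),  # coredns container inside the coredns sandbox
    ]

    by_socket = defaultdict(list)
    for shim_id, socket in events:
        by_socket[socket].append(shim_id)

    for socket, ids in by_socket.items():
        print(socket, "->", ids)
    # An application container reuses the shim socket of its pod sandbox.
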
Sep 16 04:26:03.461583 systemd-networkd[1820]: cali33369db9781: Gained IPv6LL Sep 16 04:26:03.586578 containerd[2008]: time="2025-09-16T04:26:03.586494028Z" level=info msg="StartContainer for \"e8c36c8f56c99fc57eb4738d011f694f9ef2eec3e043823afc6c447097746a5a\" returns successfully" Sep 16 04:26:03.807833 systemd-networkd[1820]: cali0a35d63087f: Link UP Sep 16 04:26:03.810510 systemd-networkd[1820]: cali0a35d63087f: Gained carrier Sep 16 04:26:03.903043 containerd[2008]: 2025-09-16 04:26:03.537 [INFO][5143] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--31--172-k8s-calico--kube--controllers--8586679c6f--w2bsg-eth0 calico-kube-controllers-8586679c6f- calico-system db8ec7a3-7a10-45dd-be69-940376ffb5af 862 0 2025-09-16 04:25:39 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:8586679c6f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ip-172-31-31-172 calico-kube-controllers-8586679c6f-w2bsg eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali0a35d63087f [] [] }} ContainerID="d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca" Namespace="calico-system" Pod="calico-kube-controllers-8586679c6f-w2bsg" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--kube--controllers--8586679c6f--w2bsg-" Sep 16 04:26:03.903043 containerd[2008]: 2025-09-16 04:26:03.538 [INFO][5143] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca" Namespace="calico-system" Pod="calico-kube-controllers-8586679c6f-w2bsg" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--kube--controllers--8586679c6f--w2bsg-eth0" Sep 16 04:26:03.903043 containerd[2008]: 2025-09-16 04:26:03.646 [INFO][5172] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca" HandleID="k8s-pod-network.d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca" Workload="ip--172--31--31--172-k8s-calico--kube--controllers--8586679c6f--w2bsg-eth0" Sep 16 04:26:03.903043 containerd[2008]: 2025-09-16 04:26:03.647 [INFO][5172] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca" HandleID="k8s-pod-network.d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca" Workload="ip--172--31--31--172-k8s-calico--kube--controllers--8586679c6f--w2bsg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003660d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-31-172", "pod":"calico-kube-controllers-8586679c6f-w2bsg", "timestamp":"2025-09-16 04:26:03.64691692 +0000 UTC"}, Hostname:"ip-172-31-31-172", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:26:03.903043 containerd[2008]: 2025-09-16 04:26:03.647 [INFO][5172] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:26:03.903043 containerd[2008]: 2025-09-16 04:26:03.647 [INFO][5172] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 04:26:03.903043 containerd[2008]: 2025-09-16 04:26:03.647 [INFO][5172] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-31-172' Sep 16 04:26:03.903043 containerd[2008]: 2025-09-16 04:26:03.677 [INFO][5172] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca" host="ip-172-31-31-172" Sep 16 04:26:03.903043 containerd[2008]: 2025-09-16 04:26:03.688 [INFO][5172] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-31-172" Sep 16 04:26:03.903043 containerd[2008]: 2025-09-16 04:26:03.701 [INFO][5172] ipam/ipam.go 511: Trying affinity for 192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:03.903043 containerd[2008]: 2025-09-16 04:26:03.709 [INFO][5172] ipam/ipam.go 158: Attempting to load block cidr=192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:03.903043 containerd[2008]: 2025-09-16 04:26:03.719 [INFO][5172] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:03.903043 containerd[2008]: 2025-09-16 04:26:03.721 [INFO][5172] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.121.0/26 handle="k8s-pod-network.d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca" host="ip-172-31-31-172" Sep 16 04:26:03.903043 containerd[2008]: 2025-09-16 04:26:03.729 [INFO][5172] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca Sep 16 04:26:03.903043 containerd[2008]: 2025-09-16 04:26:03.747 [INFO][5172] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.121.0/26 handle="k8s-pod-network.d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca" host="ip-172-31-31-172" Sep 16 04:26:03.903043 containerd[2008]: 2025-09-16 04:26:03.769 [INFO][5172] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.121.4/26] block=192.168.121.0/26 handle="k8s-pod-network.d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca" host="ip-172-31-31-172" Sep 16 04:26:03.903043 containerd[2008]: 2025-09-16 04:26:03.769 [INFO][5172] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.121.4/26] handle="k8s-pod-network.d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca" host="ip-172-31-31-172" Sep 16 04:26:03.903043 containerd[2008]: 2025-09-16 04:26:03.769 [INFO][5172] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 04:26:03.903043 containerd[2008]: 2025-09-16 04:26:03.769 [INFO][5172] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.121.4/26] IPv6=[] ContainerID="d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca" HandleID="k8s-pod-network.d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca" Workload="ip--172--31--31--172-k8s-calico--kube--controllers--8586679c6f--w2bsg-eth0" Sep 16 04:26:03.906967 containerd[2008]: 2025-09-16 04:26:03.783 [INFO][5143] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca" Namespace="calico-system" Pod="calico-kube-controllers-8586679c6f-w2bsg" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--kube--controllers--8586679c6f--w2bsg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--172-k8s-calico--kube--controllers--8586679c6f--w2bsg-eth0", GenerateName:"calico-kube-controllers-8586679c6f-", Namespace:"calico-system", SelfLink:"", UID:"db8ec7a3-7a10-45dd-be69-940376ffb5af", ResourceVersion:"862", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 25, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8586679c6f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-172", ContainerID:"", Pod:"calico-kube-controllers-8586679c6f-w2bsg", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.121.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0a35d63087f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:26:03.906967 containerd[2008]: 2025-09-16 04:26:03.783 [INFO][5143] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.121.4/32] ContainerID="d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca" Namespace="calico-system" Pod="calico-kube-controllers-8586679c6f-w2bsg" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--kube--controllers--8586679c6f--w2bsg-eth0" Sep 16 04:26:03.906967 containerd[2008]: 2025-09-16 04:26:03.783 [INFO][5143] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0a35d63087f ContainerID="d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca" Namespace="calico-system" Pod="calico-kube-controllers-8586679c6f-w2bsg" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--kube--controllers--8586679c6f--w2bsg-eth0" Sep 16 04:26:03.906967 containerd[2008]: 2025-09-16 04:26:03.810 [INFO][5143] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca" Namespace="calico-system" Pod="calico-kube-controllers-8586679c6f-w2bsg" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--kube--controllers--8586679c6f--w2bsg-eth0" Sep 16 04:26:03.906967 containerd[2008]: 
2025-09-16 04:26:03.813 [INFO][5143] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca" Namespace="calico-system" Pod="calico-kube-controllers-8586679c6f-w2bsg" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--kube--controllers--8586679c6f--w2bsg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--172-k8s-calico--kube--controllers--8586679c6f--w2bsg-eth0", GenerateName:"calico-kube-controllers-8586679c6f-", Namespace:"calico-system", SelfLink:"", UID:"db8ec7a3-7a10-45dd-be69-940376ffb5af", ResourceVersion:"862", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 25, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8586679c6f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-172", ContainerID:"d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca", Pod:"calico-kube-controllers-8586679c6f-w2bsg", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.121.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0a35d63087f", MAC:"9a:63:70:3d:59:a6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:26:03.906967 containerd[2008]: 2025-09-16 04:26:03.894 [INFO][5143] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca" Namespace="calico-system" Pod="calico-kube-controllers-8586679c6f-w2bsg" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--kube--controllers--8586679c6f--w2bsg-eth0" Sep 16 04:26:03.915368 containerd[2008]: time="2025-09-16T04:26:03.915006606Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:03.917416 containerd[2008]: time="2025-09-16T04:26:03.917344446Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 16 04:26:03.920567 containerd[2008]: time="2025-09-16T04:26:03.920478618Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:03.930706 kubelet[3544]: I0916 04:26:03.930463 3544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-dbnkd" podStartSLOduration=47.93043731 podStartE2EDuration="47.93043731s" podCreationTimestamp="2025-09-16 04:25:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:26:03.924385626 +0000 UTC m=+48.792029380" watchObservedRunningTime="2025-09-16 
04:26:03.93043731 +0000 UTC m=+48.798081064" Sep 16 04:26:03.932260 containerd[2008]: time="2025-09-16T04:26:03.931385718Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:03.939438 containerd[2008]: time="2025-09-16T04:26:03.939356718Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 1.599710936s" Sep 16 04:26:03.939438 containerd[2008]: time="2025-09-16T04:26:03.939423474Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 16 04:26:03.946184 containerd[2008]: time="2025-09-16T04:26:03.946138614Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 16 04:26:03.968147 containerd[2008]: time="2025-09-16T04:26:03.967656378Z" level=info msg="CreateContainer within sandbox \"8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 16 04:26:03.992800 containerd[2008]: time="2025-09-16T04:26:03.992651514Z" level=info msg="connecting to shim d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca" address="unix:///run/containerd/s/1256648111d8ce679a8109afdb6cd469e05954e9017d5e83e077e76fac705dfd" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:26:04.021768 containerd[2008]: time="2025-09-16T04:26:04.019410458Z" level=info msg="Container a24c7940fca8d401bbf66168b608185cfa38e3a3ee19427db1b465cf4ee22c34: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:26:04.036570 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2187879180.mount: Deactivated successfully. Sep 16 04:26:04.073637 containerd[2008]: time="2025-09-16T04:26:04.072815930Z" level=info msg="CreateContainer within sandbox \"8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"a24c7940fca8d401bbf66168b608185cfa38e3a3ee19427db1b465cf4ee22c34\"" Sep 16 04:26:04.077975 containerd[2008]: time="2025-09-16T04:26:04.077904650Z" level=info msg="StartContainer for \"a24c7940fca8d401bbf66168b608185cfa38e3a3ee19427db1b465cf4ee22c34\"" Sep 16 04:26:04.085491 containerd[2008]: time="2025-09-16T04:26:04.085408022Z" level=info msg="connecting to shim a24c7940fca8d401bbf66168b608185cfa38e3a3ee19427db1b465cf4ee22c34" address="unix:///run/containerd/s/cb0a71c1f31d843e676616859d4469e1bc17fbc7ffa1d0a52e1713ab6499c5d6" protocol=ttrpc version=3 Sep 16 04:26:04.132265 systemd[1]: Started cri-containerd-d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca.scope - libcontainer container d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca. Sep 16 04:26:04.167277 systemd-networkd[1820]: cali47778079837: Gained IPv6LL Sep 16 04:26:04.187438 systemd[1]: Started cri-containerd-a24c7940fca8d401bbf66168b608185cfa38e3a3ee19427db1b465cf4ee22c34.scope - libcontainer container a24c7940fca8d401bbf66168b608185cfa38e3a3ee19427db1b465cf4ee22c34. 
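
The reported whisker image pull time can be cross-checked against the journal itself: containerd logged the PullImage request at 04:26:02.339576818Z and the "Pulled image … in 1.599710936s" result at 04:26:03.939356718Z, so the reported duration lines up with the gap between those two events. A quick sanity check, not part of the journal (nanoseconds rounded to microseconds):

    from datetime import datetime, timezone

    pull_requested = datetime(2025, 9, 16, 4, 26, 2, 339577, tzinfo=timezone.utc)  # PullImage logged
    pull_returned  = datetime(2025, 9, 16, 4, 26, 3, 939357, tzinfo=timezone.utc)  # "Pulled image ..." logged

    elapsed = (pull_returned - pull_requested).total_seconds()
    print(f"gap between the two entries: {elapsed:.3f}s")  # ~1.600s, in line with the reported 1.599710936s
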
Sep 16 04:26:04.373848 containerd[2008]: time="2025-09-16T04:26:04.373650436Z" level=info msg="StartContainer for \"a24c7940fca8d401bbf66168b608185cfa38e3a3ee19427db1b465cf4ee22c34\" returns successfully" Sep 16 04:26:04.380622 containerd[2008]: time="2025-09-16T04:26:04.380543488Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-hlwrv,Uid:4ab014bc-cb62-46ec-af11-d77cb6a14351,Namespace:calico-system,Attempt:0,}" Sep 16 04:26:04.386658 containerd[2008]: time="2025-09-16T04:26:04.386054512Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-65cb54b74c-f9hzv,Uid:23547634-dd5f-4479-bfc2-8034339f6a17,Namespace:calico-apiserver,Attempt:0,}" Sep 16 04:26:04.523217 containerd[2008]: time="2025-09-16T04:26:04.523048937Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8586679c6f-w2bsg,Uid:db8ec7a3-7a10-45dd-be69-940376ffb5af,Namespace:calico-system,Attempt:0,} returns sandbox id \"d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca\"" Sep 16 04:26:04.769527 systemd-networkd[1820]: calif20c748e23f: Link UP Sep 16 04:26:04.772348 systemd-networkd[1820]: calif20c748e23f: Gained carrier Sep 16 04:26:04.831574 containerd[2008]: 2025-09-16 04:26:04.593 [INFO][5267] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--31--172-k8s-goldmane--54d579b49d--hlwrv-eth0 goldmane-54d579b49d- calico-system 4ab014bc-cb62-46ec-af11-d77cb6a14351 868 0 2025-09-16 04:25:38 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ip-172-31-31-172 goldmane-54d579b49d-hlwrv eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calif20c748e23f [] [] }} ContainerID="4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d" Namespace="calico-system" Pod="goldmane-54d579b49d-hlwrv" WorkloadEndpoint="ip--172--31--31--172-k8s-goldmane--54d579b49d--hlwrv-" Sep 16 04:26:04.831574 containerd[2008]: 2025-09-16 04:26:04.594 [INFO][5267] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d" Namespace="calico-system" Pod="goldmane-54d579b49d-hlwrv" WorkloadEndpoint="ip--172--31--31--172-k8s-goldmane--54d579b49d--hlwrv-eth0" Sep 16 04:26:04.831574 containerd[2008]: 2025-09-16 04:26:04.683 [INFO][5303] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d" HandleID="k8s-pod-network.4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d" Workload="ip--172--31--31--172-k8s-goldmane--54d579b49d--hlwrv-eth0" Sep 16 04:26:04.831574 containerd[2008]: 2025-09-16 04:26:04.683 [INFO][5303] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d" HandleID="k8s-pod-network.4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d" Workload="ip--172--31--31--172-k8s-goldmane--54d579b49d--hlwrv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000330140), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-31-172", "pod":"goldmane-54d579b49d-hlwrv", "timestamp":"2025-09-16 04:26:04.683130821 +0000 UTC"}, Hostname:"ip-172-31-31-172", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, 
MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:26:04.831574 containerd[2008]: 2025-09-16 04:26:04.683 [INFO][5303] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:26:04.831574 containerd[2008]: 2025-09-16 04:26:04.683 [INFO][5303] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 04:26:04.831574 containerd[2008]: 2025-09-16 04:26:04.683 [INFO][5303] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-31-172' Sep 16 04:26:04.831574 containerd[2008]: 2025-09-16 04:26:04.701 [INFO][5303] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d" host="ip-172-31-31-172" Sep 16 04:26:04.831574 containerd[2008]: 2025-09-16 04:26:04.709 [INFO][5303] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-31-172" Sep 16 04:26:04.831574 containerd[2008]: 2025-09-16 04:26:04.717 [INFO][5303] ipam/ipam.go 511: Trying affinity for 192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:04.831574 containerd[2008]: 2025-09-16 04:26:04.721 [INFO][5303] ipam/ipam.go 158: Attempting to load block cidr=192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:04.831574 containerd[2008]: 2025-09-16 04:26:04.726 [INFO][5303] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:04.831574 containerd[2008]: 2025-09-16 04:26:04.726 [INFO][5303] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.121.0/26 handle="k8s-pod-network.4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d" host="ip-172-31-31-172" Sep 16 04:26:04.831574 containerd[2008]: 2025-09-16 04:26:04.730 [INFO][5303] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d Sep 16 04:26:04.831574 containerd[2008]: 2025-09-16 04:26:04.738 [INFO][5303] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.121.0/26 handle="k8s-pod-network.4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d" host="ip-172-31-31-172" Sep 16 04:26:04.831574 containerd[2008]: 2025-09-16 04:26:04.752 [INFO][5303] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.121.5/26] block=192.168.121.0/26 handle="k8s-pod-network.4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d" host="ip-172-31-31-172" Sep 16 04:26:04.831574 containerd[2008]: 2025-09-16 04:26:04.752 [INFO][5303] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.121.5/26] handle="k8s-pod-network.4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d" host="ip-172-31-31-172" Sep 16 04:26:04.831574 containerd[2008]: 2025-09-16 04:26:04.753 [INFO][5303] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 04:26:04.831574 containerd[2008]: 2025-09-16 04:26:04.753 [INFO][5303] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.121.5/26] IPv6=[] ContainerID="4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d" HandleID="k8s-pod-network.4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d" Workload="ip--172--31--31--172-k8s-goldmane--54d579b49d--hlwrv-eth0" Sep 16 04:26:04.835150 containerd[2008]: 2025-09-16 04:26:04.759 [INFO][5267] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d" Namespace="calico-system" Pod="goldmane-54d579b49d-hlwrv" WorkloadEndpoint="ip--172--31--31--172-k8s-goldmane--54d579b49d--hlwrv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--172-k8s-goldmane--54d579b49d--hlwrv-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"4ab014bc-cb62-46ec-af11-d77cb6a14351", ResourceVersion:"868", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 25, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-172", ContainerID:"", Pod:"goldmane-54d579b49d-hlwrv", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.121.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calif20c748e23f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:26:04.835150 containerd[2008]: 2025-09-16 04:26:04.760 [INFO][5267] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.121.5/32] ContainerID="4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d" Namespace="calico-system" Pod="goldmane-54d579b49d-hlwrv" WorkloadEndpoint="ip--172--31--31--172-k8s-goldmane--54d579b49d--hlwrv-eth0" Sep 16 04:26:04.835150 containerd[2008]: 2025-09-16 04:26:04.760 [INFO][5267] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif20c748e23f ContainerID="4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d" Namespace="calico-system" Pod="goldmane-54d579b49d-hlwrv" WorkloadEndpoint="ip--172--31--31--172-k8s-goldmane--54d579b49d--hlwrv-eth0" Sep 16 04:26:04.835150 containerd[2008]: 2025-09-16 04:26:04.777 [INFO][5267] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d" Namespace="calico-system" Pod="goldmane-54d579b49d-hlwrv" WorkloadEndpoint="ip--172--31--31--172-k8s-goldmane--54d579b49d--hlwrv-eth0" Sep 16 04:26:04.835150 containerd[2008]: 2025-09-16 04:26:04.780 [INFO][5267] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d" Namespace="calico-system" Pod="goldmane-54d579b49d-hlwrv" 
WorkloadEndpoint="ip--172--31--31--172-k8s-goldmane--54d579b49d--hlwrv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--172-k8s-goldmane--54d579b49d--hlwrv-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"4ab014bc-cb62-46ec-af11-d77cb6a14351", ResourceVersion:"868", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 25, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-172", ContainerID:"4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d", Pod:"goldmane-54d579b49d-hlwrv", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.121.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calif20c748e23f", MAC:"8e:f3:ab:ba:22:63", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:26:04.835150 containerd[2008]: 2025-09-16 04:26:04.823 [INFO][5267] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d" Namespace="calico-system" Pod="goldmane-54d579b49d-hlwrv" WorkloadEndpoint="ip--172--31--31--172-k8s-goldmane--54d579b49d--hlwrv-eth0" Sep 16 04:26:04.869133 systemd-networkd[1820]: cali18c92766682: Gained IPv6LL Sep 16 04:26:04.934393 containerd[2008]: time="2025-09-16T04:26:04.934164667Z" level=info msg="connecting to shim 4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d" address="unix:///run/containerd/s/4b27a8ccc7281b0c7df23f6f835c1bb6f7bd6839d3df80b130d486e22c707a1c" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:26:05.002179 systemd-networkd[1820]: calia56a664c1e1: Link UP Sep 16 04:26:05.005492 systemd-networkd[1820]: calia56a664c1e1: Gained carrier Sep 16 04:26:05.056425 containerd[2008]: 2025-09-16 04:26:04.603 [INFO][5283] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--f9hzv-eth0 calico-apiserver-65cb54b74c- calico-apiserver 23547634-dd5f-4479-bfc2-8034339f6a17 863 0 2025-09-16 04:25:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:65cb54b74c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-31-172 calico-apiserver-65cb54b74c-f9hzv eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia56a664c1e1 [] [] }} ContainerID="7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f" Namespace="calico-apiserver" Pod="calico-apiserver-65cb54b74c-f9hzv" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--f9hzv-" Sep 16 04:26:05.056425 containerd[2008]: 
2025-09-16 04:26:04.604 [INFO][5283] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f" Namespace="calico-apiserver" Pod="calico-apiserver-65cb54b74c-f9hzv" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--f9hzv-eth0" Sep 16 04:26:05.056425 containerd[2008]: 2025-09-16 04:26:04.689 [INFO][5309] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f" HandleID="k8s-pod-network.7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f" Workload="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--f9hzv-eth0" Sep 16 04:26:05.056425 containerd[2008]: 2025-09-16 04:26:04.689 [INFO][5309] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f" HandleID="k8s-pod-network.7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f" Workload="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--f9hzv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000366040), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-31-172", "pod":"calico-apiserver-65cb54b74c-f9hzv", "timestamp":"2025-09-16 04:26:04.689444873 +0000 UTC"}, Hostname:"ip-172-31-31-172", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:26:05.056425 containerd[2008]: 2025-09-16 04:26:04.689 [INFO][5309] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:26:05.056425 containerd[2008]: 2025-09-16 04:26:04.753 [INFO][5309] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 04:26:05.056425 containerd[2008]: 2025-09-16 04:26:04.753 [INFO][5309] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-31-172' Sep 16 04:26:05.056425 containerd[2008]: 2025-09-16 04:26:04.810 [INFO][5309] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f" host="ip-172-31-31-172" Sep 16 04:26:05.056425 containerd[2008]: 2025-09-16 04:26:04.825 [INFO][5309] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-31-172" Sep 16 04:26:05.056425 containerd[2008]: 2025-09-16 04:26:04.851 [INFO][5309] ipam/ipam.go 511: Trying affinity for 192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:05.056425 containerd[2008]: 2025-09-16 04:26:04.860 [INFO][5309] ipam/ipam.go 158: Attempting to load block cidr=192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:05.056425 containerd[2008]: 2025-09-16 04:26:04.878 [INFO][5309] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:05.056425 containerd[2008]: 2025-09-16 04:26:04.879 [INFO][5309] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.121.0/26 handle="k8s-pod-network.7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f" host="ip-172-31-31-172" Sep 16 04:26:05.056425 containerd[2008]: 2025-09-16 04:26:04.885 [INFO][5309] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f Sep 16 04:26:05.056425 containerd[2008]: 2025-09-16 04:26:04.919 [INFO][5309] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.121.0/26 handle="k8s-pod-network.7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f" host="ip-172-31-31-172" Sep 16 04:26:05.056425 containerd[2008]: 2025-09-16 04:26:04.950 [INFO][5309] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.121.6/26] block=192.168.121.0/26 handle="k8s-pod-network.7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f" host="ip-172-31-31-172" Sep 16 04:26:05.056425 containerd[2008]: 2025-09-16 04:26:04.950 [INFO][5309] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.121.6/26] handle="k8s-pod-network.7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f" host="ip-172-31-31-172" Sep 16 04:26:05.056425 containerd[2008]: 2025-09-16 04:26:04.950 [INFO][5309] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 04:26:05.056425 containerd[2008]: 2025-09-16 04:26:04.950 [INFO][5309] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.121.6/26] IPv6=[] ContainerID="7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f" HandleID="k8s-pod-network.7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f" Workload="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--f9hzv-eth0" Sep 16 04:26:05.060270 containerd[2008]: 2025-09-16 04:26:04.964 [INFO][5283] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f" Namespace="calico-apiserver" Pod="calico-apiserver-65cb54b74c-f9hzv" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--f9hzv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--f9hzv-eth0", GenerateName:"calico-apiserver-65cb54b74c-", Namespace:"calico-apiserver", SelfLink:"", UID:"23547634-dd5f-4479-bfc2-8034339f6a17", ResourceVersion:"863", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"65cb54b74c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-172", ContainerID:"", Pod:"calico-apiserver-65cb54b74c-f9hzv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.121.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia56a664c1e1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:26:05.060270 containerd[2008]: 2025-09-16 04:26:04.973 [INFO][5283] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.121.6/32] ContainerID="7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f" Namespace="calico-apiserver" Pod="calico-apiserver-65cb54b74c-f9hzv" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--f9hzv-eth0" Sep 16 04:26:05.060270 containerd[2008]: 2025-09-16 04:26:04.974 [INFO][5283] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia56a664c1e1 ContainerID="7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f" Namespace="calico-apiserver" Pod="calico-apiserver-65cb54b74c-f9hzv" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--f9hzv-eth0" Sep 16 04:26:05.060270 containerd[2008]: 2025-09-16 04:26:05.009 [INFO][5283] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f" Namespace="calico-apiserver" Pod="calico-apiserver-65cb54b74c-f9hzv" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--f9hzv-eth0" Sep 16 04:26:05.060270 containerd[2008]: 2025-09-16 04:26:05.012 [INFO][5283] cni-plugin/k8s.go 446: Added Mac, interface 
name, and active container ID to endpoint ContainerID="7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f" Namespace="calico-apiserver" Pod="calico-apiserver-65cb54b74c-f9hzv" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--f9hzv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--f9hzv-eth0", GenerateName:"calico-apiserver-65cb54b74c-", Namespace:"calico-apiserver", SelfLink:"", UID:"23547634-dd5f-4479-bfc2-8034339f6a17", ResourceVersion:"863", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"65cb54b74c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-172", ContainerID:"7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f", Pod:"calico-apiserver-65cb54b74c-f9hzv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.121.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia56a664c1e1", MAC:"4e:59:ae:27:e3:4d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:26:05.060270 containerd[2008]: 2025-09-16 04:26:05.049 [INFO][5283] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f" Namespace="calico-apiserver" Pod="calico-apiserver-65cb54b74c-f9hzv" WorkloadEndpoint="ip--172--31--31--172-k8s-calico--apiserver--65cb54b74c--f9hzv-eth0" Sep 16 04:26:05.062343 systemd[1]: Started cri-containerd-4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d.scope - libcontainer container 4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d. Sep 16 04:26:05.187667 containerd[2008]: time="2025-09-16T04:26:05.187599028Z" level=info msg="connecting to shim 7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f" address="unix:///run/containerd/s/ab36b42161c22917ea68812d860bbac87b6f0c201fdbf93bec97c0a45237e432" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:26:05.284062 systemd[1]: Started cri-containerd-7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f.scope - libcontainer container 7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f. 
Sep 16 04:26:05.393278 containerd[2008]: time="2025-09-16T04:26:05.393156737Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-8vlcr,Uid:4961fb5b-738f-4ada-b144-df52d29e76d0,Namespace:kube-system,Attempt:0,}" Sep 16 04:26:05.395183 containerd[2008]: time="2025-09-16T04:26:05.395101073Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s2zqb,Uid:faaa3987-239e-436b-9a5d-37bf1f542a64,Namespace:calico-system,Attempt:0,}" Sep 16 04:26:05.396544 containerd[2008]: time="2025-09-16T04:26:05.396471653Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-hlwrv,Uid:4ab014bc-cb62-46ec-af11-d77cb6a14351,Namespace:calico-system,Attempt:0,} returns sandbox id \"4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d\"" Sep 16 04:26:05.831468 systemd-networkd[1820]: cali0a35d63087f: Gained IPv6LL Sep 16 04:26:06.086159 systemd[1]: Started sshd@9-172.31.31.172:22-147.75.109.163:38842.service - OpenSSH per-connection server daemon (147.75.109.163:38842). Sep 16 04:26:06.188532 containerd[2008]: time="2025-09-16T04:26:06.188447213Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-65cb54b74c-f9hzv,Uid:23547634-dd5f-4479-bfc2-8034339f6a17,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f\"" Sep 16 04:26:06.304330 systemd-networkd[1820]: cali8c24896254e: Link UP Sep 16 04:26:06.308488 systemd-networkd[1820]: cali8c24896254e: Gained carrier Sep 16 04:26:06.376027 containerd[2008]: 2025-09-16 04:26:05.683 [INFO][5423] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--31--172-k8s-csi--node--driver--s2zqb-eth0 csi-node-driver- calico-system faaa3987-239e-436b-9a5d-37bf1f542a64 734 0 2025-09-16 04:25:39 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ip-172-31-31-172 csi-node-driver-s2zqb eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali8c24896254e [] [] }} ContainerID="daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4" Namespace="calico-system" Pod="csi-node-driver-s2zqb" WorkloadEndpoint="ip--172--31--31--172-k8s-csi--node--driver--s2zqb-" Sep 16 04:26:06.376027 containerd[2008]: 2025-09-16 04:26:05.686 [INFO][5423] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4" Namespace="calico-system" Pod="csi-node-driver-s2zqb" WorkloadEndpoint="ip--172--31--31--172-k8s-csi--node--driver--s2zqb-eth0" Sep 16 04:26:06.376027 containerd[2008]: 2025-09-16 04:26:05.956 [INFO][5456] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4" HandleID="k8s-pod-network.daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4" Workload="ip--172--31--31--172-k8s-csi--node--driver--s2zqb-eth0" Sep 16 04:26:06.376027 containerd[2008]: 2025-09-16 04:26:05.956 [INFO][5456] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4" HandleID="k8s-pod-network.daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4" 
Workload="ip--172--31--31--172-k8s-csi--node--driver--s2zqb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000315920), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-31-172", "pod":"csi-node-driver-s2zqb", "timestamp":"2025-09-16 04:26:05.955992992 +0000 UTC"}, Hostname:"ip-172-31-31-172", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:26:06.376027 containerd[2008]: 2025-09-16 04:26:05.956 [INFO][5456] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:26:06.376027 containerd[2008]: 2025-09-16 04:26:05.956 [INFO][5456] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 04:26:06.376027 containerd[2008]: 2025-09-16 04:26:05.956 [INFO][5456] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-31-172' Sep 16 04:26:06.376027 containerd[2008]: 2025-09-16 04:26:06.049 [INFO][5456] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4" host="ip-172-31-31-172" Sep 16 04:26:06.376027 containerd[2008]: 2025-09-16 04:26:06.085 [INFO][5456] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-31-172" Sep 16 04:26:06.376027 containerd[2008]: 2025-09-16 04:26:06.131 [INFO][5456] ipam/ipam.go 511: Trying affinity for 192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:06.376027 containerd[2008]: 2025-09-16 04:26:06.148 [INFO][5456] ipam/ipam.go 158: Attempting to load block cidr=192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:06.376027 containerd[2008]: 2025-09-16 04:26:06.192 [INFO][5456] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:06.376027 containerd[2008]: 2025-09-16 04:26:06.193 [INFO][5456] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.121.0/26 handle="k8s-pod-network.daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4" host="ip-172-31-31-172" Sep 16 04:26:06.376027 containerd[2008]: 2025-09-16 04:26:06.202 [INFO][5456] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4 Sep 16 04:26:06.376027 containerd[2008]: 2025-09-16 04:26:06.226 [INFO][5456] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.121.0/26 handle="k8s-pod-network.daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4" host="ip-172-31-31-172" Sep 16 04:26:06.376027 containerd[2008]: 2025-09-16 04:26:06.262 [INFO][5456] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.121.7/26] block=192.168.121.0/26 handle="k8s-pod-network.daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4" host="ip-172-31-31-172" Sep 16 04:26:06.376027 containerd[2008]: 2025-09-16 04:26:06.262 [INFO][5456] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.121.7/26] handle="k8s-pod-network.daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4" host="ip-172-31-31-172" Sep 16 04:26:06.376027 containerd[2008]: 2025-09-16 04:26:06.266 [INFO][5456] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 04:26:06.376027 containerd[2008]: 2025-09-16 04:26:06.268 [INFO][5456] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.121.7/26] IPv6=[] ContainerID="daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4" HandleID="k8s-pod-network.daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4" Workload="ip--172--31--31--172-k8s-csi--node--driver--s2zqb-eth0" Sep 16 04:26:06.378579 containerd[2008]: 2025-09-16 04:26:06.288 [INFO][5423] cni-plugin/k8s.go 418: Populated endpoint ContainerID="daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4" Namespace="calico-system" Pod="csi-node-driver-s2zqb" WorkloadEndpoint="ip--172--31--31--172-k8s-csi--node--driver--s2zqb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--172-k8s-csi--node--driver--s2zqb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"faaa3987-239e-436b-9a5d-37bf1f542a64", ResourceVersion:"734", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 25, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-172", ContainerID:"", Pod:"csi-node-driver-s2zqb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.121.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8c24896254e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:26:06.378579 containerd[2008]: 2025-09-16 04:26:06.289 [INFO][5423] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.121.7/32] ContainerID="daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4" Namespace="calico-system" Pod="csi-node-driver-s2zqb" WorkloadEndpoint="ip--172--31--31--172-k8s-csi--node--driver--s2zqb-eth0" Sep 16 04:26:06.378579 containerd[2008]: 2025-09-16 04:26:06.289 [INFO][5423] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8c24896254e ContainerID="daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4" Namespace="calico-system" Pod="csi-node-driver-s2zqb" WorkloadEndpoint="ip--172--31--31--172-k8s-csi--node--driver--s2zqb-eth0" Sep 16 04:26:06.378579 containerd[2008]: 2025-09-16 04:26:06.312 [INFO][5423] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4" Namespace="calico-system" Pod="csi-node-driver-s2zqb" WorkloadEndpoint="ip--172--31--31--172-k8s-csi--node--driver--s2zqb-eth0" Sep 16 04:26:06.378579 containerd[2008]: 2025-09-16 04:26:06.315 [INFO][5423] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4" 
Namespace="calico-system" Pod="csi-node-driver-s2zqb" WorkloadEndpoint="ip--172--31--31--172-k8s-csi--node--driver--s2zqb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--172-k8s-csi--node--driver--s2zqb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"faaa3987-239e-436b-9a5d-37bf1f542a64", ResourceVersion:"734", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 25, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-172", ContainerID:"daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4", Pod:"csi-node-driver-s2zqb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.121.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8c24896254e", MAC:"7a:a6:11:bc:50:c5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:26:06.378579 containerd[2008]: 2025-09-16 04:26:06.362 [INFO][5423] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4" Namespace="calico-system" Pod="csi-node-driver-s2zqb" WorkloadEndpoint="ip--172--31--31--172-k8s-csi--node--driver--s2zqb-eth0" Sep 16 04:26:06.394807 sshd[5477]: Accepted publickey for core from 147.75.109.163 port 38842 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:26:06.401641 sshd-session[5477]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:26:06.421680 systemd-logind[1971]: New session 10 of user core. Sep 16 04:26:06.425402 systemd[1]: Started session-10.scope - Session 10 of User core. 
Sep 16 04:26:06.532707 containerd[2008]: time="2025-09-16T04:26:06.532573291Z" level=info msg="connecting to shim daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4" address="unix:///run/containerd/s/1cac349f69a88d044b56da0854499220f6b4615498b23c01227e61af86898036" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:26:06.599168 systemd-networkd[1820]: cali85b6cd8f672: Link UP Sep 16 04:26:06.616860 systemd-networkd[1820]: cali85b6cd8f672: Gained carrier Sep 16 04:26:06.706641 containerd[2008]: 2025-09-16 04:26:05.734 [INFO][5424] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--31--172-k8s-coredns--674b8bbfcf--8vlcr-eth0 coredns-674b8bbfcf- kube-system 4961fb5b-738f-4ada-b144-df52d29e76d0 864 0 2025-09-16 04:25:16 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-31-172 coredns-674b8bbfcf-8vlcr eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali85b6cd8f672 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9" Namespace="kube-system" Pod="coredns-674b8bbfcf-8vlcr" WorkloadEndpoint="ip--172--31--31--172-k8s-coredns--674b8bbfcf--8vlcr-" Sep 16 04:26:06.706641 containerd[2008]: 2025-09-16 04:26:05.735 [INFO][5424] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9" Namespace="kube-system" Pod="coredns-674b8bbfcf-8vlcr" WorkloadEndpoint="ip--172--31--31--172-k8s-coredns--674b8bbfcf--8vlcr-eth0" Sep 16 04:26:06.706641 containerd[2008]: 2025-09-16 04:26:05.983 [INFO][5461] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9" HandleID="k8s-pod-network.a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9" Workload="ip--172--31--31--172-k8s-coredns--674b8bbfcf--8vlcr-eth0" Sep 16 04:26:06.706641 containerd[2008]: 2025-09-16 04:26:05.984 [INFO][5461] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9" HandleID="k8s-pod-network.a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9" Workload="ip--172--31--31--172-k8s-coredns--674b8bbfcf--8vlcr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000103e00), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-31-172", "pod":"coredns-674b8bbfcf-8vlcr", "timestamp":"2025-09-16 04:26:05.983698052 +0000 UTC"}, Hostname:"ip-172-31-31-172", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:26:06.706641 containerd[2008]: 2025-09-16 04:26:05.985 [INFO][5461] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:26:06.706641 containerd[2008]: 2025-09-16 04:26:06.264 [INFO][5461] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 04:26:06.706641 containerd[2008]: 2025-09-16 04:26:06.268 [INFO][5461] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-31-172' Sep 16 04:26:06.706641 containerd[2008]: 2025-09-16 04:26:06.337 [INFO][5461] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9" host="ip-172-31-31-172" Sep 16 04:26:06.706641 containerd[2008]: 2025-09-16 04:26:06.368 [INFO][5461] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-31-172" Sep 16 04:26:06.706641 containerd[2008]: 2025-09-16 04:26:06.405 [INFO][5461] ipam/ipam.go 511: Trying affinity for 192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:06.706641 containerd[2008]: 2025-09-16 04:26:06.429 [INFO][5461] ipam/ipam.go 158: Attempting to load block cidr=192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:06.706641 containerd[2008]: 2025-09-16 04:26:06.456 [INFO][5461] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.121.0/26 host="ip-172-31-31-172" Sep 16 04:26:06.706641 containerd[2008]: 2025-09-16 04:26:06.456 [INFO][5461] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.121.0/26 handle="k8s-pod-network.a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9" host="ip-172-31-31-172" Sep 16 04:26:06.706641 containerd[2008]: 2025-09-16 04:26:06.466 [INFO][5461] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9 Sep 16 04:26:06.706641 containerd[2008]: 2025-09-16 04:26:06.492 [INFO][5461] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.121.0/26 handle="k8s-pod-network.a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9" host="ip-172-31-31-172" Sep 16 04:26:06.706641 containerd[2008]: 2025-09-16 04:26:06.539 [INFO][5461] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.121.8/26] block=192.168.121.0/26 handle="k8s-pod-network.a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9" host="ip-172-31-31-172" Sep 16 04:26:06.706641 containerd[2008]: 2025-09-16 04:26:06.539 [INFO][5461] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.121.8/26] handle="k8s-pod-network.a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9" host="ip-172-31-31-172" Sep 16 04:26:06.706641 containerd[2008]: 2025-09-16 04:26:06.539 [INFO][5461] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 04:26:06.706641 containerd[2008]: 2025-09-16 04:26:06.539 [INFO][5461] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.121.8/26] IPv6=[] ContainerID="a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9" HandleID="k8s-pod-network.a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9" Workload="ip--172--31--31--172-k8s-coredns--674b8bbfcf--8vlcr-eth0" Sep 16 04:26:06.708724 containerd[2008]: 2025-09-16 04:26:06.559 [INFO][5424] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9" Namespace="kube-system" Pod="coredns-674b8bbfcf-8vlcr" WorkloadEndpoint="ip--172--31--31--172-k8s-coredns--674b8bbfcf--8vlcr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--172-k8s-coredns--674b8bbfcf--8vlcr-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"4961fb5b-738f-4ada-b144-df52d29e76d0", ResourceVersion:"864", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 25, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-172", ContainerID:"", Pod:"coredns-674b8bbfcf-8vlcr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.121.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali85b6cd8f672", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:26:06.708724 containerd[2008]: 2025-09-16 04:26:06.561 [INFO][5424] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.121.8/32] ContainerID="a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9" Namespace="kube-system" Pod="coredns-674b8bbfcf-8vlcr" WorkloadEndpoint="ip--172--31--31--172-k8s-coredns--674b8bbfcf--8vlcr-eth0" Sep 16 04:26:06.708724 containerd[2008]: 2025-09-16 04:26:06.562 [INFO][5424] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali85b6cd8f672 ContainerID="a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9" Namespace="kube-system" Pod="coredns-674b8bbfcf-8vlcr" WorkloadEndpoint="ip--172--31--31--172-k8s-coredns--674b8bbfcf--8vlcr-eth0" Sep 16 04:26:06.708724 containerd[2008]: 2025-09-16 04:26:06.630 [INFO][5424] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9" Namespace="kube-system" Pod="coredns-674b8bbfcf-8vlcr" 
WorkloadEndpoint="ip--172--31--31--172-k8s-coredns--674b8bbfcf--8vlcr-eth0" Sep 16 04:26:06.708724 containerd[2008]: 2025-09-16 04:26:06.633 [INFO][5424] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9" Namespace="kube-system" Pod="coredns-674b8bbfcf-8vlcr" WorkloadEndpoint="ip--172--31--31--172-k8s-coredns--674b8bbfcf--8vlcr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--172-k8s-coredns--674b8bbfcf--8vlcr-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"4961fb5b-738f-4ada-b144-df52d29e76d0", ResourceVersion:"864", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 25, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-172", ContainerID:"a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9", Pod:"coredns-674b8bbfcf-8vlcr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.121.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali85b6cd8f672", MAC:"06:7c:58:ef:9b:2b", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:26:06.708724 containerd[2008]: 2025-09-16 04:26:06.678 [INFO][5424] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9" Namespace="kube-system" Pod="coredns-674b8bbfcf-8vlcr" WorkloadEndpoint="ip--172--31--31--172-k8s-coredns--674b8bbfcf--8vlcr-eth0" Sep 16 04:26:06.724965 systemd-networkd[1820]: calia56a664c1e1: Gained IPv6LL Sep 16 04:26:06.769656 systemd[1]: Started cri-containerd-daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4.scope - libcontainer container daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4. 
Sep 16 04:26:06.788971 systemd-networkd[1820]: calif20c748e23f: Gained IPv6LL Sep 16 04:26:06.867224 containerd[2008]: time="2025-09-16T04:26:06.866653076Z" level=info msg="connecting to shim a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9" address="unix:///run/containerd/s/7cdeb3cf88976b5e110023f0dc96e44ac03fe6954f1e911e2dd3781625b4614f" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:26:06.972311 sshd[5494]: Connection closed by 147.75.109.163 port 38842 Sep 16 04:26:06.973237 sshd-session[5477]: pam_unix(sshd:session): session closed for user core Sep 16 04:26:06.989323 systemd[1]: sshd@9-172.31.31.172:22-147.75.109.163:38842.service: Deactivated successfully. Sep 16 04:26:06.997548 systemd[1]: session-10.scope: Deactivated successfully. Sep 16 04:26:07.004535 systemd-logind[1971]: Session 10 logged out. Waiting for processes to exit. Sep 16 04:26:07.012639 systemd-logind[1971]: Removed session 10. Sep 16 04:26:07.045916 systemd[1]: Started cri-containerd-a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9.scope - libcontainer container a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9. Sep 16 04:26:07.147436 containerd[2008]: time="2025-09-16T04:26:07.147288150Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s2zqb,Uid:faaa3987-239e-436b-9a5d-37bf1f542a64,Namespace:calico-system,Attempt:0,} returns sandbox id \"daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4\"" Sep 16 04:26:07.266293 containerd[2008]: time="2025-09-16T04:26:07.265993818Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-8vlcr,Uid:4961fb5b-738f-4ada-b144-df52d29e76d0,Namespace:kube-system,Attempt:0,} returns sandbox id \"a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9\"" Sep 16 04:26:07.283796 containerd[2008]: time="2025-09-16T04:26:07.283651638Z" level=info msg="CreateContainer within sandbox \"a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 16 04:26:07.320890 containerd[2008]: time="2025-09-16T04:26:07.319967203Z" level=info msg="Container 776774487209600c38bd45a85abdfdd466302ca90837b37540e67255944c0ab4: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:26:07.338093 containerd[2008]: time="2025-09-16T04:26:07.338033683Z" level=info msg="CreateContainer within sandbox \"a00ac1d2a3296ceceef9f6cdd10522eb8fc6473c76fbe38624818ffeec896db9\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"776774487209600c38bd45a85abdfdd466302ca90837b37540e67255944c0ab4\"" Sep 16 04:26:07.342546 containerd[2008]: time="2025-09-16T04:26:07.342478159Z" level=info msg="StartContainer for \"776774487209600c38bd45a85abdfdd466302ca90837b37540e67255944c0ab4\"" Sep 16 04:26:07.352396 containerd[2008]: time="2025-09-16T04:26:07.352199251Z" level=info msg="connecting to shim 776774487209600c38bd45a85abdfdd466302ca90837b37540e67255944c0ab4" address="unix:///run/containerd/s/7cdeb3cf88976b5e110023f0dc96e44ac03fe6954f1e911e2dd3781625b4614f" protocol=ttrpc version=3 Sep 16 04:26:07.434288 systemd[1]: Started cri-containerd-776774487209600c38bd45a85abdfdd466302ca90837b37540e67255944c0ab4.scope - libcontainer container 776774487209600c38bd45a85abdfdd466302ca90837b37540e67255944c0ab4. Sep 16 04:26:07.484171 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount434220511.mount: Deactivated successfully. 
Sep 16 04:26:07.494277 systemd-networkd[1820]: cali8c24896254e: Gained IPv6LL Sep 16 04:26:07.777288 containerd[2008]: time="2025-09-16T04:26:07.776980557Z" level=info msg="StartContainer for \"776774487209600c38bd45a85abdfdd466302ca90837b37540e67255944c0ab4\" returns successfully" Sep 16 04:26:07.977144 kubelet[3544]: I0916 04:26:07.977030 3544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-8vlcr" podStartSLOduration=51.977005162 podStartE2EDuration="51.977005162s" podCreationTimestamp="2025-09-16 04:25:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:26:07.97268959 +0000 UTC m=+52.840333332" watchObservedRunningTime="2025-09-16 04:26:07.977005162 +0000 UTC m=+52.844648940" Sep 16 04:26:08.005156 systemd-networkd[1820]: cali85b6cd8f672: Gained IPv6LL Sep 16 04:26:08.702765 containerd[2008]: time="2025-09-16T04:26:08.702671781Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:08.705170 containerd[2008]: time="2025-09-16T04:26:08.704778009Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 16 04:26:08.707700 containerd[2008]: time="2025-09-16T04:26:08.707637453Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:08.714237 containerd[2008]: time="2025-09-16T04:26:08.714165981Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:08.718392 containerd[2008]: time="2025-09-16T04:26:08.718159077Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 4.771771571s" Sep 16 04:26:08.718392 containerd[2008]: time="2025-09-16T04:26:08.718243293Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 16 04:26:08.725267 containerd[2008]: time="2025-09-16T04:26:08.723491553Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 16 04:26:08.734591 containerd[2008]: time="2025-09-16T04:26:08.734516782Z" level=info msg="CreateContainer within sandbox \"5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 16 04:26:08.754517 containerd[2008]: time="2025-09-16T04:26:08.754445314Z" level=info msg="Container 39ec797eab7a54dc1d80aaba2060adc9d64ae08b0782ce453e0e82713b8d8577: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:26:08.770226 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3571791131.mount: Deactivated successfully. 
Sep 16 04:26:08.780770 containerd[2008]: time="2025-09-16T04:26:08.780688762Z" level=info msg="CreateContainer within sandbox \"5b931698bad692a1ea7e3107f7ee3f7b4d89d0c9e7b33784ef673688c527abcc\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"39ec797eab7a54dc1d80aaba2060adc9d64ae08b0782ce453e0e82713b8d8577\"" Sep 16 04:26:08.782483 containerd[2008]: time="2025-09-16T04:26:08.782422258Z" level=info msg="StartContainer for \"39ec797eab7a54dc1d80aaba2060adc9d64ae08b0782ce453e0e82713b8d8577\"" Sep 16 04:26:08.787675 containerd[2008]: time="2025-09-16T04:26:08.787582090Z" level=info msg="connecting to shim 39ec797eab7a54dc1d80aaba2060adc9d64ae08b0782ce453e0e82713b8d8577" address="unix:///run/containerd/s/473252df85f7cf366eb6bc641ac77284715c10b9d11eee42f83429ba2edbc84c" protocol=ttrpc version=3 Sep 16 04:26:08.875038 systemd[1]: Started cri-containerd-39ec797eab7a54dc1d80aaba2060adc9d64ae08b0782ce453e0e82713b8d8577.scope - libcontainer container 39ec797eab7a54dc1d80aaba2060adc9d64ae08b0782ce453e0e82713b8d8577. Sep 16 04:26:08.989388 containerd[2008]: time="2025-09-16T04:26:08.989038271Z" level=info msg="StartContainer for \"39ec797eab7a54dc1d80aaba2060adc9d64ae08b0782ce453e0e82713b8d8577\" returns successfully" Sep 16 04:26:09.983890 kubelet[3544]: I0916 04:26:09.983113 3544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-65cb54b74c-8d9jm" podStartSLOduration=33.628643274 podStartE2EDuration="38.983088468s" podCreationTimestamp="2025-09-16 04:25:31 +0000 UTC" firstStartedPulling="2025-09-16 04:26:03.368615823 +0000 UTC m=+48.236259577" lastFinishedPulling="2025-09-16 04:26:08.723061029 +0000 UTC m=+53.590704771" observedRunningTime="2025-09-16 04:26:09.978052836 +0000 UTC m=+54.845696602" watchObservedRunningTime="2025-09-16 04:26:09.983088468 +0000 UTC m=+54.850732234" Sep 16 04:26:10.049169 ntpd[2190]: Listen normally on 6 vxlan.calico 192.168.121.0:123 Sep 16 04:26:10.051567 ntpd[2190]: 16 Sep 04:26:10 ntpd[2190]: Listen normally on 6 vxlan.calico 192.168.121.0:123 Sep 16 04:26:10.051567 ntpd[2190]: 16 Sep 04:26:10 ntpd[2190]: Listen normally on 7 vxlan.calico [fe80::64b8:5bff:fea8:b59c%4]:123 Sep 16 04:26:10.051567 ntpd[2190]: 16 Sep 04:26:10 ntpd[2190]: Listen normally on 8 cali33369db9781 [fe80::ecee:eeff:feee:eeee%7]:123 Sep 16 04:26:10.051567 ntpd[2190]: 16 Sep 04:26:10 ntpd[2190]: Listen normally on 9 cali47778079837 [fe80::ecee:eeff:feee:eeee%8]:123 Sep 16 04:26:10.051567 ntpd[2190]: 16 Sep 04:26:10 ntpd[2190]: Listen normally on 10 cali18c92766682 [fe80::ecee:eeff:feee:eeee%9]:123 Sep 16 04:26:10.051567 ntpd[2190]: 16 Sep 04:26:10 ntpd[2190]: Listen normally on 11 cali0a35d63087f [fe80::ecee:eeff:feee:eeee%10]:123 Sep 16 04:26:10.051567 ntpd[2190]: 16 Sep 04:26:10 ntpd[2190]: Listen normally on 12 calif20c748e23f [fe80::ecee:eeff:feee:eeee%11]:123 Sep 16 04:26:10.051567 ntpd[2190]: 16 Sep 04:26:10 ntpd[2190]: Listen normally on 13 calia56a664c1e1 [fe80::ecee:eeff:feee:eeee%12]:123 Sep 16 04:26:10.051567 ntpd[2190]: 16 Sep 04:26:10 ntpd[2190]: Listen normally on 14 cali8c24896254e [fe80::ecee:eeff:feee:eeee%13]:123 Sep 16 04:26:10.051567 ntpd[2190]: 16 Sep 04:26:10 ntpd[2190]: Listen normally on 15 cali85b6cd8f672 [fe80::ecee:eeff:feee:eeee%14]:123 Sep 16 04:26:10.049255 ntpd[2190]: Listen normally on 7 vxlan.calico [fe80::64b8:5bff:fea8:b59c%4]:123 Sep 16 04:26:10.050117 ntpd[2190]: Listen normally on 8 cali33369db9781 [fe80::ecee:eeff:feee:eeee%7]:123 Sep 16 04:26:10.050180 ntpd[2190]: Listen 
normally on 9 cali47778079837 [fe80::ecee:eeff:feee:eeee%8]:123 Sep 16 04:26:10.050227 ntpd[2190]: Listen normally on 10 cali18c92766682 [fe80::ecee:eeff:feee:eeee%9]:123 Sep 16 04:26:10.050272 ntpd[2190]: Listen normally on 11 cali0a35d63087f [fe80::ecee:eeff:feee:eeee%10]:123 Sep 16 04:26:10.050317 ntpd[2190]: Listen normally on 12 calif20c748e23f [fe80::ecee:eeff:feee:eeee%11]:123 Sep 16 04:26:10.050374 ntpd[2190]: Listen normally on 13 calia56a664c1e1 [fe80::ecee:eeff:feee:eeee%12]:123 Sep 16 04:26:10.050424 ntpd[2190]: Listen normally on 14 cali8c24896254e [fe80::ecee:eeff:feee:eeee%13]:123 Sep 16 04:26:10.050470 ntpd[2190]: Listen normally on 15 cali85b6cd8f672 [fe80::ecee:eeff:feee:eeee%14]:123 Sep 16 04:26:10.961977 kubelet[3544]: I0916 04:26:10.961901 3544 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:26:11.422621 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3139108640.mount: Deactivated successfully. Sep 16 04:26:11.477046 containerd[2008]: time="2025-09-16T04:26:11.476970623Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:11.480668 containerd[2008]: time="2025-09-16T04:26:11.480593699Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 16 04:26:11.482753 containerd[2008]: time="2025-09-16T04:26:11.482678507Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:11.491407 containerd[2008]: time="2025-09-16T04:26:11.491326607Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:11.494480 containerd[2008]: time="2025-09-16T04:26:11.494378327Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 2.770772942s" Sep 16 04:26:11.494480 containerd[2008]: time="2025-09-16T04:26:11.494450231Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 16 04:26:11.498964 containerd[2008]: time="2025-09-16T04:26:11.498512471Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 16 04:26:11.512125 containerd[2008]: time="2025-09-16T04:26:11.511967807Z" level=info msg="CreateContainer within sandbox \"8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 16 04:26:11.539640 containerd[2008]: time="2025-09-16T04:26:11.538396559Z" level=info msg="Container 21c43f51d9e98f5c48f22c7fed117526f634ca2fcd77e29c9c36e0f13df1c333: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:26:11.571137 containerd[2008]: time="2025-09-16T04:26:11.570829824Z" level=info msg="CreateContainer within sandbox \"8d25e3eae2200e4952eea05bbe06d1896ffd4c7f65923a4535a051352cd6abca\" for 
&ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"21c43f51d9e98f5c48f22c7fed117526f634ca2fcd77e29c9c36e0f13df1c333\"" Sep 16 04:26:11.574019 containerd[2008]: time="2025-09-16T04:26:11.573959004Z" level=info msg="StartContainer for \"21c43f51d9e98f5c48f22c7fed117526f634ca2fcd77e29c9c36e0f13df1c333\"" Sep 16 04:26:11.578210 containerd[2008]: time="2025-09-16T04:26:11.577995444Z" level=info msg="connecting to shim 21c43f51d9e98f5c48f22c7fed117526f634ca2fcd77e29c9c36e0f13df1c333" address="unix:///run/containerd/s/cb0a71c1f31d843e676616859d4469e1bc17fbc7ffa1d0a52e1713ab6499c5d6" protocol=ttrpc version=3 Sep 16 04:26:11.635455 systemd[1]: Started cri-containerd-21c43f51d9e98f5c48f22c7fed117526f634ca2fcd77e29c9c36e0f13df1c333.scope - libcontainer container 21c43f51d9e98f5c48f22c7fed117526f634ca2fcd77e29c9c36e0f13df1c333. Sep 16 04:26:11.784280 containerd[2008]: time="2025-09-16T04:26:11.784214377Z" level=info msg="StartContainer for \"21c43f51d9e98f5c48f22c7fed117526f634ca2fcd77e29c9c36e0f13df1c333\" returns successfully" Sep 16 04:26:11.972257 kubelet[3544]: I0916 04:26:11.972173 3544 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:26:11.999070 kubelet[3544]: I0916 04:26:11.998511 3544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-58c66ff65c-dznkv" podStartSLOduration=3.838066601 podStartE2EDuration="12.998454494s" podCreationTimestamp="2025-09-16 04:25:59 +0000 UTC" firstStartedPulling="2025-09-16 04:26:02.337394642 +0000 UTC m=+47.205038384" lastFinishedPulling="2025-09-16 04:26:11.497782511 +0000 UTC m=+56.365426277" observedRunningTime="2025-09-16 04:26:11.997010666 +0000 UTC m=+56.864654408" watchObservedRunningTime="2025-09-16 04:26:11.998454494 +0000 UTC m=+56.866098260" Sep 16 04:26:12.013236 systemd[1]: Started sshd@10-172.31.31.172:22-147.75.109.163:34452.service - OpenSSH per-connection server daemon (147.75.109.163:34452). Sep 16 04:26:12.266550 sshd[5733]: Accepted publickey for core from 147.75.109.163 port 34452 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:26:12.272706 sshd-session[5733]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:26:12.288212 systemd-logind[1971]: New session 11 of user core. Sep 16 04:26:12.298158 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 16 04:26:12.696381 sshd[5741]: Connection closed by 147.75.109.163 port 34452 Sep 16 04:26:12.697366 sshd-session[5733]: pam_unix(sshd:session): session closed for user core Sep 16 04:26:12.707244 systemd[1]: session-11.scope: Deactivated successfully. Sep 16 04:26:12.712151 systemd[1]: sshd@10-172.31.31.172:22-147.75.109.163:34452.service: Deactivated successfully. Sep 16 04:26:12.722598 systemd-logind[1971]: Session 11 logged out. Waiting for processes to exit. Sep 16 04:26:12.725776 systemd-logind[1971]: Removed session 11. 
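The kubelet pod_startup_latency_tracker entry above for whisker-58c66ff65c-dznkv reports podStartE2EDuration="12.998454494s" but podStartSLOduration=3.838066601. A minimal Go sketch of the arithmetic, assuming (as the numbers suggest) that the SLO figure is simply the end-to-end duration minus the image-pull window taken from the monotonic m=+ offsets in the same entry:

package main

import "fmt"

func main() {
	// Figures copied from the pod_startup_latency_tracker entry above for
	// calico-system/whisker-58c66ff65c-dznkv; the m=+ offsets are kubelet's
	// monotonic-clock readings.
	const (
		e2e                 = 12.998454494 // podStartE2EDuration, seconds
		firstStartedPulling = 47.205038384 // m=+47.205038384
		lastFinishedPulling = 56.365426277 // m=+56.365426277
	)
	pull := lastFinishedPulling - firstStartedPulling
	fmt.Printf("image pull window: %.9fs\n", pull)     // 9.160387893s
	fmt.Printf("e2e minus pull:    %.9fs\n", e2e-pull) // 3.838066601s == podStartSLOduration
}

The same relation reproduces the calico-apiserver-65cb54b74c-8d9jm figures earlier in the log: 38.983088468s minus a 5.354445194s pull window (m=+53.590704771 minus m=+48.236259577) gives 33.628643274.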
Sep 16 04:26:17.487978 containerd[2008]: time="2025-09-16T04:26:17.487902197Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:17.491934 containerd[2008]: time="2025-09-16T04:26:17.491866373Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 16 04:26:17.493802 containerd[2008]: time="2025-09-16T04:26:17.493609613Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:17.502810 containerd[2008]: time="2025-09-16T04:26:17.502334741Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:17.504418 containerd[2008]: time="2025-09-16T04:26:17.503584217Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 6.00408975s" Sep 16 04:26:17.504418 containerd[2008]: time="2025-09-16T04:26:17.503643341Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 16 04:26:17.506515 containerd[2008]: time="2025-09-16T04:26:17.506466785Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 16 04:26:17.558044 containerd[2008]: time="2025-09-16T04:26:17.557988797Z" level=info msg="CreateContainer within sandbox \"d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 16 04:26:17.575712 containerd[2008]: time="2025-09-16T04:26:17.575648621Z" level=info msg="Container 65299bd449ebfd81548bfb0d2e37175799606f0cdbf5f4de48e45a5a664b8331: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:26:17.602710 containerd[2008]: time="2025-09-16T04:26:17.602602590Z" level=info msg="CreateContainer within sandbox \"d6189fb41e4aee1894678de71ab72b64fabb9e35dbeb7de5016fab629d62c0ca\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"65299bd449ebfd81548bfb0d2e37175799606f0cdbf5f4de48e45a5a664b8331\"" Sep 16 04:26:17.606246 containerd[2008]: time="2025-09-16T04:26:17.606173682Z" level=info msg="StartContainer for \"65299bd449ebfd81548bfb0d2e37175799606f0cdbf5f4de48e45a5a664b8331\"" Sep 16 04:26:17.610880 containerd[2008]: time="2025-09-16T04:26:17.610646886Z" level=info msg="connecting to shim 65299bd449ebfd81548bfb0d2e37175799606f0cdbf5f4de48e45a5a664b8331" address="unix:///run/containerd/s/1256648111d8ce679a8109afdb6cd469e05954e9017d5e83e077e76fac705dfd" protocol=ttrpc version=3 Sep 16 04:26:17.689095 systemd[1]: Started cri-containerd-65299bd449ebfd81548bfb0d2e37175799606f0cdbf5f4de48e45a5a664b8331.scope - libcontainer container 65299bd449ebfd81548bfb0d2e37175799606f0cdbf5f4de48e45a5a664b8331. 
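The containerd entries above record active requests=0, bytes read=48134957 when the kube-controllers:v3.30.3 pull stops, and report the pull as completing in 6.00408975s. A rough throughput check, assuming "bytes read" is the data transferred for this pull:

package main

import "fmt"

func main() {
	// From the entries above: "bytes read=48134957" when the pull stops, and
	// "Pulled image ... in 6.00408975s" for ghcr.io/flatcar/calico/kube-controllers:v3.30.3.
	const (
		bytesRead = 48134957
		seconds   = 6.00408975
	)
	mib := float64(bytesRead) / (1024 * 1024)
	fmt.Printf("%.1f MiB in %.2fs ~ %.1f MiB/s\n", mib, seconds, mib/seconds)
}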
Sep 16 04:26:17.742410 systemd[1]: Started sshd@11-172.31.31.172:22-147.75.109.163:34460.service - OpenSSH per-connection server daemon (147.75.109.163:34460). Sep 16 04:26:17.949567 containerd[2008]: time="2025-09-16T04:26:17.948685855Z" level=info msg="StartContainer for \"65299bd449ebfd81548bfb0d2e37175799606f0cdbf5f4de48e45a5a664b8331\" returns successfully" Sep 16 04:26:17.998723 sshd[5795]: Accepted publickey for core from 147.75.109.163 port 34460 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:26:18.004025 sshd-session[5795]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:26:18.024713 systemd-logind[1971]: New session 12 of user core. Sep 16 04:26:18.029380 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 16 04:26:18.378892 sshd[5813]: Connection closed by 147.75.109.163 port 34460 Sep 16 04:26:18.380007 sshd-session[5795]: pam_unix(sshd:session): session closed for user core Sep 16 04:26:18.388795 systemd[1]: sshd@11-172.31.31.172:22-147.75.109.163:34460.service: Deactivated successfully. Sep 16 04:26:18.396182 systemd[1]: session-12.scope: Deactivated successfully. Sep 16 04:26:18.398838 systemd-logind[1971]: Session 12 logged out. Waiting for processes to exit. Sep 16 04:26:18.425430 systemd[1]: Started sshd@12-172.31.31.172:22-147.75.109.163:34470.service - OpenSSH per-connection server daemon (147.75.109.163:34470). Sep 16 04:26:18.430476 systemd-logind[1971]: Removed session 12. Sep 16 04:26:18.639912 sshd[5833]: Accepted publickey for core from 147.75.109.163 port 34470 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:26:18.642081 sshd-session[5833]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:26:18.650087 systemd-logind[1971]: New session 13 of user core. Sep 16 04:26:18.664033 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 16 04:26:19.102130 sshd[5836]: Connection closed by 147.75.109.163 port 34470 Sep 16 04:26:19.104034 sshd-session[5833]: pam_unix(sshd:session): session closed for user core Sep 16 04:26:19.121319 systemd-logind[1971]: Session 13 logged out. Waiting for processes to exit. Sep 16 04:26:19.122672 systemd[1]: sshd@12-172.31.31.172:22-147.75.109.163:34470.service: Deactivated successfully. Sep 16 04:26:19.130628 systemd[1]: session-13.scope: Deactivated successfully. Sep 16 04:26:19.168875 systemd-logind[1971]: Removed session 13. Sep 16 04:26:19.173848 systemd[1]: Started sshd@13-172.31.31.172:22-147.75.109.163:34474.service - OpenSSH per-connection server daemon (147.75.109.163:34474). 
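The journal above interleaves several short SSH sessions (for example, session 12 is opened at 04:26:18.029 and closed at 04:26:18.380). A small sketch that pairs systemd-logind's "New session N" / "Removed session N" events from a journal dump on stdin and prints each session's duration; the timestamp layout and the year are assumptions based on the prefix format used here:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"time"
)

// Matches the journal prefix plus systemd-logind session events, e.g.
// "Sep 16 04:26:18.029380 systemd-logind[1971]: New session 12 of user core."
var re = regexp.MustCompile(`(\w+ \d+ \d+:\d+:\d+\.\d+) systemd-logind\[\d+\]: (New|Removed) session (\d+)`)

func main() {
	opened := map[string]time.Time{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // entries in this dump are very long
	for sc.Scan() {
		for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
			// The prefix carries no year; 2025 is assumed from context.
			ts, err := time.Parse("Jan 2 15:04:05 2006", m[1]+" 2025")
			if err != nil {
				continue
			}
			if m[2] == "New" {
				opened[m[3]] = ts
			} else if start, ok := opened[m[3]]; ok {
				fmt.Printf("session %s lasted %s\n", m[3], ts.Sub(start))
			}
		}
	}
}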
Sep 16 04:26:19.230360 containerd[2008]: time="2025-09-16T04:26:19.229901082Z" level=info msg="TaskExit event in podsandbox handler container_id:\"65299bd449ebfd81548bfb0d2e37175799606f0cdbf5f4de48e45a5a664b8331\" id:\"b120eece43cf614026fac1b7210fca53aa8941000db94aded79eafdc45b6efdc\" pid:5854 exited_at:{seconds:1757996779 nanos:228024330}" Sep 16 04:26:19.274481 kubelet[3544]: I0916 04:26:19.274034 3544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-8586679c6f-w2bsg" podStartSLOduration=27.296061258 podStartE2EDuration="40.273984174s" podCreationTimestamp="2025-09-16 04:25:39 +0000 UTC" firstStartedPulling="2025-09-16 04:26:04.528063149 +0000 UTC m=+49.395706903" lastFinishedPulling="2025-09-16 04:26:17.505986077 +0000 UTC m=+62.373629819" observedRunningTime="2025-09-16 04:26:18.048323476 +0000 UTC m=+62.915967254" watchObservedRunningTime="2025-09-16 04:26:19.273984174 +0000 UTC m=+64.141627916" Sep 16 04:26:19.406977 sshd[5864]: Accepted publickey for core from 147.75.109.163 port 34474 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:26:19.410505 sshd-session[5864]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:26:19.422652 systemd-logind[1971]: New session 14 of user core. Sep 16 04:26:19.431124 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 16 04:26:19.719832 sshd[5870]: Connection closed by 147.75.109.163 port 34474 Sep 16 04:26:19.721500 sshd-session[5864]: pam_unix(sshd:session): session closed for user core Sep 16 04:26:19.731569 systemd[1]: sshd@13-172.31.31.172:22-147.75.109.163:34474.service: Deactivated successfully. Sep 16 04:26:19.738242 systemd[1]: session-14.scope: Deactivated successfully. Sep 16 04:26:19.742876 systemd-logind[1971]: Session 14 logged out. Waiting for processes to exit. Sep 16 04:26:19.745285 systemd-logind[1971]: Removed session 14. Sep 16 04:26:22.835035 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1489251253.mount: Deactivated successfully. 
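The TaskExit event above carries exited_at:{seconds:1757996779 nanos:228024330}. Converting that protobuf-style timestamp back to wall-clock time matches the journal prefix on the same entry (logged at 04:26:19.230), as this one-liner sketch shows:

package main

import (
	"fmt"
	"time"
)

func main() {
	// exited_at from the TaskExit event for container 65299bd4... above.
	t := time.Unix(1757996779, 228024330).UTC()
	fmt.Println(t.Format(time.RFC3339Nano)) // 2025-09-16T04:26:19.22802433Z
}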
Sep 16 04:26:23.578675 containerd[2008]: time="2025-09-16T04:26:23.578247371Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:23.580694 containerd[2008]: time="2025-09-16T04:26:23.580420487Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332" Sep 16 04:26:23.582595 containerd[2008]: time="2025-09-16T04:26:23.582519035Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:23.588862 containerd[2008]: time="2025-09-16T04:26:23.588795995Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:23.591404 containerd[2008]: time="2025-09-16T04:26:23.590440583Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 6.083167002s" Sep 16 04:26:23.591404 containerd[2008]: time="2025-09-16T04:26:23.590500067Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\"" Sep 16 04:26:23.593207 containerd[2008]: time="2025-09-16T04:26:23.593137451Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 16 04:26:23.602037 containerd[2008]: time="2025-09-16T04:26:23.601714187Z" level=info msg="CreateContainer within sandbox \"4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 16 04:26:23.620773 containerd[2008]: time="2025-09-16T04:26:23.617776763Z" level=info msg="Container d13dd767afcb09ac93ec4dd76277f9135133a3a19cde1cce80bdf8305d31176a: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:26:23.643156 containerd[2008]: time="2025-09-16T04:26:23.643074720Z" level=info msg="CreateContainer within sandbox \"4da2cfa88df3daf97b790b47e8da90f41dbb6472770b07a4846bf90b8b84fe7d\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"d13dd767afcb09ac93ec4dd76277f9135133a3a19cde1cce80bdf8305d31176a\"" Sep 16 04:26:23.646036 containerd[2008]: time="2025-09-16T04:26:23.645984240Z" level=info msg="StartContainer for \"d13dd767afcb09ac93ec4dd76277f9135133a3a19cde1cce80bdf8305d31176a\"" Sep 16 04:26:23.648774 containerd[2008]: time="2025-09-16T04:26:23.648640740Z" level=info msg="connecting to shim d13dd767afcb09ac93ec4dd76277f9135133a3a19cde1cce80bdf8305d31176a" address="unix:///run/containerd/s/4b27a8ccc7281b0c7df23f6f835c1bb6f7bd6839d3df80b130d486e22c707a1c" protocol=ttrpc version=3 Sep 16 04:26:23.700422 systemd[1]: Started cri-containerd-d13dd767afcb09ac93ec4dd76277f9135133a3a19cde1cce80bdf8305d31176a.scope - libcontainer container d13dd767afcb09ac93ec4dd76277f9135133a3a19cde1cce80bdf8305d31176a. 
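Each "connecting to shim" entry above names a per-sandbox unix socket under /run/containerd/s/ and states protocol=ttrpc version=3. Purely as an illustration, the sketch below opens such a socket (the path is copied from the goldmane entry above and exists only on that host); containerd then layers a ttrpc client on top of the raw connection:

package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Socket path copied from the "connecting to shim d13dd767..." entry above;
	// it exists only on the node that produced this journal.
	const addr = "/run/containerd/s/4b27a8ccc7281b0c7df23f6f835c1bb6f7bd6839d3df80b130d486e22c707a1c"

	conn, err := net.DialTimeout("unix", addr, 2*time.Second)
	if err != nil {
		fmt.Println("dial failed (expected anywhere but that host):", err)
		return
	}
	defer conn.Close()
	// The raw unix-socket connection is all this sketch establishes; the ttrpc
	// handshake and shim API calls are containerd's job.
	fmt.Println("connected to shim socket", addr)
}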
Sep 16 04:26:23.830969 containerd[2008]: time="2025-09-16T04:26:23.830592001Z" level=info msg="StartContainer for \"d13dd767afcb09ac93ec4dd76277f9135133a3a19cde1cce80bdf8305d31176a\" returns successfully" Sep 16 04:26:23.934550 containerd[2008]: time="2025-09-16T04:26:23.934414837Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:23.936606 containerd[2008]: time="2025-09-16T04:26:23.936511981Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 16 04:26:23.943712 containerd[2008]: time="2025-09-16T04:26:23.943626157Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 349.820426ms" Sep 16 04:26:23.943712 containerd[2008]: time="2025-09-16T04:26:23.943698589Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 16 04:26:23.948473 containerd[2008]: time="2025-09-16T04:26:23.948116833Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 16 04:26:23.954608 containerd[2008]: time="2025-09-16T04:26:23.954558109Z" level=info msg="CreateContainer within sandbox \"7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 16 04:26:23.977766 containerd[2008]: time="2025-09-16T04:26:23.975058273Z" level=info msg="Container 51eafbdbb079a4b84c15e37c6b6233a5e141b1a6702a4212ce9708028b975090: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:26:24.000863 containerd[2008]: time="2025-09-16T04:26:24.000689409Z" level=info msg="CreateContainer within sandbox \"7140d7e1f7491dc00f607fc659c7b4a2b554ed314965abadc9bad1761bcea36f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"51eafbdbb079a4b84c15e37c6b6233a5e141b1a6702a4212ce9708028b975090\"" Sep 16 04:26:24.002695 containerd[2008]: time="2025-09-16T04:26:24.002619453Z" level=info msg="StartContainer for \"51eafbdbb079a4b84c15e37c6b6233a5e141b1a6702a4212ce9708028b975090\"" Sep 16 04:26:24.009815 containerd[2008]: time="2025-09-16T04:26:24.009635361Z" level=info msg="connecting to shim 51eafbdbb079a4b84c15e37c6b6233a5e141b1a6702a4212ce9708028b975090" address="unix:///run/containerd/s/ab36b42161c22917ea68812d860bbac87b6f0c201fdbf93bec97c0a45237e432" protocol=ttrpc version=3 Sep 16 04:26:24.068225 systemd[1]: Started cri-containerd-51eafbdbb079a4b84c15e37c6b6233a5e141b1a6702a4212ce9708028b975090.scope - libcontainer container 51eafbdbb079a4b84c15e37c6b6233a5e141b1a6702a4212ce9708028b975090. 
Sep 16 04:26:24.096821 kubelet[3544]: I0916 04:26:24.096285 3544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-hlwrv" podStartSLOduration=27.946467392 podStartE2EDuration="46.096259486s" podCreationTimestamp="2025-09-16 04:25:38 +0000 UTC" firstStartedPulling="2025-09-16 04:26:05.442916333 +0000 UTC m=+50.310560087" lastFinishedPulling="2025-09-16 04:26:23.592708343 +0000 UTC m=+68.460352181" observedRunningTime="2025-09-16 04:26:24.089455342 +0000 UTC m=+68.957099108" watchObservedRunningTime="2025-09-16 04:26:24.096259486 +0000 UTC m=+68.963903228" Sep 16 04:26:24.332724 containerd[2008]: time="2025-09-16T04:26:24.332636567Z" level=info msg="StartContainer for \"51eafbdbb079a4b84c15e37c6b6233a5e141b1a6702a4212ce9708028b975090\" returns successfully" Sep 16 04:26:24.408275 containerd[2008]: time="2025-09-16T04:26:24.408127967Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d13dd767afcb09ac93ec4dd76277f9135133a3a19cde1cce80bdf8305d31176a\" id:\"6aa91db61952059007e16cf02f9239a9ae001bbd4e6b95dc0f3feaea69ef2cc8\" pid:5962 exit_status:1 exited_at:{seconds:1757996784 nanos:403649327}" Sep 16 04:26:24.763335 systemd[1]: Started sshd@14-172.31.31.172:22-147.75.109.163:58852.service - OpenSSH per-connection server daemon (147.75.109.163:58852). Sep 16 04:26:24.988324 sshd[6003]: Accepted publickey for core from 147.75.109.163 port 58852 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:26:24.993628 sshd-session[6003]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:26:25.010497 systemd-logind[1971]: New session 15 of user core. Sep 16 04:26:25.016292 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 16 04:26:25.118155 kubelet[3544]: I0916 04:26:25.118064 3544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-65cb54b74c-f9hzv" podStartSLOduration=36.365051659 podStartE2EDuration="54.118017035s" podCreationTimestamp="2025-09-16 04:25:31 +0000 UTC" firstStartedPulling="2025-09-16 04:26:06.192666917 +0000 UTC m=+51.060310683" lastFinishedPulling="2025-09-16 04:26:23.945632305 +0000 UTC m=+68.813276059" observedRunningTime="2025-09-16 04:26:25.116764655 +0000 UTC m=+69.984408493" watchObservedRunningTime="2025-09-16 04:26:25.118017035 +0000 UTC m=+69.985660861" Sep 16 04:26:25.474180 containerd[2008]: time="2025-09-16T04:26:25.474107845Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:25.479996 sshd[6010]: Connection closed by 147.75.109.163 port 58852 Sep 16 04:26:25.481302 sshd-session[6003]: pam_unix(sshd:session): session closed for user core Sep 16 04:26:25.491229 containerd[2008]: time="2025-09-16T04:26:25.489015433Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 16 04:26:25.497816 containerd[2008]: time="2025-09-16T04:26:25.497671825Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:25.498470 systemd[1]: sshd@14-172.31.31.172:22-147.75.109.163:58852.service: Deactivated successfully. Sep 16 04:26:25.509140 systemd[1]: session-15.scope: Deactivated successfully. Sep 16 04:26:25.515141 systemd-logind[1971]: Session 15 logged out. Waiting for processes to exit. 
Sep 16 04:26:25.524314 systemd-logind[1971]: Removed session 15. Sep 16 04:26:25.532566 containerd[2008]: time="2025-09-16T04:26:25.532487353Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:25.544092 containerd[2008]: time="2025-09-16T04:26:25.544020109Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.595832308s" Sep 16 04:26:25.544092 containerd[2008]: time="2025-09-16T04:26:25.544085245Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 16 04:26:25.557884 containerd[2008]: time="2025-09-16T04:26:25.557815609Z" level=info msg="CreateContainer within sandbox \"daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 16 04:26:25.597051 containerd[2008]: time="2025-09-16T04:26:25.596976397Z" level=info msg="Container 49029b388ba52053316477f2afe5ecf77c407d5ba32fe11297d093c9c3dc964c: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:26:25.632603 containerd[2008]: time="2025-09-16T04:26:25.632511985Z" level=info msg="CreateContainer within sandbox \"daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"49029b388ba52053316477f2afe5ecf77c407d5ba32fe11297d093c9c3dc964c\"" Sep 16 04:26:25.634242 containerd[2008]: time="2025-09-16T04:26:25.634176937Z" level=info msg="StartContainer for \"49029b388ba52053316477f2afe5ecf77c407d5ba32fe11297d093c9c3dc964c\"" Sep 16 04:26:25.637995 containerd[2008]: time="2025-09-16T04:26:25.637891382Z" level=info msg="connecting to shim 49029b388ba52053316477f2afe5ecf77c407d5ba32fe11297d093c9c3dc964c" address="unix:///run/containerd/s/1cac349f69a88d044b56da0854499220f6b4615498b23c01227e61af86898036" protocol=ttrpc version=3 Sep 16 04:26:25.751013 containerd[2008]: time="2025-09-16T04:26:25.750033242Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d13dd767afcb09ac93ec4dd76277f9135133a3a19cde1cce80bdf8305d31176a\" id:\"b70932fb0f497cef7be80f8d693d10dfe619fbbfce18eac4ece80171a5fda8f0\" pid:6032 exit_status:1 exited_at:{seconds:1757996785 nanos:741713834}" Sep 16 04:26:25.756042 systemd[1]: Started cri-containerd-49029b388ba52053316477f2afe5ecf77c407d5ba32fe11297d093c9c3dc964c.scope - libcontainer container 49029b388ba52053316477f2afe5ecf77c407d5ba32fe11297d093c9c3dc964c. 
Sep 16 04:26:25.919560 containerd[2008]: time="2025-09-16T04:26:25.919492551Z" level=info msg="StartContainer for \"49029b388ba52053316477f2afe5ecf77c407d5ba32fe11297d093c9c3dc964c\" returns successfully" Sep 16 04:26:25.936621 containerd[2008]: time="2025-09-16T04:26:25.936233691Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 16 04:26:26.276018 containerd[2008]: time="2025-09-16T04:26:26.275944885Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d13dd767afcb09ac93ec4dd76277f9135133a3a19cde1cce80bdf8305d31176a\" id:\"d0eabd6b6c56298b351a782e064eab2684ee3d348b71751f90701f2570594ddb\" pid:6091 exit_status:1 exited_at:{seconds:1757996786 nanos:275443465}" Sep 16 04:26:27.102527 kubelet[3544]: I0916 04:26:27.102393 3544 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:26:27.513514 containerd[2008]: time="2025-09-16T04:26:27.511590735Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:27.513514 containerd[2008]: time="2025-09-16T04:26:27.513396879Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208" Sep 16 04:26:27.515806 containerd[2008]: time="2025-09-16T04:26:27.515694243Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:27.526311 containerd[2008]: time="2025-09-16T04:26:27.526242579Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:26:27.529545 containerd[2008]: time="2025-09-16T04:26:27.529454175Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 1.593154136s" Sep 16 04:26:27.529885 containerd[2008]: time="2025-09-16T04:26:27.529710795Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\"" Sep 16 04:26:27.542629 containerd[2008]: time="2025-09-16T04:26:27.542571003Z" level=info msg="CreateContainer within sandbox \"daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 16 04:26:27.564102 containerd[2008]: time="2025-09-16T04:26:27.564023271Z" level=info msg="Container a87eda5d7bfa30fb7cd22232789e921eb82f5a19ff58eb4576dd082e5c060c5f: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:26:27.589857 containerd[2008]: time="2025-09-16T04:26:27.589775343Z" level=info msg="CreateContainer within sandbox \"daabc3f965b64f1af1eedd637fcc4f30a0916deb8fb58c5a7f396e8cea9d35e4\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"a87eda5d7bfa30fb7cd22232789e921eb82f5a19ff58eb4576dd082e5c060c5f\"" Sep 16 04:26:27.590908 containerd[2008]: time="2025-09-16T04:26:27.590837403Z" level=info 
msg="StartContainer for \"a87eda5d7bfa30fb7cd22232789e921eb82f5a19ff58eb4576dd082e5c060c5f\"" Sep 16 04:26:27.594630 containerd[2008]: time="2025-09-16T04:26:27.594569775Z" level=info msg="connecting to shim a87eda5d7bfa30fb7cd22232789e921eb82f5a19ff58eb4576dd082e5c060c5f" address="unix:///run/containerd/s/1cac349f69a88d044b56da0854499220f6b4615498b23c01227e61af86898036" protocol=ttrpc version=3 Sep 16 04:26:27.661094 systemd[1]: Started cri-containerd-a87eda5d7bfa30fb7cd22232789e921eb82f5a19ff58eb4576dd082e5c060c5f.scope - libcontainer container a87eda5d7bfa30fb7cd22232789e921eb82f5a19ff58eb4576dd082e5c060c5f. Sep 16 04:26:27.776289 containerd[2008]: time="2025-09-16T04:26:27.776224708Z" level=info msg="StartContainer for \"a87eda5d7bfa30fb7cd22232789e921eb82f5a19ff58eb4576dd082e5c060c5f\" returns successfully" Sep 16 04:26:28.692402 kubelet[3544]: I0916 04:26:28.692213 3544 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 16 04:26:28.692402 kubelet[3544]: I0916 04:26:28.692270 3544 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 16 04:26:30.519329 systemd[1]: Started sshd@15-172.31.31.172:22-147.75.109.163:39798.service - OpenSSH per-connection server daemon (147.75.109.163:39798). Sep 16 04:26:30.727708 sshd[6145]: Accepted publickey for core from 147.75.109.163 port 39798 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:26:30.730680 sshd-session[6145]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:26:30.739595 systemd-logind[1971]: New session 16 of user core. Sep 16 04:26:30.747059 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 16 04:26:31.039535 sshd[6148]: Connection closed by 147.75.109.163 port 39798 Sep 16 04:26:31.040127 sshd-session[6145]: pam_unix(sshd:session): session closed for user core Sep 16 04:26:31.048046 systemd[1]: sshd@15-172.31.31.172:22-147.75.109.163:39798.service: Deactivated successfully. Sep 16 04:26:31.054441 systemd[1]: session-16.scope: Deactivated successfully. Sep 16 04:26:31.058884 systemd-logind[1971]: Session 16 logged out. Waiting for processes to exit. Sep 16 04:26:31.063110 systemd-logind[1971]: Removed session 16. Sep 16 04:26:32.033909 containerd[2008]: time="2025-09-16T04:26:32.033844505Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4c2dd1ffe07ff502bbf2bca1537755ce4be947d9ad40591d742928f2e2da2cd6\" id:\"b0d82c31c440e0bc540ce4ef59a75092980068ea4f7539f754880c3848d5c217\" pid:6172 exited_at:{seconds:1757996792 nanos:33035945}" Sep 16 04:26:32.091534 kubelet[3544]: I0916 04:26:32.091390 3544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-s2zqb" podStartSLOduration=32.704879725 podStartE2EDuration="53.085238538s" podCreationTimestamp="2025-09-16 04:25:39 +0000 UTC" firstStartedPulling="2025-09-16 04:26:07.152362782 +0000 UTC m=+52.020006524" lastFinishedPulling="2025-09-16 04:26:27.532721595 +0000 UTC m=+72.400365337" observedRunningTime="2025-09-16 04:26:28.137230862 +0000 UTC m=+73.004874604" watchObservedRunningTime="2025-09-16 04:26:32.085238538 +0000 UTC m=+76.952882292" Sep 16 04:26:36.077863 systemd[1]: Started sshd@16-172.31.31.172:22-147.75.109.163:39802.service - OpenSSH per-connection server daemon (147.75.109.163:39802). 
Sep 16 04:26:36.310181 sshd[6189]: Accepted publickey for core from 147.75.109.163 port 39802 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:26:36.312155 sshd-session[6189]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:26:36.325298 systemd-logind[1971]: New session 17 of user core. Sep 16 04:26:36.332131 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 16 04:26:36.759312 sshd[6192]: Connection closed by 147.75.109.163 port 39802 Sep 16 04:26:36.760436 sshd-session[6189]: pam_unix(sshd:session): session closed for user core Sep 16 04:26:36.774552 systemd[1]: sshd@16-172.31.31.172:22-147.75.109.163:39802.service: Deactivated successfully. Sep 16 04:26:36.782726 systemd[1]: session-17.scope: Deactivated successfully. Sep 16 04:26:36.790792 systemd-logind[1971]: Session 17 logged out. Waiting for processes to exit. Sep 16 04:26:36.794004 systemd-logind[1971]: Removed session 17. Sep 16 04:26:37.401131 containerd[2008]: time="2025-09-16T04:26:37.400944936Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d13dd767afcb09ac93ec4dd76277f9135133a3a19cde1cce80bdf8305d31176a\" id:\"7b62fcbe3ad229ec2d52599c6a6ab45e180105f26b8ae75d76c0f4663d65c172\" pid:6215 exited_at:{seconds:1757996797 nanos:400188480}" Sep 16 04:26:41.798887 systemd[1]: Started sshd@17-172.31.31.172:22-147.75.109.163:40766.service - OpenSSH per-connection server daemon (147.75.109.163:40766). Sep 16 04:26:42.026029 sshd[6227]: Accepted publickey for core from 147.75.109.163 port 40766 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:26:42.028371 sshd-session[6227]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:26:42.044432 systemd-logind[1971]: New session 18 of user core. Sep 16 04:26:42.050189 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 16 04:26:42.335279 sshd[6230]: Connection closed by 147.75.109.163 port 40766 Sep 16 04:26:42.336592 sshd-session[6227]: pam_unix(sshd:session): session closed for user core Sep 16 04:26:42.344820 systemd[1]: sshd@17-172.31.31.172:22-147.75.109.163:40766.service: Deactivated successfully. Sep 16 04:26:42.348985 systemd[1]: session-18.scope: Deactivated successfully. Sep 16 04:26:42.353488 systemd-logind[1971]: Session 18 logged out. Waiting for processes to exit. Sep 16 04:26:42.357001 systemd-logind[1971]: Removed session 18. Sep 16 04:26:42.371625 systemd[1]: Started sshd@18-172.31.31.172:22-147.75.109.163:40768.service - OpenSSH per-connection server daemon (147.75.109.163:40768). Sep 16 04:26:42.577720 sshd[6242]: Accepted publickey for core from 147.75.109.163 port 40768 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:26:42.580120 sshd-session[6242]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:26:42.588842 systemd-logind[1971]: New session 19 of user core. Sep 16 04:26:42.597061 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 16 04:26:43.226869 sshd[6251]: Connection closed by 147.75.109.163 port 40768 Sep 16 04:26:43.227830 sshd-session[6242]: pam_unix(sshd:session): session closed for user core Sep 16 04:26:43.237781 systemd[1]: sshd@18-172.31.31.172:22-147.75.109.163:40768.service: Deactivated successfully. Sep 16 04:26:43.243009 systemd[1]: session-19.scope: Deactivated successfully. Sep 16 04:26:43.245059 systemd-logind[1971]: Session 19 logged out. Waiting for processes to exit. 
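TaskExit events for the goldmane container d13dd767... keep recurring above with fresh exec ids (04:26:24, 04:26:25, 04:26:26, 04:26:37 and later), which is consistent with periodic exec probes finishing rather than the container itself dying. A sketch that extracts the exited_at seconds for that one container id and prints the gaps between events:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"strconv"
)

// Container id prefix d13dd767 is the goldmane container from the entries above.
var re = regexp.MustCompile(`TaskExit event in podsandbox handler container_id:\\?"(d13dd767[0-9a-f]*)\\?".*?exited_at:\{seconds:(\d+)`)

func main() {
	var prev int64
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for sc.Scan() {
		for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
			s, _ := strconv.ParseInt(m[2], 10, 64)
			if prev != 0 {
				fmt.Printf("%s...: %ds since the previous TaskExit\n", m[1][:8], s-prev)
			}
			prev = s
		}
	}
}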
Sep 16 04:26:43.261771 systemd-logind[1971]: Removed session 19. Sep 16 04:26:43.265219 systemd[1]: Started sshd@19-172.31.31.172:22-147.75.109.163:40774.service - OpenSSH per-connection server daemon (147.75.109.163:40774). Sep 16 04:26:43.464954 sshd[6261]: Accepted publickey for core from 147.75.109.163 port 40774 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:26:43.466994 sshd-session[6261]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:26:43.476187 systemd-logind[1971]: New session 20 of user core. Sep 16 04:26:43.484022 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 16 04:26:44.787058 sshd[6264]: Connection closed by 147.75.109.163 port 40774 Sep 16 04:26:44.788332 sshd-session[6261]: pam_unix(sshd:session): session closed for user core Sep 16 04:26:44.798672 systemd[1]: sshd@19-172.31.31.172:22-147.75.109.163:40774.service: Deactivated successfully. Sep 16 04:26:44.809424 systemd[1]: session-20.scope: Deactivated successfully. Sep 16 04:26:44.812729 systemd-logind[1971]: Session 20 logged out. Waiting for processes to exit. Sep 16 04:26:44.836249 systemd[1]: Started sshd@20-172.31.31.172:22-147.75.109.163:40780.service - OpenSSH per-connection server daemon (147.75.109.163:40780). Sep 16 04:26:44.840335 systemd-logind[1971]: Removed session 20. Sep 16 04:26:45.049963 sshd[6283]: Accepted publickey for core from 147.75.109.163 port 40780 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:26:45.052989 sshd-session[6283]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:26:45.061059 systemd-logind[1971]: New session 21 of user core. Sep 16 04:26:45.069238 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 16 04:26:45.635309 sshd[6288]: Connection closed by 147.75.109.163 port 40780 Sep 16 04:26:45.636660 sshd-session[6283]: pam_unix(sshd:session): session closed for user core Sep 16 04:26:45.647419 systemd[1]: sshd@20-172.31.31.172:22-147.75.109.163:40780.service: Deactivated successfully. Sep 16 04:26:45.651349 systemd[1]: session-21.scope: Deactivated successfully. Sep 16 04:26:45.653826 systemd-logind[1971]: Session 21 logged out. Waiting for processes to exit. Sep 16 04:26:45.657711 systemd-logind[1971]: Removed session 21. Sep 16 04:26:45.683652 systemd[1]: Started sshd@21-172.31.31.172:22-147.75.109.163:40790.service - OpenSSH per-connection server daemon (147.75.109.163:40790). Sep 16 04:26:45.894375 sshd[6298]: Accepted publickey for core from 147.75.109.163 port 40790 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:26:45.898597 sshd-session[6298]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:26:45.907295 systemd-logind[1971]: New session 22 of user core. Sep 16 04:26:45.913996 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 16 04:26:46.160562 sshd[6301]: Connection closed by 147.75.109.163 port 40790 Sep 16 04:26:46.159655 sshd-session[6298]: pam_unix(sshd:session): session closed for user core Sep 16 04:26:46.167407 systemd-logind[1971]: Session 22 logged out. Waiting for processes to exit. Sep 16 04:26:46.169014 systemd[1]: sshd@21-172.31.31.172:22-147.75.109.163:40790.service: Deactivated successfully. Sep 16 04:26:46.174979 systemd[1]: session-22.scope: Deactivated successfully. Sep 16 04:26:46.182892 systemd-logind[1971]: Removed session 22. 
Sep 16 04:26:49.074067 containerd[2008]: time="2025-09-16T04:26:49.073994746Z" level=info msg="TaskExit event in podsandbox handler container_id:\"65299bd449ebfd81548bfb0d2e37175799606f0cdbf5f4de48e45a5a664b8331\" id:\"05886cc85447c5f7111e1ee41e122b5c3cea51db791f4407962b1f580d76ff36\" pid:6327 exited_at:{seconds:1757996809 nanos:73499722}" Sep 16 04:26:51.202185 systemd[1]: Started sshd@22-172.31.31.172:22-147.75.109.163:34798.service - OpenSSH per-connection server daemon (147.75.109.163:34798). Sep 16 04:26:51.399030 sshd[6339]: Accepted publickey for core from 147.75.109.163 port 34798 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:26:51.401465 sshd-session[6339]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:26:51.411056 systemd-logind[1971]: New session 23 of user core. Sep 16 04:26:51.418038 systemd[1]: Started session-23.scope - Session 23 of User core. Sep 16 04:26:51.678166 sshd[6342]: Connection closed by 147.75.109.163 port 34798 Sep 16 04:26:51.678045 sshd-session[6339]: pam_unix(sshd:session): session closed for user core Sep 16 04:26:51.685417 systemd[1]: sshd@22-172.31.31.172:22-147.75.109.163:34798.service: Deactivated successfully. Sep 16 04:26:51.691085 systemd[1]: session-23.scope: Deactivated successfully. Sep 16 04:26:51.693875 systemd-logind[1971]: Session 23 logged out. Waiting for processes to exit. Sep 16 04:26:51.697516 systemd-logind[1971]: Removed session 23. Sep 16 04:26:56.224990 containerd[2008]: time="2025-09-16T04:26:56.224691317Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d13dd767afcb09ac93ec4dd76277f9135133a3a19cde1cce80bdf8305d31176a\" id:\"5ff833b07d5c565064a284e73aeff9ff40ebe8788f3da02b4d043ac981689d40\" pid:6367 exited_at:{seconds:1757996816 nanos:223090349}" Sep 16 04:26:56.721344 systemd[1]: Started sshd@23-172.31.31.172:22-147.75.109.163:34814.service - OpenSSH per-connection server daemon (147.75.109.163:34814). Sep 16 04:26:56.941939 sshd[6381]: Accepted publickey for core from 147.75.109.163 port 34814 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:26:56.944465 sshd-session[6381]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:26:56.952951 systemd-logind[1971]: New session 24 of user core. Sep 16 04:26:56.964069 systemd[1]: Started session-24.scope - Session 24 of User core. Sep 16 04:26:57.269388 sshd[6384]: Connection closed by 147.75.109.163 port 34814 Sep 16 04:26:57.271372 sshd-session[6381]: pam_unix(sshd:session): session closed for user core Sep 16 04:26:57.280220 systemd[1]: sshd@23-172.31.31.172:22-147.75.109.163:34814.service: Deactivated successfully. Sep 16 04:26:57.289023 systemd[1]: session-24.scope: Deactivated successfully. Sep 16 04:26:57.291057 systemd-logind[1971]: Session 24 logged out. Waiting for processes to exit. Sep 16 04:26:57.295652 systemd-logind[1971]: Removed session 24. Sep 16 04:27:02.134197 containerd[2008]: time="2025-09-16T04:27:02.134128331Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4c2dd1ffe07ff502bbf2bca1537755ce4be947d9ad40591d742928f2e2da2cd6\" id:\"65910366354dc644b0aeaccdaf4e08f73b97d201f322ac7b19618278afe954fe\" pid:6406 exited_at:{seconds:1757996822 nanos:133510619}" Sep 16 04:27:02.313257 systemd[1]: Started sshd@24-172.31.31.172:22-147.75.109.163:59428.service - OpenSSH per-connection server daemon (147.75.109.163:59428). 
Sep 16 04:27:02.529232 sshd[6419]: Accepted publickey for core from 147.75.109.163 port 59428 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:27:02.532218 sshd-session[6419]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:27:02.543131 systemd-logind[1971]: New session 25 of user core. Sep 16 04:27:02.549104 systemd[1]: Started session-25.scope - Session 25 of User core. Sep 16 04:27:02.862597 sshd[6422]: Connection closed by 147.75.109.163 port 59428 Sep 16 04:27:02.866089 sshd-session[6419]: pam_unix(sshd:session): session closed for user core Sep 16 04:27:02.876013 systemd[1]: sshd@24-172.31.31.172:22-147.75.109.163:59428.service: Deactivated successfully. Sep 16 04:27:02.884152 systemd[1]: session-25.scope: Deactivated successfully. Sep 16 04:27:02.896691 systemd-logind[1971]: Session 25 logged out. Waiting for processes to exit. Sep 16 04:27:02.900687 systemd-logind[1971]: Removed session 25. Sep 16 04:27:07.905305 systemd[1]: Started sshd@25-172.31.31.172:22-147.75.109.163:59444.service - OpenSSH per-connection server daemon (147.75.109.163:59444). Sep 16 04:27:08.113686 sshd[6433]: Accepted publickey for core from 147.75.109.163 port 59444 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:27:08.116146 sshd-session[6433]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:27:08.129006 systemd-logind[1971]: New session 26 of user core. Sep 16 04:27:08.141000 systemd[1]: Started session-26.scope - Session 26 of User core. Sep 16 04:27:08.462310 sshd[6436]: Connection closed by 147.75.109.163 port 59444 Sep 16 04:27:08.457440 sshd-session[6433]: pam_unix(sshd:session): session closed for user core Sep 16 04:27:08.473367 systemd-logind[1971]: Session 26 logged out. Waiting for processes to exit. Sep 16 04:27:08.475159 systemd[1]: sshd@25-172.31.31.172:22-147.75.109.163:59444.service: Deactivated successfully. Sep 16 04:27:08.480923 systemd[1]: session-26.scope: Deactivated successfully. Sep 16 04:27:08.488869 systemd-logind[1971]: Removed session 26. Sep 16 04:27:13.501532 systemd[1]: Started sshd@26-172.31.31.172:22-147.75.109.163:34536.service - OpenSSH per-connection server daemon (147.75.109.163:34536). Sep 16 04:27:13.610866 containerd[2008]: time="2025-09-16T04:27:13.610811796Z" level=info msg="TaskExit event in podsandbox handler container_id:\"65299bd449ebfd81548bfb0d2e37175799606f0cdbf5f4de48e45a5a664b8331\" id:\"e796895fc2e08e873859ef41ced1f59b8baa51da4518b2ce87b9bce63ef769c5\" pid:6462 exited_at:{seconds:1757996833 nanos:610352412}" Sep 16 04:27:13.729798 sshd[6458]: Accepted publickey for core from 147.75.109.163 port 34536 ssh2: RSA SHA256:Y5Y+ZnWP3WgTOoCj5PzODGK+AIEm1llgm/zgzIkJCqk Sep 16 04:27:13.734152 sshd-session[6458]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:27:13.749280 systemd-logind[1971]: New session 27 of user core. Sep 16 04:27:13.756047 systemd[1]: Started session-27.scope - Session 27 of User core. Sep 16 04:27:14.060410 sshd[6473]: Connection closed by 147.75.109.163 port 34536 Sep 16 04:27:14.059423 sshd-session[6458]: pam_unix(sshd:session): session closed for user core Sep 16 04:27:14.067578 systemd[1]: sshd@26-172.31.31.172:22-147.75.109.163:34536.service: Deactivated successfully. Sep 16 04:27:14.075659 systemd[1]: session-27.scope: Deactivated successfully. Sep 16 04:27:14.079675 systemd-logind[1971]: Session 27 logged out. Waiting for processes to exit. 
Sep 16 04:27:14.085886 systemd-logind[1971]: Removed session 27. Sep 16 04:27:19.075257 containerd[2008]: time="2025-09-16T04:27:19.075156603Z" level=info msg="TaskExit event in podsandbox handler container_id:\"65299bd449ebfd81548bfb0d2e37175799606f0cdbf5f4de48e45a5a664b8331\" id:\"1108f5e70df57e4352942a7ae550d49f9bf73b9b5dcc010eb030321a809664d3\" pid:6500 exited_at:{seconds:1757996839 nanos:74551263}" Sep 16 04:27:26.230185 containerd[2008]: time="2025-09-16T04:27:26.230088550Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d13dd767afcb09ac93ec4dd76277f9135133a3a19cde1cce80bdf8305d31176a\" id:\"bf3214c0c079983520765af4a7cfca22e8c7a8d1309bd352dc7e9e40c55e399f\" pid:6528 exited_at:{seconds:1757996846 nanos:229645282}" Sep 16 04:27:27.434441 systemd[1]: cri-containerd-8a6b4fd4732eb63bebe8c6c6da2edca8ccb147f38f6f9081c2b2a92124c0ef9e.scope: Deactivated successfully. Sep 16 04:27:27.435408 systemd[1]: cri-containerd-8a6b4fd4732eb63bebe8c6c6da2edca8ccb147f38f6f9081c2b2a92124c0ef9e.scope: Consumed 8.514s CPU time, 61.6M memory peak, 192K read from disk. Sep 16 04:27:27.445132 containerd[2008]: time="2025-09-16T04:27:27.444955033Z" level=info msg="received exit event container_id:\"8a6b4fd4732eb63bebe8c6c6da2edca8ccb147f38f6f9081c2b2a92124c0ef9e\" id:\"8a6b4fd4732eb63bebe8c6c6da2edca8ccb147f38f6f9081c2b2a92124c0ef9e\" pid:3393 exit_status:1 exited_at:{seconds:1757996847 nanos:442051980}" Sep 16 04:27:27.446601 containerd[2008]: time="2025-09-16T04:27:27.446284357Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8a6b4fd4732eb63bebe8c6c6da2edca8ccb147f38f6f9081c2b2a92124c0ef9e\" id:\"8a6b4fd4732eb63bebe8c6c6da2edca8ccb147f38f6f9081c2b2a92124c0ef9e\" pid:3393 exit_status:1 exited_at:{seconds:1757996847 nanos:442051980}" Sep 16 04:27:27.505342 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8a6b4fd4732eb63bebe8c6c6da2edca8ccb147f38f6f9081c2b2a92124c0ef9e-rootfs.mount: Deactivated successfully. Sep 16 04:27:27.804542 systemd[1]: cri-containerd-16b2e49ffe2c8037a6d13c5e9ed195e54cb6c4114ae85344985e0beb2608971c.scope: Deactivated successfully. Sep 16 04:27:27.805868 systemd[1]: cri-containerd-16b2e49ffe2c8037a6d13c5e9ed195e54cb6c4114ae85344985e0beb2608971c.scope: Consumed 18.622s CPU time, 100.5M memory peak, 416K read from disk. Sep 16 04:27:27.810219 containerd[2008]: time="2025-09-16T04:27:27.810025838Z" level=info msg="received exit event container_id:\"16b2e49ffe2c8037a6d13c5e9ed195e54cb6c4114ae85344985e0beb2608971c\" id:\"16b2e49ffe2c8037a6d13c5e9ed195e54cb6c4114ae85344985e0beb2608971c\" pid:3874 exit_status:1 exited_at:{seconds:1757996847 nanos:809319446}" Sep 16 04:27:27.810219 containerd[2008]: time="2025-09-16T04:27:27.810160910Z" level=info msg="TaskExit event in podsandbox handler container_id:\"16b2e49ffe2c8037a6d13c5e9ed195e54cb6c4114ae85344985e0beb2608971c\" id:\"16b2e49ffe2c8037a6d13c5e9ed195e54cb6c4114ae85344985e0beb2608971c\" pid:3874 exit_status:1 exited_at:{seconds:1757996847 nanos:809319446}" Sep 16 04:27:27.852576 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-16b2e49ffe2c8037a6d13c5e9ed195e54cb6c4114ae85344985e0beb2608971c-rootfs.mount: Deactivated successfully. 
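When the cri-containerd scopes above are deactivated, systemd prints its accumulated accounting for each unit (8.514s CPU and a 61.6M memory peak for 8a6b4fd4..., 18.622s CPU and 100.5M for 16b2e49f...). A sketch that pulls those figures out of a journal dump; the unit-name and field patterns are taken from the lines above:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches systemd's accounting summary for the CRI scopes above, e.g.
// "cri-containerd-<id>.scope: Consumed 8.514s CPU time, 61.6M memory peak, 192K read from disk."
var re = regexp.MustCompile(`(cri-containerd-[0-9a-f]+\.scope): Consumed ([\d.]+s) CPU time, ([\d.]+[KMG]) memory peak, ([\d.]+[KMG]) read from disk`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for sc.Scan() {
		for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
			fmt.Printf("%s cpu=%s mem_peak=%s disk_read=%s\n", m[1], m[2], m[3], m[4])
		}
	}
}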
Sep 16 04:27:28.342457 kubelet[3544]: E0916 04:27:28.342378 3544 controller.go:195] "Failed to update lease" err="Put \"https://172.31.31.172:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-31-172?timeout=10s\": context deadline exceeded" Sep 16 04:27:28.350829 kubelet[3544]: I0916 04:27:28.350731 3544 scope.go:117] "RemoveContainer" containerID="16b2e49ffe2c8037a6d13c5e9ed195e54cb6c4114ae85344985e0beb2608971c" Sep 16 04:27:28.356673 kubelet[3544]: I0916 04:27:28.355563 3544 scope.go:117] "RemoveContainer" containerID="8a6b4fd4732eb63bebe8c6c6da2edca8ccb147f38f6f9081c2b2a92124c0ef9e" Sep 16 04:27:28.358404 containerd[2008]: time="2025-09-16T04:27:28.358353589Z" level=info msg="CreateContainer within sandbox \"5f5f4ca5b7d764ecf0fe27a9c6eca9d9e7de565bd3d6f2041e55dab911c11efa\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Sep 16 04:27:28.361359 containerd[2008]: time="2025-09-16T04:27:28.361245961Z" level=info msg="CreateContainer within sandbox \"9b8882c53705d66e47c29253ba7e8401054a2d033a952d2ee5514b48dbcf97b3\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Sep 16 04:27:28.387129 containerd[2008]: time="2025-09-16T04:27:28.387075229Z" level=info msg="Container ae0fcd4ca04e5a9bb2522610668cf2ca92db1caa200311f43a1da000fe5678ad: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:27:28.398282 containerd[2008]: time="2025-09-16T04:27:28.398202625Z" level=info msg="Container bdd8e9ea37ec680980efe274a4a47440580b8c4e28cea0431cf46c4c3cae6084: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:27:28.409413 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount960508505.mount: Deactivated successfully. Sep 16 04:27:28.413914 containerd[2008]: time="2025-09-16T04:27:28.413557849Z" level=info msg="CreateContainer within sandbox \"5f5f4ca5b7d764ecf0fe27a9c6eca9d9e7de565bd3d6f2041e55dab911c11efa\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"ae0fcd4ca04e5a9bb2522610668cf2ca92db1caa200311f43a1da000fe5678ad\"" Sep 16 04:27:28.415797 containerd[2008]: time="2025-09-16T04:27:28.414817501Z" level=info msg="StartContainer for \"ae0fcd4ca04e5a9bb2522610668cf2ca92db1caa200311f43a1da000fe5678ad\"" Sep 16 04:27:28.416631 containerd[2008]: time="2025-09-16T04:27:28.416568205Z" level=info msg="connecting to shim ae0fcd4ca04e5a9bb2522610668cf2ca92db1caa200311f43a1da000fe5678ad" address="unix:///run/containerd/s/af8d23c113e79a050feba4a79fb00c9f170848f6bcd9f57921b49369f76ab553" protocol=ttrpc version=3 Sep 16 04:27:28.427426 containerd[2008]: time="2025-09-16T04:27:28.427350733Z" level=info msg="CreateContainer within sandbox \"9b8882c53705d66e47c29253ba7e8401054a2d033a952d2ee5514b48dbcf97b3\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"bdd8e9ea37ec680980efe274a4a47440580b8c4e28cea0431cf46c4c3cae6084\"" Sep 16 04:27:28.429294 containerd[2008]: time="2025-09-16T04:27:28.429193561Z" level=info msg="StartContainer for \"bdd8e9ea37ec680980efe274a4a47440580b8c4e28cea0431cf46c4c3cae6084\"" Sep 16 04:27:28.438882 containerd[2008]: time="2025-09-16T04:27:28.438780745Z" level=info msg="connecting to shim bdd8e9ea37ec680980efe274a4a47440580b8c4e28cea0431cf46c4c3cae6084" address="unix:///run/containerd/s/a38f1828e3eeb593f29a18d1bc2607f5f301360f19dab1e88971fc9dc9e3e7ca" protocol=ttrpc version=3 Sep 16 04:27:28.480062 systemd[1]: Started cri-containerd-ae0fcd4ca04e5a9bb2522610668cf2ca92db1caa200311f43a1da000fe5678ad.scope - libcontainer container 
ae0fcd4ca04e5a9bb2522610668cf2ca92db1caa200311f43a1da000fe5678ad. Sep 16 04:27:28.506037 systemd[1]: Started cri-containerd-bdd8e9ea37ec680980efe274a4a47440580b8c4e28cea0431cf46c4c3cae6084.scope - libcontainer container bdd8e9ea37ec680980efe274a4a47440580b8c4e28cea0431cf46c4c3cae6084. Sep 16 04:27:28.609918 containerd[2008]: time="2025-09-16T04:27:28.608506682Z" level=info msg="StartContainer for \"ae0fcd4ca04e5a9bb2522610668cf2ca92db1caa200311f43a1da000fe5678ad\" returns successfully" Sep 16 04:27:28.640493 containerd[2008]: time="2025-09-16T04:27:28.640423334Z" level=info msg="StartContainer for \"bdd8e9ea37ec680980efe274a4a47440580b8c4e28cea0431cf46c4c3cae6084\" returns successfully" Sep 16 04:27:31.912538 containerd[2008]: time="2025-09-16T04:27:31.912473443Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4c2dd1ffe07ff502bbf2bca1537755ce4be947d9ad40591d742928f2e2da2cd6\" id:\"544f049d63aa1ea0f2db1aad3769e63d7c7c2137544e86f0742b311e7fb2bd52\" pid:6640 exited_at:{seconds:1757996851 nanos:911909947}" Sep 16 04:27:33.387824 systemd[1]: cri-containerd-c44b00e7d7f4c1fbe2d32dce99ea8966a3d1fa289454fa74b81f6dafa81d91f9.scope: Deactivated successfully. Sep 16 04:27:33.389229 systemd[1]: cri-containerd-c44b00e7d7f4c1fbe2d32dce99ea8966a3d1fa289454fa74b81f6dafa81d91f9.scope: Consumed 7.247s CPU time, 23M memory peak, 128K read from disk. Sep 16 04:27:33.395465 containerd[2008]: time="2025-09-16T04:27:33.395405682Z" level=info msg="received exit event container_id:\"c44b00e7d7f4c1fbe2d32dce99ea8966a3d1fa289454fa74b81f6dafa81d91f9\" id:\"c44b00e7d7f4c1fbe2d32dce99ea8966a3d1fa289454fa74b81f6dafa81d91f9\" pid:3378 exit_status:1 exited_at:{seconds:1757996853 nanos:395028594}" Sep 16 04:27:33.396495 containerd[2008]: time="2025-09-16T04:27:33.396446490Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c44b00e7d7f4c1fbe2d32dce99ea8966a3d1fa289454fa74b81f6dafa81d91f9\" id:\"c44b00e7d7f4c1fbe2d32dce99ea8966a3d1fa289454fa74b81f6dafa81d91f9\" pid:3378 exit_status:1 exited_at:{seconds:1757996853 nanos:395028594}" Sep 16 04:27:33.440507 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c44b00e7d7f4c1fbe2d32dce99ea8966a3d1fa289454fa74b81f6dafa81d91f9-rootfs.mount: Deactivated successfully. Sep 16 04:27:34.400669 kubelet[3544]: I0916 04:27:34.400254 3544 scope.go:117] "RemoveContainer" containerID="c44b00e7d7f4c1fbe2d32dce99ea8966a3d1fa289454fa74b81f6dafa81d91f9" Sep 16 04:27:34.405209 containerd[2008]: time="2025-09-16T04:27:34.405153595Z" level=info msg="CreateContainer within sandbox \"fb35632220b685535bb51d3796431eb1a6803bb76a596774cfd7e6a1b369505e\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Sep 16 04:27:34.421792 containerd[2008]: time="2025-09-16T04:27:34.421033303Z" level=info msg="Container 4381e11a2a248a3d17ab132417f7b7d0fb1eb89901c8636de4ed1637c42acc61: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:27:34.431792 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2593580963.mount: Deactivated successfully. 
Sep 16 04:27:34.440137 containerd[2008]: time="2025-09-16T04:27:34.440059147Z" level=info msg="CreateContainer within sandbox \"fb35632220b685535bb51d3796431eb1a6803bb76a596774cfd7e6a1b369505e\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"4381e11a2a248a3d17ab132417f7b7d0fb1eb89901c8636de4ed1637c42acc61\"" Sep 16 04:27:34.440896 containerd[2008]: time="2025-09-16T04:27:34.440819623Z" level=info msg="StartContainer for \"4381e11a2a248a3d17ab132417f7b7d0fb1eb89901c8636de4ed1637c42acc61\"" Sep 16 04:27:34.443182 containerd[2008]: time="2025-09-16T04:27:34.443065003Z" level=info msg="connecting to shim 4381e11a2a248a3d17ab132417f7b7d0fb1eb89901c8636de4ed1637c42acc61" address="unix:///run/containerd/s/9a0a14fe64a51a073fc86104af322c224b6f6fdc553e6164700458c674a19cb4" protocol=ttrpc version=3 Sep 16 04:27:34.487045 systemd[1]: Started cri-containerd-4381e11a2a248a3d17ab132417f7b7d0fb1eb89901c8636de4ed1637c42acc61.scope - libcontainer container 4381e11a2a248a3d17ab132417f7b7d0fb1eb89901c8636de4ed1637c42acc61. Sep 16 04:27:34.573328 containerd[2008]: time="2025-09-16T04:27:34.573228872Z" level=info msg="StartContainer for \"4381e11a2a248a3d17ab132417f7b7d0fb1eb89901c8636de4ed1637c42acc61\" returns successfully" Sep 16 04:27:37.387483 containerd[2008]: time="2025-09-16T04:27:37.387405622Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d13dd767afcb09ac93ec4dd76277f9135133a3a19cde1cce80bdf8305d31176a\" id:\"4471aad67599b548e9b7bd4dda17e15d0f575c38568cb732729ee6fa27530039\" pid:6732 exited_at:{seconds:1757996857 nanos:386474182}" Sep 16 04:27:38.343731 kubelet[3544]: E0916 04:27:38.343604 3544 controller.go:195] "Failed to update lease" err="Put \"https://172.31.31.172:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-31-172?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 16 04:27:40.090258 systemd[1]: cri-containerd-ae0fcd4ca04e5a9bb2522610668cf2ca92db1caa200311f43a1da000fe5678ad.scope: Deactivated successfully. Sep 16 04:27:40.092822 containerd[2008]: time="2025-09-16T04:27:40.092666495Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ae0fcd4ca04e5a9bb2522610668cf2ca92db1caa200311f43a1da000fe5678ad\" id:\"ae0fcd4ca04e5a9bb2522610668cf2ca92db1caa200311f43a1da000fe5678ad\" pid:6590 exit_status:1 exited_at:{seconds:1757996860 nanos:92207207}" Sep 16 04:27:40.094310 containerd[2008]: time="2025-09-16T04:27:40.094018247Z" level=info msg="received exit event container_id:\"ae0fcd4ca04e5a9bb2522610668cf2ca92db1caa200311f43a1da000fe5678ad\" id:\"ae0fcd4ca04e5a9bb2522610668cf2ca92db1caa200311f43a1da000fe5678ad\" pid:6590 exit_status:1 exited_at:{seconds:1757996860 nanos:92207207}" Sep 16 04:27:40.137323 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ae0fcd4ca04e5a9bb2522610668cf2ca92db1caa200311f43a1da000fe5678ad-rootfs.mount: Deactivated successfully. 
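The kubelet errors above ("Failed to update lease", first with context deadline exceeded and then with Client.Timeout exceeded) refer to the node's Lease object in the kube-node-lease namespace, which the kubelet periodically renews against the API server at 172.31.31.172:6443. A hedged client-go sketch that reads the same Lease to see when it was last renewed; the kubeconfig path is an assumption:

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// The kubeconfig path is illustrative; point it at whatever credentials reach
	// the API server at 172.31.31.172:6443.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Namespace and name are taken from the URL in the failing PUT above.
	lease, err := cs.CoordinationV1().Leases("kube-node-lease").Get(context.TODO(), "ip-172-31-31-172", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	fmt.Println("lease last renewed at:", lease.Spec.RenewTime)
}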
Sep 16 04:27:40.428602 kubelet[3544]: I0916 04:27:40.428159 3544 scope.go:117] "RemoveContainer" containerID="16b2e49ffe2c8037a6d13c5e9ed195e54cb6c4114ae85344985e0beb2608971c" Sep 16 04:27:40.429693 kubelet[3544]: I0916 04:27:40.428773 3544 scope.go:117] "RemoveContainer" containerID="ae0fcd4ca04e5a9bb2522610668cf2ca92db1caa200311f43a1da000fe5678ad" Sep 16 04:27:40.429693 kubelet[3544]: E0916 04:27:40.429009 3544 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-755d956888-qrdps_tigera-operator(8ba6fd8a-cbf5-4417-a4f1-4265afc68c4b)\"" pod="tigera-operator/tigera-operator-755d956888-qrdps" podUID="8ba6fd8a-cbf5-4417-a4f1-4265afc68c4b" Sep 16 04:27:40.433164 containerd[2008]: time="2025-09-16T04:27:40.433029121Z" level=info msg="RemoveContainer for \"16b2e49ffe2c8037a6d13c5e9ed195e54cb6c4114ae85344985e0beb2608971c\"" Sep 16 04:27:40.443264 containerd[2008]: time="2025-09-16T04:27:40.443117149Z" level=info msg="RemoveContainer for \"16b2e49ffe2c8037a6d13c5e9ed195e54cb6c4114ae85344985e0beb2608971c\" returns successfully" Sep 16 04:27:48.345474 kubelet[3544]: E0916 04:27:48.345019 3544 controller.go:195] "Failed to update lease" err="Put \"https://172.31.31.172:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-31-172?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 16 04:27:49.072664 containerd[2008]: time="2025-09-16T04:27:49.072601652Z" level=info msg="TaskExit event in podsandbox handler container_id:\"65299bd449ebfd81548bfb0d2e37175799606f0cdbf5f4de48e45a5a664b8331\" id:\"a7691eaa0fa4ddd912da6932d4e57c1ebc2c60545e4c8afa751ae5cf670b4651\" pid:6767 exit_status:1 exited_at:{seconds:1757996869 nanos:71888252}"
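The CrashLoopBackOff message above ("back-off 10s restarting failed container=tigera-operator ...") is the start of the kubelet's restart backoff for that container. Assuming the default policy (an initial 10s delay that doubles after each crash and is capped at 5 minutes), the delay sequence looks like this sketch:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed kubelet defaults: a 10s initial delay that doubles after every
	// crash and is capped at 5 minutes.
	delay := 10 * time.Second
	const maxDelay = 5 * time.Minute
	for crash := 1; crash <= 7; crash++ {
		fmt.Printf("crash %d: wait %s before restarting\n", crash, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}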