Sep 12 22:07:01.196443 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd083] Sep 12 22:07:01.196496 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Fri Sep 12 20:38:46 -00 2025 Sep 12 22:07:01.196523 kernel: KASLR disabled due to lack of seed Sep 12 22:07:01.196540 kernel: efi: EFI v2.7 by EDK II Sep 12 22:07:01.196558 kernel: efi: SMBIOS=0x7bed0000 SMBIOS 3.0=0x7beb0000 ACPI=0x786e0000 ACPI 2.0=0x786e0014 MEMATTR=0x7a731a98 MEMRESERVE=0x78551598 Sep 12 22:07:01.196574 kernel: secureboot: Secure boot disabled Sep 12 22:07:01.196592 kernel: ACPI: Early table checksum verification disabled Sep 12 22:07:01.196607 kernel: ACPI: RSDP 0x00000000786E0014 000024 (v02 AMAZON) Sep 12 22:07:01.196624 kernel: ACPI: XSDT 0x00000000786D00E8 000064 (v01 AMAZON AMZNFACP 00000001 01000013) Sep 12 22:07:01.196640 kernel: ACPI: FACP 0x00000000786B0000 000114 (v06 AMAZON AMZNFACP 00000001 AMZN 00000001) Sep 12 22:07:01.196656 kernel: ACPI: DSDT 0x0000000078640000 00159D (v02 AMAZON AMZNDSDT 00000001 INTL 20160527) Sep 12 22:07:01.196678 kernel: ACPI: FACS 0x0000000078630000 000040 Sep 12 22:07:01.196694 kernel: ACPI: APIC 0x00000000786C0000 000108 (v04 AMAZON AMZNAPIC 00000001 AMZN 00000001) Sep 12 22:07:01.196710 kernel: ACPI: SPCR 0x00000000786A0000 000050 (v02 AMAZON AMZNSPCR 00000001 AMZN 00000001) Sep 12 22:07:01.196728 kernel: ACPI: GTDT 0x0000000078690000 000060 (v02 AMAZON AMZNGTDT 00000001 AMZN 00000001) Sep 12 22:07:01.196744 kernel: ACPI: MCFG 0x0000000078680000 00003C (v02 AMAZON AMZNMCFG 00000001 AMZN 00000001) Sep 12 22:07:01.196767 kernel: ACPI: SLIT 0x0000000078670000 00002D (v01 AMAZON AMZNSLIT 00000001 AMZN 00000001) Sep 12 22:07:01.196784 kernel: ACPI: IORT 0x0000000078660000 000078 (v01 AMAZON AMZNIORT 00000001 AMZN 00000001) Sep 12 22:07:01.196801 kernel: ACPI: PPTT 0x0000000078650000 0000EC (v01 AMAZON AMZNPPTT 00000001 AMZN 00000001) Sep 12 22:07:01.196824 kernel: ACPI: SPCR: console: uart,mmio,0x90a0000,115200 Sep 12 22:07:01.196841 kernel: earlycon: uart0 at MMIO 0x00000000090a0000 (options '115200') Sep 12 22:07:01.196858 kernel: printk: legacy bootconsole [uart0] enabled Sep 12 22:07:01.196875 kernel: ACPI: Use ACPI SPCR as default console: No Sep 12 22:07:01.196892 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000004b5ffffff] Sep 12 22:07:01.196910 kernel: NODE_DATA(0) allocated [mem 0x4b584ca00-0x4b5853fff] Sep 12 22:07:01.196926 kernel: Zone ranges: Sep 12 22:07:01.196988 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] Sep 12 22:07:01.197022 kernel: DMA32 empty Sep 12 22:07:01.197040 kernel: Normal [mem 0x0000000100000000-0x00000004b5ffffff] Sep 12 22:07:01.197058 kernel: Device empty Sep 12 22:07:01.197074 kernel: Movable zone start for each node Sep 12 22:07:01.197090 kernel: Early memory node ranges Sep 12 22:07:01.197107 kernel: node 0: [mem 0x0000000040000000-0x000000007862ffff] Sep 12 22:07:01.197123 kernel: node 0: [mem 0x0000000078630000-0x000000007863ffff] Sep 12 22:07:01.197140 kernel: node 0: [mem 0x0000000078640000-0x00000000786effff] Sep 12 22:07:01.197156 kernel: node 0: [mem 0x00000000786f0000-0x000000007872ffff] Sep 12 22:07:01.197173 kernel: node 0: [mem 0x0000000078730000-0x000000007bbfffff] Sep 12 22:07:01.197189 kernel: node 0: [mem 0x000000007bc00000-0x000000007bfdffff] Sep 12 22:07:01.197206 kernel: node 0: [mem 0x000000007bfe0000-0x000000007fffffff] Sep 12 22:07:01.197233 kernel: node 0: [mem 
0x0000000400000000-0x00000004b5ffffff] Sep 12 22:07:01.197264 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000004b5ffffff] Sep 12 22:07:01.197282 kernel: On node 0, zone Normal: 8192 pages in unavailable ranges Sep 12 22:07:01.197300 kernel: cma: Reserved 16 MiB at 0x000000007f000000 on node -1 Sep 12 22:07:01.197317 kernel: psci: probing for conduit method from ACPI. Sep 12 22:07:01.197340 kernel: psci: PSCIv1.0 detected in firmware. Sep 12 22:07:01.197375 kernel: psci: Using standard PSCI v0.2 function IDs Sep 12 22:07:01.197399 kernel: psci: Trusted OS migration not required Sep 12 22:07:01.197417 kernel: psci: SMC Calling Convention v1.1 Sep 12 22:07:01.197434 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000001) Sep 12 22:07:01.197452 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168 Sep 12 22:07:01.197469 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096 Sep 12 22:07:01.197487 kernel: pcpu-alloc: [0] 0 [0] 1 Sep 12 22:07:01.197505 kernel: Detected PIPT I-cache on CPU0 Sep 12 22:07:01.197522 kernel: CPU features: detected: GIC system register CPU interface Sep 12 22:07:01.197539 kernel: CPU features: detected: Spectre-v2 Sep 12 22:07:01.197567 kernel: CPU features: detected: Spectre-v3a Sep 12 22:07:01.197585 kernel: CPU features: detected: Spectre-BHB Sep 12 22:07:01.197602 kernel: CPU features: detected: ARM erratum 1742098 Sep 12 22:07:01.197619 kernel: CPU features: detected: ARM errata 1165522, 1319367, or 1530923 Sep 12 22:07:01.197636 kernel: alternatives: applying boot alternatives Sep 12 22:07:01.197655 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=319fa5fb212e5dd8bf766d2f9f0bbb61d6aa6c81f2813f4b5b49defba0af2b2f Sep 12 22:07:01.197674 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Sep 12 22:07:01.197692 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Sep 12 22:07:01.197708 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 12 22:07:01.197726 kernel: Fallback order for Node 0: 0 Sep 12 22:07:01.197749 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1007616 Sep 12 22:07:01.197766 kernel: Policy zone: Normal Sep 12 22:07:01.197784 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 12 22:07:01.197801 kernel: software IO TLB: area num 2. Sep 12 22:07:01.197817 kernel: software IO TLB: mapped [mem 0x000000006c5f0000-0x00000000705f0000] (64MB) Sep 12 22:07:01.197835 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Sep 12 22:07:01.197851 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 12 22:07:01.197870 kernel: rcu: RCU event tracing is enabled. Sep 12 22:07:01.197889 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Sep 12 22:07:01.197906 kernel: Trampoline variant of Tasks RCU enabled. Sep 12 22:07:01.197924 kernel: Tracing variant of Tasks RCU enabled. Sep 12 22:07:01.198604 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Sep 12 22:07:01.198658 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Sep 12 22:07:01.198676 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Sep 12 22:07:01.198694 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Sep 12 22:07:01.198711 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Sep 12 22:07:01.198728 kernel: GICv3: 96 SPIs implemented Sep 12 22:07:01.198745 kernel: GICv3: 0 Extended SPIs implemented Sep 12 22:07:01.198762 kernel: Root IRQ handler: gic_handle_irq Sep 12 22:07:01.198779 kernel: GICv3: GICv3 features: 16 PPIs Sep 12 22:07:01.198796 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0 Sep 12 22:07:01.198813 kernel: GICv3: CPU0: found redistributor 0 region 0:0x0000000010200000 Sep 12 22:07:01.198830 kernel: ITS [mem 0x10080000-0x1009ffff] Sep 12 22:07:01.198847 kernel: ITS@0x0000000010080000: allocated 8192 Devices @4000f0000 (indirect, esz 8, psz 64K, shr 1) Sep 12 22:07:01.198871 kernel: ITS@0x0000000010080000: allocated 8192 Interrupt Collections @400100000 (flat, esz 8, psz 64K, shr 1) Sep 12 22:07:01.198888 kernel: GICv3: using LPI property table @0x0000000400110000 Sep 12 22:07:01.198906 kernel: ITS: Using hypervisor restricted LPI range [128] Sep 12 22:07:01.198923 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000400120000 Sep 12 22:07:01.198987 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 12 22:07:01.199010 kernel: arch_timer: cp15 timer(s) running at 83.33MHz (virt). Sep 12 22:07:01.199027 kernel: clocksource: arch_sys_counter: mask: 0x1ffffffffffffff max_cycles: 0x13381ebeec, max_idle_ns: 440795203145 ns Sep 12 22:07:01.199044 kernel: sched_clock: 57 bits at 83MHz, resolution 12ns, wraps every 4398046511100ns Sep 12 22:07:01.199061 kernel: Console: colour dummy device 80x25 Sep 12 22:07:01.199078 kernel: printk: legacy console [tty1] enabled Sep 12 22:07:01.199096 kernel: ACPI: Core revision 20240827 Sep 12 22:07:01.199121 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 166.66 BogoMIPS (lpj=83333) Sep 12 22:07:01.199139 kernel: pid_max: default: 32768 minimum: 301 Sep 12 22:07:01.199156 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 12 22:07:01.199173 kernel: landlock: Up and running. Sep 12 22:07:01.199190 kernel: SELinux: Initializing. Sep 12 22:07:01.199207 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 12 22:07:01.199224 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 12 22:07:01.199241 kernel: rcu: Hierarchical SRCU implementation. Sep 12 22:07:01.199259 kernel: rcu: Max phase no-delay instances is 400. Sep 12 22:07:01.199281 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Sep 12 22:07:01.199298 kernel: Remapping and enabling EFI services. Sep 12 22:07:01.199315 kernel: smp: Bringing up secondary CPUs ... Sep 12 22:07:01.199332 kernel: Detected PIPT I-cache on CPU1 Sep 12 22:07:01.199349 kernel: GICv3: CPU1: found redistributor 1 region 0:0x0000000010220000 Sep 12 22:07:01.199366 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000400130000 Sep 12 22:07:01.199383 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd083] Sep 12 22:07:01.199401 kernel: smp: Brought up 1 node, 2 CPUs Sep 12 22:07:01.199418 kernel: SMP: Total of 2 processors activated. 
Sep 12 22:07:01.199448 kernel: CPU: All CPU(s) started at EL1 Sep 12 22:07:01.199467 kernel: CPU features: detected: 32-bit EL0 Support Sep 12 22:07:01.199489 kernel: CPU features: detected: 32-bit EL1 Support Sep 12 22:07:01.199507 kernel: CPU features: detected: CRC32 instructions Sep 12 22:07:01.199525 kernel: alternatives: applying system-wide alternatives Sep 12 22:07:01.199544 kernel: Memory: 3797032K/4030464K available (11136K kernel code, 2440K rwdata, 9068K rodata, 38976K init, 1038K bss, 212088K reserved, 16384K cma-reserved) Sep 12 22:07:01.199563 kernel: devtmpfs: initialized Sep 12 22:07:01.199587 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 12 22:07:01.199605 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Sep 12 22:07:01.199624 kernel: 17040 pages in range for non-PLT usage Sep 12 22:07:01.199643 kernel: 508560 pages in range for PLT usage Sep 12 22:07:01.199661 kernel: pinctrl core: initialized pinctrl subsystem Sep 12 22:07:01.199680 kernel: SMBIOS 3.0.0 present. Sep 12 22:07:01.199698 kernel: DMI: Amazon EC2 a1.large/, BIOS 1.0 11/1/2018 Sep 12 22:07:01.199716 kernel: DMI: Memory slots populated: 0/0 Sep 12 22:07:01.199734 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 12 22:07:01.199758 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Sep 12 22:07:01.199777 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Sep 12 22:07:01.199796 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Sep 12 22:07:01.199814 kernel: audit: initializing netlink subsys (disabled) Sep 12 22:07:01.199832 kernel: audit: type=2000 audit(0.228:1): state=initialized audit_enabled=0 res=1 Sep 12 22:07:01.199850 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 12 22:07:01.199868 kernel: cpuidle: using governor menu Sep 12 22:07:01.199886 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. 
Sep 12 22:07:01.199904 kernel: ASID allocator initialised with 65536 entries Sep 12 22:07:01.199927 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 12 22:07:01.199976 kernel: Serial: AMBA PL011 UART driver Sep 12 22:07:01.199998 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 12 22:07:01.200016 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Sep 12 22:07:01.200034 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Sep 12 22:07:01.200052 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Sep 12 22:07:01.200070 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 12 22:07:01.200087 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Sep 12 22:07:01.200105 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Sep 12 22:07:01.200131 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Sep 12 22:07:01.200149 kernel: ACPI: Added _OSI(Module Device) Sep 12 22:07:01.200167 kernel: ACPI: Added _OSI(Processor Device) Sep 12 22:07:01.200185 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 12 22:07:01.200203 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Sep 12 22:07:01.200221 kernel: ACPI: Interpreter enabled Sep 12 22:07:01.200239 kernel: ACPI: Using GIC for interrupt routing Sep 12 22:07:01.200256 kernel: ACPI: MCFG table detected, 1 entries Sep 12 22:07:01.200274 kernel: ACPI: CPU0 has been hot-added Sep 12 22:07:01.200297 kernel: ACPI: CPU1 has been hot-added Sep 12 22:07:01.200315 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-0f]) Sep 12 22:07:01.200648 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 12 22:07:01.200863 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Sep 12 22:07:01.201174 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Sep 12 22:07:01.201403 kernel: acpi PNP0A08:00: ECAM area [mem 0x20000000-0x20ffffff] reserved by PNP0C02:00 Sep 12 22:07:01.201612 kernel: acpi PNP0A08:00: ECAM at [mem 0x20000000-0x20ffffff] for [bus 00-0f] Sep 12 22:07:01.201654 kernel: ACPI: Remapped I/O 0x000000001fff0000 to [io 0x0000-0xffff window] Sep 12 22:07:01.201674 kernel: acpiphp: Slot [1] registered Sep 12 22:07:01.201693 kernel: acpiphp: Slot [2] registered Sep 12 22:07:01.201711 kernel: acpiphp: Slot [3] registered Sep 12 22:07:01.201729 kernel: acpiphp: Slot [4] registered Sep 12 22:07:01.201748 kernel: acpiphp: Slot [5] registered Sep 12 22:07:01.201766 kernel: acpiphp: Slot [6] registered Sep 12 22:07:01.201786 kernel: acpiphp: Slot [7] registered Sep 12 22:07:01.201805 kernel: acpiphp: Slot [8] registered Sep 12 22:07:01.201823 kernel: acpiphp: Slot [9] registered Sep 12 22:07:01.201848 kernel: acpiphp: Slot [10] registered Sep 12 22:07:01.201866 kernel: acpiphp: Slot [11] registered Sep 12 22:07:01.201884 kernel: acpiphp: Slot [12] registered Sep 12 22:07:01.201902 kernel: acpiphp: Slot [13] registered Sep 12 22:07:01.201920 kernel: acpiphp: Slot [14] registered Sep 12 22:07:01.201989 kernel: acpiphp: Slot [15] registered Sep 12 22:07:01.202016 kernel: acpiphp: Slot [16] registered Sep 12 22:07:01.202035 kernel: acpiphp: Slot [17] registered Sep 12 22:07:01.202053 kernel: acpiphp: Slot [18] registered Sep 12 22:07:01.202080 kernel: acpiphp: Slot [19] registered Sep 12 22:07:01.202099 kernel: acpiphp: Slot [20] registered Sep 12 22:07:01.202118 kernel: acpiphp: Slot [21] registered Sep 12 
22:07:01.202136 kernel: acpiphp: Slot [22] registered Sep 12 22:07:01.202155 kernel: acpiphp: Slot [23] registered Sep 12 22:07:01.202174 kernel: acpiphp: Slot [24] registered Sep 12 22:07:01.202192 kernel: acpiphp: Slot [25] registered Sep 12 22:07:01.202210 kernel: acpiphp: Slot [26] registered Sep 12 22:07:01.202228 kernel: acpiphp: Slot [27] registered Sep 12 22:07:01.202246 kernel: acpiphp: Slot [28] registered Sep 12 22:07:01.202270 kernel: acpiphp: Slot [29] registered Sep 12 22:07:01.202288 kernel: acpiphp: Slot [30] registered Sep 12 22:07:01.202307 kernel: acpiphp: Slot [31] registered Sep 12 22:07:01.202325 kernel: PCI host bridge to bus 0000:00 Sep 12 22:07:01.202576 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xffffffff window] Sep 12 22:07:01.202775 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Sep 12 22:07:01.202991 kernel: pci_bus 0000:00: root bus resource [mem 0x400000000000-0x407fffffffff window] Sep 12 22:07:01.203185 kernel: pci_bus 0000:00: root bus resource [bus 00-0f] Sep 12 22:07:01.203462 kernel: pci 0000:00:00.0: [1d0f:0200] type 00 class 0x060000 conventional PCI endpoint Sep 12 22:07:01.203727 kernel: pci 0000:00:01.0: [1d0f:8250] type 00 class 0x070003 conventional PCI endpoint Sep 12 22:07:01.204006 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80118000-0x80118fff] Sep 12 22:07:01.204245 kernel: pci 0000:00:04.0: [1d0f:8061] type 00 class 0x010802 PCIe Root Complex Integrated Endpoint Sep 12 22:07:01.204458 kernel: pci 0000:00:04.0: BAR 0 [mem 0x80114000-0x80117fff] Sep 12 22:07:01.204664 kernel: pci 0000:00:04.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 12 22:07:01.204920 kernel: pci 0000:00:05.0: [1d0f:ec20] type 00 class 0x020000 PCIe Root Complex Integrated Endpoint Sep 12 22:07:01.205178 kernel: pci 0000:00:05.0: BAR 0 [mem 0x80110000-0x80113fff] Sep 12 22:07:01.205414 kernel: pci 0000:00:05.0: BAR 2 [mem 0x80000000-0x800fffff pref] Sep 12 22:07:01.205702 kernel: pci 0000:00:05.0: BAR 4 [mem 0x80100000-0x8010ffff] Sep 12 22:07:01.205972 kernel: pci 0000:00:05.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 12 22:07:01.206219 kernel: pci 0000:00:05.0: BAR 2 [mem 0x80000000-0x800fffff pref]: assigned Sep 12 22:07:01.206443 kernel: pci 0000:00:05.0: BAR 4 [mem 0x80100000-0x8010ffff]: assigned Sep 12 22:07:01.206697 kernel: pci 0000:00:04.0: BAR 0 [mem 0x80110000-0x80113fff]: assigned Sep 12 22:07:01.206914 kernel: pci 0000:00:05.0: BAR 0 [mem 0x80114000-0x80117fff]: assigned Sep 12 22:07:01.207194 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80118000-0x80118fff]: assigned Sep 12 22:07:01.207404 kernel: pci_bus 0000:00: resource 4 [mem 0x80000000-0xffffffff window] Sep 12 22:07:01.207592 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Sep 12 22:07:01.207788 kernel: pci_bus 0000:00: resource 6 [mem 0x400000000000-0x407fffffffff window] Sep 12 22:07:01.207818 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Sep 12 22:07:01.207854 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Sep 12 22:07:01.207875 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Sep 12 22:07:01.207894 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Sep 12 22:07:01.207912 kernel: iommu: Default domain type: Translated Sep 12 22:07:01.207930 kernel: iommu: DMA domain TLB invalidation policy: strict mode Sep 12 22:07:01.207990 kernel: efivars: Registered efivars operations Sep 12 22:07:01.208012 kernel: vgaarb: loaded Sep 12 22:07:01.208032 kernel: clocksource: Switched to clocksource arch_sys_counter 
Sep 12 22:07:01.208051 kernel: VFS: Disk quotas dquot_6.6.0 Sep 12 22:07:01.208079 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 12 22:07:01.208097 kernel: pnp: PnP ACPI init Sep 12 22:07:01.208359 kernel: system 00:00: [mem 0x20000000-0x2fffffff] could not be reserved Sep 12 22:07:01.208395 kernel: pnp: PnP ACPI: found 1 devices Sep 12 22:07:01.208415 kernel: NET: Registered PF_INET protocol family Sep 12 22:07:01.208434 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 12 22:07:01.208454 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Sep 12 22:07:01.208473 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 12 22:07:01.208502 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 12 22:07:01.208521 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Sep 12 22:07:01.208539 kernel: TCP: Hash tables configured (established 32768 bind 32768) Sep 12 22:07:01.208557 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 12 22:07:01.208575 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 12 22:07:01.208594 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 12 22:07:01.208612 kernel: PCI: CLS 0 bytes, default 64 Sep 12 22:07:01.208630 kernel: kvm [1]: HYP mode not available Sep 12 22:07:01.208647 kernel: Initialise system trusted keyrings Sep 12 22:07:01.208673 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Sep 12 22:07:01.208691 kernel: Key type asymmetric registered Sep 12 22:07:01.208710 kernel: Asymmetric key parser 'x509' registered Sep 12 22:07:01.208729 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Sep 12 22:07:01.208794 kernel: io scheduler mq-deadline registered Sep 12 22:07:01.208814 kernel: io scheduler kyber registered Sep 12 22:07:01.208832 kernel: io scheduler bfq registered Sep 12 22:07:01.209223 kernel: pl061_gpio ARMH0061:00: PL061 GPIO chip registered Sep 12 22:07:01.209274 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Sep 12 22:07:01.209294 kernel: ACPI: button: Power Button [PWRB] Sep 12 22:07:01.209313 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input1 Sep 12 22:07:01.209331 kernel: ACPI: button: Sleep Button [SLPB] Sep 12 22:07:01.209350 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 12 22:07:01.209392 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Sep 12 22:07:01.209641 kernel: serial 0000:00:01.0: enabling device (0010 -> 0012) Sep 12 22:07:01.209674 kernel: printk: legacy console [ttyS0] disabled Sep 12 22:07:01.209694 kernel: 0000:00:01.0: ttyS0 at MMIO 0x80118000 (irq = 14, base_baud = 115200) is a 16550A Sep 12 22:07:01.209723 kernel: printk: legacy console [ttyS0] enabled Sep 12 22:07:01.209742 kernel: printk: legacy bootconsole [uart0] disabled Sep 12 22:07:01.209761 kernel: thunder_xcv, ver 1.0 Sep 12 22:07:01.209779 kernel: thunder_bgx, ver 1.0 Sep 12 22:07:01.209797 kernel: nicpf, ver 1.0 Sep 12 22:07:01.209816 kernel: nicvf, ver 1.0 Sep 12 22:07:01.210115 kernel: rtc-efi rtc-efi.0: registered as rtc0 Sep 12 22:07:01.210332 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-12T22:07:00 UTC (1757714820) Sep 12 22:07:01.210373 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 12 22:07:01.210393 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 3 
(0,80000003) counters available Sep 12 22:07:01.210411 kernel: NET: Registered PF_INET6 protocol family Sep 12 22:07:01.210430 kernel: watchdog: NMI not fully supported Sep 12 22:07:01.210448 kernel: watchdog: Hard watchdog permanently disabled Sep 12 22:07:01.210466 kernel: Segment Routing with IPv6 Sep 12 22:07:01.210485 kernel: In-situ OAM (IOAM) with IPv6 Sep 12 22:07:01.210503 kernel: NET: Registered PF_PACKET protocol family Sep 12 22:07:01.210521 kernel: Key type dns_resolver registered Sep 12 22:07:01.210545 kernel: registered taskstats version 1 Sep 12 22:07:01.210564 kernel: Loading compiled-in X.509 certificates Sep 12 22:07:01.210583 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: 2d7730e6d35b3fbd1c590cd72a2500b2380c020e' Sep 12 22:07:01.210601 kernel: Demotion targets for Node 0: null Sep 12 22:07:01.210619 kernel: Key type .fscrypt registered Sep 12 22:07:01.210638 kernel: Key type fscrypt-provisioning registered Sep 12 22:07:01.210655 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 12 22:07:01.210674 kernel: ima: Allocated hash algorithm: sha1 Sep 12 22:07:01.210692 kernel: ima: No architecture policies found Sep 12 22:07:01.210717 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Sep 12 22:07:01.210735 kernel: clk: Disabling unused clocks Sep 12 22:07:01.210753 kernel: PM: genpd: Disabling unused power domains Sep 12 22:07:01.210771 kernel: Warning: unable to open an initial console. Sep 12 22:07:01.210789 kernel: Freeing unused kernel memory: 38976K Sep 12 22:07:01.210807 kernel: Run /init as init process Sep 12 22:07:01.210826 kernel: with arguments: Sep 12 22:07:01.210843 kernel: /init Sep 12 22:07:01.210861 kernel: with environment: Sep 12 22:07:01.210879 kernel: HOME=/ Sep 12 22:07:01.210905 kernel: TERM=linux Sep 12 22:07:01.210923 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 12 22:07:01.210981 systemd[1]: Successfully made /usr/ read-only. Sep 12 22:07:01.211015 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 22:07:01.211036 systemd[1]: Detected virtualization amazon. Sep 12 22:07:01.211056 systemd[1]: Detected architecture arm64. Sep 12 22:07:01.211075 systemd[1]: Running in initrd. Sep 12 22:07:01.211104 systemd[1]: No hostname configured, using default hostname. Sep 12 22:07:01.211126 systemd[1]: Hostname set to . Sep 12 22:07:01.211146 systemd[1]: Initializing machine ID from VM UUID. Sep 12 22:07:01.211167 systemd[1]: Queued start job for default target initrd.target. Sep 12 22:07:01.211188 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 22:07:01.211210 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 22:07:01.211233 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 12 22:07:01.211254 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 22:07:01.211282 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 12 22:07:01.211304 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... 
Sep 12 22:07:01.211327 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 12 22:07:01.211349 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 12 22:07:01.211370 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 22:07:01.211391 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 22:07:01.211411 systemd[1]: Reached target paths.target - Path Units. Sep 12 22:07:01.211438 systemd[1]: Reached target slices.target - Slice Units. Sep 12 22:07:01.211458 systemd[1]: Reached target swap.target - Swaps. Sep 12 22:07:01.211478 systemd[1]: Reached target timers.target - Timer Units. Sep 12 22:07:01.211499 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 22:07:01.211519 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 22:07:01.211539 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 12 22:07:01.211560 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 12 22:07:01.211580 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 22:07:01.211607 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 22:07:01.211629 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 22:07:01.211649 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 22:07:01.211669 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 12 22:07:01.211688 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 22:07:01.211709 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 12 22:07:01.211729 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 12 22:07:01.211749 systemd[1]: Starting systemd-fsck-usr.service... Sep 12 22:07:01.211770 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 22:07:01.211796 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 22:07:01.211816 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:07:01.211835 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 12 22:07:01.211856 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 22:07:01.211880 systemd[1]: Finished systemd-fsck-usr.service. Sep 12 22:07:01.211901 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 22:07:01.211921 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 12 22:07:01.211994 kernel: Bridge firewalling registered Sep 12 22:07:01.212033 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 22:07:01.212134 systemd-journald[258]: Collecting audit messages is disabled. Sep 12 22:07:01.212190 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:07:01.212212 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. 
Sep 12 22:07:01.212234 systemd-journald[258]: Journal started Sep 12 22:07:01.212277 systemd-journald[258]: Runtime Journal (/run/log/journal/ec279d6b2cc41fd7b6b4da304b9e6489) is 8M, max 75.3M, 67.3M free. Sep 12 22:07:01.153484 systemd-modules-load[259]: Inserted module 'overlay' Sep 12 22:07:01.224334 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 22:07:01.184347 systemd-modules-load[259]: Inserted module 'br_netfilter' Sep 12 22:07:01.234731 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 12 22:07:01.247223 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 22:07:01.268113 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 22:07:01.280645 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 22:07:01.302460 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 22:07:01.318214 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 22:07:01.324893 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 12 22:07:01.332639 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 22:07:01.346548 systemd-tmpfiles[281]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 12 22:07:01.359556 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 22:07:01.371426 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 22:07:01.400459 dracut-cmdline[295]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=319fa5fb212e5dd8bf766d2f9f0bbb61d6aa6c81f2813f4b5b49defba0af2b2f Sep 12 22:07:01.475405 systemd-resolved[302]: Positive Trust Anchors: Sep 12 22:07:01.475446 systemd-resolved[302]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 22:07:01.475510 systemd-resolved[302]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 22:07:01.588992 kernel: SCSI subsystem initialized Sep 12 22:07:01.596985 kernel: Loading iSCSI transport class v2.0-870. Sep 12 22:07:01.611201 kernel: iscsi: registered transport (tcp) Sep 12 22:07:01.632985 kernel: iscsi: registered transport (qla4xxx) Sep 12 22:07:01.633061 kernel: QLogic iSCSI HBA Driver Sep 12 22:07:01.668125 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... 
Sep 12 22:07:01.712044 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 22:07:01.724855 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 22:07:01.738991 kernel: random: crng init done Sep 12 22:07:01.739290 systemd-resolved[302]: Defaulting to hostname 'linux'. Sep 12 22:07:01.743648 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 22:07:01.748377 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 22:07:01.816725 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 12 22:07:01.823318 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 12 22:07:01.909037 kernel: raid6: neonx8 gen() 6435 MB/s Sep 12 22:07:01.926006 kernel: raid6: neonx4 gen() 6396 MB/s Sep 12 22:07:01.942989 kernel: raid6: neonx2 gen() 5377 MB/s Sep 12 22:07:01.959998 kernel: raid6: neonx1 gen() 3921 MB/s Sep 12 22:07:01.976995 kernel: raid6: int64x8 gen() 3575 MB/s Sep 12 22:07:01.994010 kernel: raid6: int64x4 gen() 3695 MB/s Sep 12 22:07:02.010998 kernel: raid6: int64x2 gen() 3539 MB/s Sep 12 22:07:02.029153 kernel: raid6: int64x1 gen() 2755 MB/s Sep 12 22:07:02.029224 kernel: raid6: using algorithm neonx8 gen() 6435 MB/s Sep 12 22:07:02.048039 kernel: raid6: .... xor() 4708 MB/s, rmw enabled Sep 12 22:07:02.048126 kernel: raid6: using neon recovery algorithm Sep 12 22:07:02.057826 kernel: xor: measuring software checksum speed Sep 12 22:07:02.057905 kernel: 8regs : 12979 MB/sec Sep 12 22:07:02.060356 kernel: 32regs : 11994 MB/sec Sep 12 22:07:02.060434 kernel: arm64_neon : 9083 MB/sec Sep 12 22:07:02.060472 kernel: xor: using function: 8regs (12979 MB/sec) Sep 12 22:07:02.159010 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 12 22:07:02.170695 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 12 22:07:02.178411 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 22:07:02.225846 systemd-udevd[507]: Using default interface naming scheme 'v255'. Sep 12 22:07:02.238616 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 22:07:02.251396 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 12 22:07:02.296781 dracut-pre-trigger[512]: rd.md=0: removing MD RAID activation Sep 12 22:07:02.351260 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 22:07:02.359235 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 22:07:02.504658 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 22:07:02.513823 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 12 22:07:02.718300 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Sep 12 22:07:02.718404 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Sep 12 22:07:02.719624 kernel: nvme nvme0: pci function 0000:00:04.0 Sep 12 22:07:02.725409 kernel: ena 0000:00:05.0: enabling device (0010 -> 0012) Sep 12 22:07:02.729613 kernel: ena 0000:00:05.0: ENA device version: 0.10 Sep 12 22:07:02.730050 kernel: ena 0000:00:05.0: ENA controller version: 0.0.1 implementation version 1 Sep 12 22:07:02.730769 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 22:07:02.740249 kernel: nvme nvme0: 2/0/0 default/read/poll queues Sep 12 22:07:02.731332 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 12 22:07:02.737149 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:07:02.747213 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:07:02.769179 kernel: ena 0000:00:05.0: Elastic Network Adapter (ENA) found at mem 80114000, mac addr 06:af:a7:36:87:2d Sep 12 22:07:02.769484 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 12 22:07:02.769512 kernel: GPT:9289727 != 16777215 Sep 12 22:07:02.769546 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 12 22:07:02.769573 kernel: GPT:9289727 != 16777215 Sep 12 22:07:02.769596 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 12 22:07:02.769621 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Sep 12 22:07:02.768525 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 12 22:07:02.769786 (udev-worker)[560]: Network interface NamePolicy= disabled on kernel command line. Sep 12 22:07:02.812970 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:07:02.833014 kernel: nvme nvme0: using unchecked data buffer Sep 12 22:07:02.988390 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Amazon Elastic Block Store EFI-SYSTEM. Sep 12 22:07:03.046298 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Amazon Elastic Block Store ROOT. Sep 12 22:07:03.053086 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 12 22:07:03.096779 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Amazon Elastic Block Store USR-A. Sep 12 22:07:03.104056 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Amazon Elastic Block Store USR-A. Sep 12 22:07:03.133186 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. Sep 12 22:07:03.136149 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 22:07:03.139446 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 22:07:03.147558 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 22:07:03.155579 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 12 22:07:03.162474 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 12 22:07:03.193023 disk-uuid[685]: Primary Header is updated. Sep 12 22:07:03.193023 disk-uuid[685]: Secondary Entries is updated. Sep 12 22:07:03.193023 disk-uuid[685]: Secondary Header is updated. Sep 12 22:07:03.204028 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Sep 12 22:07:03.218688 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 12 22:07:04.224011 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Sep 12 22:07:04.226372 disk-uuid[687]: The operation has completed successfully. Sep 12 22:07:04.430801 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 12 22:07:04.433447 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 12 22:07:04.517356 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 12 22:07:04.532496 sh[954]: Success Sep 12 22:07:04.555268 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Sep 12 22:07:04.555357 kernel: device-mapper: uevent: version 1.0.3 Sep 12 22:07:04.555383 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 12 22:07:04.571008 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Sep 12 22:07:04.668633 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 12 22:07:04.677324 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 12 22:07:04.693740 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 12 22:07:04.723978 kernel: BTRFS: device fsid 254e43f1-b609-42b8-bcc5-437252095415 devid 1 transid 38 /dev/mapper/usr (254:0) scanned by mount (977) Sep 12 22:07:04.729196 kernel: BTRFS info (device dm-0): first mount of filesystem 254e43f1-b609-42b8-bcc5-437252095415 Sep 12 22:07:04.729280 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Sep 12 22:07:04.836145 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 12 22:07:04.836235 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 12 22:07:04.836262 kernel: BTRFS info (device dm-0): enabling free space tree Sep 12 22:07:04.861306 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 12 22:07:04.867962 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 12 22:07:04.873030 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 12 22:07:04.878650 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 12 22:07:04.885850 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 12 22:07:04.942001 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/nvme0n1p6 (259:5) scanned by mount (1010) Sep 12 22:07:04.947750 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0 Sep 12 22:07:04.947821 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm Sep 12 22:07:04.957825 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Sep 12 22:07:04.957897 kernel: BTRFS info (device nvme0n1p6): enabling free space tree Sep 12 22:07:04.967999 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0 Sep 12 22:07:04.970377 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 12 22:07:04.977851 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 12 22:07:05.070937 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 22:07:05.079608 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 22:07:05.158781 systemd-networkd[1146]: lo: Link UP Sep 12 22:07:05.159288 systemd-networkd[1146]: lo: Gained carrier Sep 12 22:07:05.162724 systemd-networkd[1146]: Enumeration completed Sep 12 22:07:05.162880 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 22:07:05.163811 systemd-networkd[1146]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 22:07:05.163818 systemd-networkd[1146]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 22:07:05.172225 systemd[1]: Reached target network.target - Network. 
Sep 12 22:07:05.183229 systemd-networkd[1146]: eth0: Link UP Sep 12 22:07:05.183236 systemd-networkd[1146]: eth0: Gained carrier Sep 12 22:07:05.183258 systemd-networkd[1146]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 22:07:05.203093 systemd-networkd[1146]: eth0: DHCPv4 address 172.31.25.121/20, gateway 172.31.16.1 acquired from 172.31.16.1 Sep 12 22:07:05.631990 ignition[1066]: Ignition 2.22.0 Sep 12 22:07:05.632016 ignition[1066]: Stage: fetch-offline Sep 12 22:07:05.632921 ignition[1066]: no configs at "/usr/lib/ignition/base.d" Sep 12 22:07:05.633545 ignition[1066]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 12 22:07:05.634493 ignition[1066]: Ignition finished successfully Sep 12 22:07:05.645771 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 22:07:05.650732 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Sep 12 22:07:05.702867 ignition[1157]: Ignition 2.22.0 Sep 12 22:07:05.703406 ignition[1157]: Stage: fetch Sep 12 22:07:05.704272 ignition[1157]: no configs at "/usr/lib/ignition/base.d" Sep 12 22:07:05.704296 ignition[1157]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 12 22:07:05.704704 ignition[1157]: PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 12 22:07:05.717816 ignition[1157]: PUT result: OK Sep 12 22:07:05.723163 ignition[1157]: parsed url from cmdline: "" Sep 12 22:07:05.723312 ignition[1157]: no config URL provided Sep 12 22:07:05.723331 ignition[1157]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 22:07:05.723357 ignition[1157]: no config at "/usr/lib/ignition/user.ign" Sep 12 22:07:05.723556 ignition[1157]: PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 12 22:07:05.733546 ignition[1157]: PUT result: OK Sep 12 22:07:05.733860 ignition[1157]: GET http://169.254.169.254/2019-10-01/user-data: attempt #1 Sep 12 22:07:05.738021 ignition[1157]: GET result: OK Sep 12 22:07:05.738356 ignition[1157]: parsing config with SHA512: d4879fa3adb7a61e8033b5b61a302c650c310d34b33566e2b06843e2f4ee1f3d9ef5c0350cd56c1e4c0bc9a6fb68b515555ad8f394c65343c71ed55dfd512b1c Sep 12 22:07:05.746917 unknown[1157]: fetched base config from "system" Sep 12 22:07:05.747186 unknown[1157]: fetched base config from "system" Sep 12 22:07:05.747200 unknown[1157]: fetched user config from "aws" Sep 12 22:07:05.751913 ignition[1157]: fetch: fetch complete Sep 12 22:07:05.751927 ignition[1157]: fetch: fetch passed Sep 12 22:07:05.753341 ignition[1157]: Ignition finished successfully Sep 12 22:07:05.761351 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Sep 12 22:07:05.767620 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 12 22:07:05.823295 ignition[1163]: Ignition 2.22.0 Sep 12 22:07:05.823798 ignition[1163]: Stage: kargs Sep 12 22:07:05.824362 ignition[1163]: no configs at "/usr/lib/ignition/base.d" Sep 12 22:07:05.824385 ignition[1163]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 12 22:07:05.824535 ignition[1163]: PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 12 22:07:05.837562 ignition[1163]: PUT result: OK Sep 12 22:07:05.847499 ignition[1163]: kargs: kargs passed Sep 12 22:07:05.847624 ignition[1163]: Ignition finished successfully Sep 12 22:07:05.856012 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 12 22:07:05.861906 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
Sep 12 22:07:05.913728 ignition[1169]: Ignition 2.22.0 Sep 12 22:07:05.913996 ignition[1169]: Stage: disks Sep 12 22:07:05.914497 ignition[1169]: no configs at "/usr/lib/ignition/base.d" Sep 12 22:07:05.914519 ignition[1169]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 12 22:07:05.914661 ignition[1169]: PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 12 22:07:05.919795 ignition[1169]: PUT result: OK Sep 12 22:07:05.929392 ignition[1169]: disks: disks passed Sep 12 22:07:05.929703 ignition[1169]: Ignition finished successfully Sep 12 22:07:05.936312 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 12 22:07:05.940808 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 12 22:07:05.943544 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 12 22:07:05.946397 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 22:07:05.950863 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 22:07:05.955568 systemd[1]: Reached target basic.target - Basic System. Sep 12 22:07:05.964116 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 12 22:07:06.027695 systemd-fsck[1177]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 12 22:07:06.033776 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 12 22:07:06.043654 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 12 22:07:06.178966 kernel: EXT4-fs (nvme0n1p9): mounted filesystem a7b592ec-3c41-4dc2-88a7-056c1f18b418 r/w with ordered data mode. Quota mode: none. Sep 12 22:07:06.180619 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 12 22:07:06.184701 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 12 22:07:06.191416 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 22:07:06.206430 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 12 22:07:06.213977 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 12 22:07:06.214081 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 12 22:07:06.214133 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 22:07:06.237598 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 12 22:07:06.242870 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 12 22:07:06.251990 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/nvme0n1p6 (259:5) scanned by mount (1196) Sep 12 22:07:06.257694 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0 Sep 12 22:07:06.257753 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm Sep 12 22:07:06.267616 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Sep 12 22:07:06.268030 kernel: BTRFS info (device nvme0n1p6): enabling free space tree Sep 12 22:07:06.270825 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 12 22:07:06.596503 initrd-setup-root[1220]: cut: /sysroot/etc/passwd: No such file or directory Sep 12 22:07:06.629701 initrd-setup-root[1227]: cut: /sysroot/etc/group: No such file or directory Sep 12 22:07:06.634256 systemd-networkd[1146]: eth0: Gained IPv6LL Sep 12 22:07:06.641076 initrd-setup-root[1234]: cut: /sysroot/etc/shadow: No such file or directory Sep 12 22:07:06.649995 initrd-setup-root[1241]: cut: /sysroot/etc/gshadow: No such file or directory Sep 12 22:07:06.917716 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 12 22:07:06.924004 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 12 22:07:06.931195 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 12 22:07:06.955721 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 12 22:07:06.958566 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0 Sep 12 22:07:06.990751 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 12 22:07:07.012975 ignition[1309]: INFO : Ignition 2.22.0 Sep 12 22:07:07.012975 ignition[1309]: INFO : Stage: mount Sep 12 22:07:07.012975 ignition[1309]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 22:07:07.012975 ignition[1309]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 12 22:07:07.012975 ignition[1309]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 12 22:07:07.025170 ignition[1309]: INFO : PUT result: OK Sep 12 22:07:07.030327 ignition[1309]: INFO : mount: mount passed Sep 12 22:07:07.030327 ignition[1309]: INFO : Ignition finished successfully Sep 12 22:07:07.036661 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 12 22:07:07.042380 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 12 22:07:07.184240 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 22:07:07.228966 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/nvme0n1p6 (259:5) scanned by mount (1320) Sep 12 22:07:07.233299 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0 Sep 12 22:07:07.233376 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm Sep 12 22:07:07.242233 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Sep 12 22:07:07.242311 kernel: BTRFS info (device nvme0n1p6): enabling free space tree Sep 12 22:07:07.245504 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 12 22:07:07.301144 ignition[1337]: INFO : Ignition 2.22.0 Sep 12 22:07:07.301144 ignition[1337]: INFO : Stage: files Sep 12 22:07:07.305305 ignition[1337]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 22:07:07.305305 ignition[1337]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 12 22:07:07.305305 ignition[1337]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 12 22:07:07.313071 ignition[1337]: INFO : PUT result: OK Sep 12 22:07:07.317644 ignition[1337]: DEBUG : files: compiled without relabeling support, skipping Sep 12 22:07:07.332301 ignition[1337]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 12 22:07:07.332301 ignition[1337]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 12 22:07:07.342664 ignition[1337]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 12 22:07:07.349565 ignition[1337]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 12 22:07:07.353855 unknown[1337]: wrote ssh authorized keys file for user: core Sep 12 22:07:07.356415 ignition[1337]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 12 22:07:07.359275 ignition[1337]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Sep 12 22:07:07.359275 ignition[1337]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1 Sep 12 22:07:07.423003 ignition[1337]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 12 22:07:07.847976 ignition[1337]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Sep 12 22:07:07.847976 ignition[1337]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 12 22:07:07.847976 ignition[1337]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 12 22:07:07.847976 ignition[1337]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 12 22:07:07.863390 ignition[1337]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 12 22:07:07.863390 ignition[1337]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 22:07:07.863390 ignition[1337]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 22:07:07.863390 ignition[1337]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 22:07:07.863390 ignition[1337]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 22:07:07.886277 ignition[1337]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 22:07:07.890777 ignition[1337]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 22:07:07.890777 ignition[1337]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 12 22:07:07.890777 ignition[1337]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 12 22:07:07.890777 ignition[1337]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 12 22:07:07.890777 ignition[1337]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-arm64.raw: attempt #1 Sep 12 22:07:08.317281 ignition[1337]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 12 22:07:08.670757 ignition[1337]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 12 22:07:08.670757 ignition[1337]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 12 22:07:08.679546 ignition[1337]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 22:07:08.679546 ignition[1337]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 22:07:08.679546 ignition[1337]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 12 22:07:08.679546 ignition[1337]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Sep 12 22:07:08.679546 ignition[1337]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Sep 12 22:07:08.702409 ignition[1337]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 12 22:07:08.702409 ignition[1337]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 12 22:07:08.702409 ignition[1337]: INFO : files: files passed Sep 12 22:07:08.702409 ignition[1337]: INFO : Ignition finished successfully Sep 12 22:07:08.706708 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 12 22:07:08.715170 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 12 22:07:08.739550 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 12 22:07:08.754498 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 12 22:07:08.756429 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 12 22:07:08.783466 initrd-setup-root-after-ignition[1367]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 22:07:08.783466 initrd-setup-root-after-ignition[1367]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 12 22:07:08.790537 initrd-setup-root-after-ignition[1371]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 22:07:08.799401 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 22:07:08.806134 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 12 22:07:08.814194 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... 
Sep 12 22:07:08.903187 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 12 22:07:08.904957 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 12 22:07:08.912065 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 12 22:07:08.916573 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 12 22:07:08.921001 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 12 22:07:08.924151 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 12 22:07:08.979265 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 22:07:08.986462 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 12 22:07:09.027498 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 12 22:07:09.028294 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 22:07:09.028626 systemd[1]: Stopped target timers.target - Timer Units. Sep 12 22:07:09.028967 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 12 22:07:09.029199 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 22:07:09.031618 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 12 22:07:09.031997 systemd[1]: Stopped target basic.target - Basic System. Sep 12 22:07:09.032337 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 12 22:07:09.032694 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 22:07:09.033076 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 12 22:07:09.033433 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 12 22:07:09.033786 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 12 22:07:09.034513 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 22:07:09.034901 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 12 22:07:09.035618 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 12 22:07:09.036000 systemd[1]: Stopped target swap.target - Swaps. Sep 12 22:07:09.036296 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 12 22:07:09.036511 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 12 22:07:09.037105 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 12 22:07:09.040740 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 22:07:09.044907 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 12 22:07:09.078314 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 22:07:09.081138 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 12 22:07:09.081388 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 12 22:07:09.088302 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 12 22:07:09.088606 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 22:07:09.096838 systemd[1]: ignition-files.service: Deactivated successfully. Sep 12 22:07:09.097099 systemd[1]: Stopped ignition-files.service - Ignition (files). 
Sep 12 22:07:09.104587 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 12 22:07:09.125898 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 12 22:07:09.131740 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 12 22:07:09.134159 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 22:07:09.151539 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 12 22:07:09.151760 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 22:07:09.184398 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 12 22:07:09.190252 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 12 22:07:09.205225 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 12 22:07:09.210891 ignition[1391]: INFO : Ignition 2.22.0 Sep 12 22:07:09.210891 ignition[1391]: INFO : Stage: umount Sep 12 22:07:09.215481 ignition[1391]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 22:07:09.215481 ignition[1391]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 12 22:07:09.215481 ignition[1391]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 12 22:07:09.228860 ignition[1391]: INFO : PUT result: OK Sep 12 22:07:09.220793 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 12 22:07:09.221001 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 12 22:07:09.237594 ignition[1391]: INFO : umount: umount passed Sep 12 22:07:09.239428 ignition[1391]: INFO : Ignition finished successfully Sep 12 22:07:09.244276 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 12 22:07:09.244503 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 12 22:07:09.249446 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 12 22:07:09.249544 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 12 22:07:09.255922 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 12 22:07:09.256095 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 12 22:07:09.262355 systemd[1]: ignition-fetch.service: Deactivated successfully. Sep 12 22:07:09.262451 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Sep 12 22:07:09.268399 systemd[1]: Stopped target network.target - Network. Sep 12 22:07:09.270482 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 12 22:07:09.270592 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 22:07:09.277738 systemd[1]: Stopped target paths.target - Path Units. Sep 12 22:07:09.280201 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 12 22:07:09.281484 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 22:07:09.285056 systemd[1]: Stopped target slices.target - Slice Units. Sep 12 22:07:09.285505 systemd[1]: Stopped target sockets.target - Socket Units. Sep 12 22:07:09.285910 systemd[1]: iscsid.socket: Deactivated successfully. Sep 12 22:07:09.286017 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 22:07:09.286248 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 12 22:07:09.286313 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 22:07:09.286583 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 12 22:07:09.286677 systemd[1]: Stopped ignition-setup.service - Ignition (setup). 
Sep 12 22:07:09.286989 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 12 22:07:09.287065 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 12 22:07:09.287288 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 12 22:07:09.287365 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 12 22:07:09.287817 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 12 22:07:09.288435 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 12 22:07:09.319145 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 12 22:07:09.319504 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 12 22:07:09.341651 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 12 22:07:09.342873 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 12 22:07:09.343118 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 12 22:07:09.348905 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 12 22:07:09.350283 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 12 22:07:09.359814 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 12 22:07:09.359914 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 12 22:07:09.381269 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 12 22:07:09.390109 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 12 22:07:09.395635 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 22:07:09.410845 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 12 22:07:09.410977 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 12 22:07:09.419755 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 12 22:07:09.419855 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 12 22:07:09.426325 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 12 22:07:09.426425 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 22:07:09.431602 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 22:07:09.440761 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 12 22:07:09.440892 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 12 22:07:09.476334 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 12 22:07:09.476624 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 12 22:07:09.481345 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 12 22:07:09.481625 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 22:07:09.492291 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 12 22:07:09.492424 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 12 22:07:09.498555 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 12 22:07:09.498630 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 22:07:09.501407 systemd[1]: dracut-pre-udev.service: Deactivated successfully. 
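The run-credentials-systemd\x2dresolved.service.mount style names above come from systemd's unit-name escaping: "/" in a path maps to "-", so a literal "-" has to be hex-escaped to "\x2d" to stay unambiguous. A rough Python approximation of that visible pattern (not the exact systemd implementation, which also special-cases a leading dot):

def unit_escape(s: str) -> str:
    # Keep ASCII alphanumerics, "_" and "."; map "/" to "-"; hex-escape the rest.
    out = []
    for ch in s:
        if ch == "/":
            out.append("-")
        elif ch.isalnum() or ch in "_.":
            out.append(ch)
        else:
            out.append("\\x%02x" % ord(ch))
    return "".join(out)

print(unit_escape("systemd-resolved.service"))   # systemd\x2dresolved.service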
Sep 12 22:07:09.501503 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 12 22:07:09.508368 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 12 22:07:09.508470 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 12 22:07:09.516175 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 12 22:07:09.516287 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 22:07:09.524436 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 12 22:07:09.534426 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 12 22:07:09.534558 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 22:07:09.541379 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 12 22:07:09.541490 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 22:07:09.554634 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 12 22:07:09.554743 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 22:07:09.563277 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 12 22:07:09.563379 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 22:07:09.566231 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 22:07:09.566326 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:07:09.576706 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 12 22:07:09.576824 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Sep 12 22:07:09.576905 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 12 22:07:09.577021 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 12 22:07:09.602812 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 12 22:07:09.603162 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 12 22:07:09.608343 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 12 22:07:09.614105 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 12 22:07:09.654575 systemd[1]: Switching root. Sep 12 22:07:09.716736 systemd-journald[258]: Journal stopped Sep 12 22:07:12.228751 systemd-journald[258]: Received SIGTERM from PID 1 (systemd). 
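The timestamps alone show how long the late initrd took: the Ignition files stage began at 22:07:07.301144 and the root switch happened at 22:07:09.654575, roughly 2.35 seconds later. A small Python helper for computing such deltas from lines in this log (the two sample lines are copied from above):

from datetime import datetime

def stamp(line: str) -> datetime:
    # Journal lines here start with e.g. "Sep 12 22:07:09.654575"
    return datetime.strptime(" ".join(line.split()[:3]), "%b %d %H:%M:%S.%f")

files_stage = stamp("Sep 12 22:07:07.301144 ignition[1337]: INFO : Ignition 2.22.0")
switch_root = stamp("Sep 12 22:07:09.654575 systemd[1]: Switching root.")
print((switch_root - files_stage).total_seconds())   # ~2.35 s for the files stage and initrd teardown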
Sep 12 22:07:12.228874 kernel: SELinux: policy capability network_peer_controls=1 Sep 12 22:07:12.228914 kernel: SELinux: policy capability open_perms=1 Sep 12 22:07:12.229010 kernel: SELinux: policy capability extended_socket_class=1 Sep 12 22:07:12.229062 kernel: SELinux: policy capability always_check_network=0 Sep 12 22:07:12.229093 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 12 22:07:12.229122 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 12 22:07:12.238797 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 12 22:07:12.238850 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 12 22:07:12.238879 kernel: SELinux: policy capability userspace_initial_context=0 Sep 12 22:07:12.238907 kernel: audit: type=1403 audit(1757714830.269:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 12 22:07:12.238969 systemd[1]: Successfully loaded SELinux policy in 104.250ms. Sep 12 22:07:12.239020 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 15.300ms. Sep 12 22:07:12.239062 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 22:07:12.239095 systemd[1]: Detected virtualization amazon. Sep 12 22:07:12.239125 systemd[1]: Detected architecture arm64. Sep 12 22:07:12.239158 systemd[1]: Detected first boot. Sep 12 22:07:12.239190 systemd[1]: Initializing machine ID from VM UUID. Sep 12 22:07:12.239220 zram_generator::config[1435]: No configuration found. Sep 12 22:07:12.239251 kernel: NET: Registered PF_VSOCK protocol family Sep 12 22:07:12.239280 systemd[1]: Populated /etc with preset unit settings. Sep 12 22:07:12.239311 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 12 22:07:12.239341 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 12 22:07:12.239372 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 12 22:07:12.239406 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 12 22:07:12.240027 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 12 22:07:12.240081 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 12 22:07:12.240112 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 12 22:07:12.240142 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 12 22:07:12.240170 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 12 22:07:12.240201 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 12 22:07:12.240231 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 12 22:07:12.241023 systemd[1]: Created slice user.slice - User and Session Slice. Sep 12 22:07:12.241066 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 22:07:12.241097 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 22:07:12.241127 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. 
Sep 12 22:07:12.241158 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 12 22:07:12.241186 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 12 22:07:12.241216 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 22:07:12.241246 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 12 22:07:12.241276 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 22:07:12.241329 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 22:07:12.241362 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 12 22:07:12.241392 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 12 22:07:12.241422 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 12 22:07:12.241451 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 12 22:07:12.241479 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 22:07:12.241508 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 22:07:12.241538 systemd[1]: Reached target slices.target - Slice Units. Sep 12 22:07:12.241572 systemd[1]: Reached target swap.target - Swaps. Sep 12 22:07:12.241600 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 12 22:07:12.241631 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 12 22:07:12.241660 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 12 22:07:12.241687 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 22:07:12.241715 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 22:07:12.241743 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 22:07:12.241771 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 12 22:07:12.241800 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 12 22:07:12.241830 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 12 22:07:12.241861 systemd[1]: Mounting media.mount - External Media Directory... Sep 12 22:07:12.241891 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 12 22:07:12.241921 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 12 22:07:12.242638 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 12 22:07:12.242682 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 12 22:07:12.242713 systemd[1]: Reached target machines.target - Containers. Sep 12 22:07:12.242741 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 12 22:07:12.242769 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 22:07:12.242805 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 22:07:12.242835 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 12 22:07:12.242866 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... 
Sep 12 22:07:12.242896 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 22:07:12.242924 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 22:07:12.244087 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 12 22:07:12.244129 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 22:07:12.244163 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 12 22:07:12.244200 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 12 22:07:12.246848 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 12 22:07:12.246886 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 12 22:07:12.246918 systemd[1]: Stopped systemd-fsck-usr.service. Sep 12 22:07:12.246969 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 22:07:12.247026 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 22:07:12.247057 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 22:07:12.247089 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 22:07:12.247118 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 12 22:07:12.247153 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 12 22:07:12.247182 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 22:07:12.247216 systemd[1]: verity-setup.service: Deactivated successfully. Sep 12 22:07:12.247247 systemd[1]: Stopped verity-setup.service. Sep 12 22:07:12.247274 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 12 22:07:12.247302 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 12 22:07:12.247329 systemd[1]: Mounted media.mount - External Media Directory. Sep 12 22:07:12.247357 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 12 22:07:12.247388 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 12 22:07:12.247417 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 12 22:07:12.247449 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 22:07:12.247477 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 12 22:07:12.247505 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 12 22:07:12.247535 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 22:07:12.247562 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 22:07:12.247592 kernel: loop: module loaded Sep 12 22:07:12.247619 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 22:07:12.247648 kernel: fuse: init (API version 7.41) Sep 12 22:07:12.247674 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 22:07:12.247708 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 12 22:07:12.247736 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. 
Sep 12 22:07:12.247766 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 22:07:12.247794 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 22:07:12.247822 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 22:07:12.247902 systemd-journald[1514]: Collecting audit messages is disabled. Sep 12 22:07:12.248998 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 22:07:12.249035 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 12 22:07:12.249072 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 12 22:07:12.249100 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 22:07:12.249130 systemd-journald[1514]: Journal started Sep 12 22:07:12.249175 systemd-journald[1514]: Runtime Journal (/run/log/journal/ec279d6b2cc41fd7b6b4da304b9e6489) is 8M, max 75.3M, 67.3M free. Sep 12 22:07:11.619156 systemd[1]: Queued start job for default target multi-user.target. Sep 12 22:07:11.634900 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6. Sep 12 22:07:11.635771 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 12 22:07:12.264446 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 12 22:07:12.281972 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 12 22:07:12.282065 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 12 22:07:12.282104 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 22:07:12.299989 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 12 22:07:12.308993 kernel: ACPI: bus type drm_connector registered Sep 12 22:07:12.321216 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 12 22:07:12.326013 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 22:07:12.337973 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 12 22:07:12.338078 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 22:07:12.353368 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 12 22:07:12.353457 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 22:07:12.362055 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 22:07:12.372988 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 12 22:07:12.385375 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 22:07:12.392260 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 22:07:12.395597 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 22:07:12.403249 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 22:07:12.406170 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 12 22:07:12.409274 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. 
Sep 12 22:07:12.459068 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 12 22:07:12.475316 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 12 22:07:12.493086 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 12 22:07:12.497352 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 12 22:07:12.503336 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 12 22:07:12.544824 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 22:07:12.548880 systemd-journald[1514]: Time spent on flushing to /var/log/journal/ec279d6b2cc41fd7b6b4da304b9e6489 is 58.517ms for 938 entries. Sep 12 22:07:12.548880 systemd-journald[1514]: System Journal (/var/log/journal/ec279d6b2cc41fd7b6b4da304b9e6489) is 8M, max 195.6M, 187.6M free. Sep 12 22:07:12.617987 kernel: loop0: detected capacity change from 0 to 119368 Sep 12 22:07:12.618063 systemd-journald[1514]: Received client request to flush runtime journal. Sep 12 22:07:12.601339 systemd-tmpfiles[1548]: ACLs are not supported, ignoring. Sep 12 22:07:12.601366 systemd-tmpfiles[1548]: ACLs are not supported, ignoring. Sep 12 22:07:12.624032 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 22:07:12.631841 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 12 22:07:12.636715 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 12 22:07:12.660390 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 12 22:07:12.664999 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 12 22:07:12.668197 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 12 22:07:12.694729 kernel: loop1: detected capacity change from 0 to 203944 Sep 12 22:07:12.740750 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 22:07:12.753003 kernel: loop2: detected capacity change from 0 to 61264 Sep 12 22:07:12.762023 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 12 22:07:12.772518 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 22:07:12.807994 systemd-tmpfiles[1591]: ACLs are not supported, ignoring. Sep 12 22:07:12.808345 systemd-tmpfiles[1591]: ACLs are not supported, ignoring. Sep 12 22:07:12.817001 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 22:07:12.876992 kernel: loop3: detected capacity change from 0 to 100632 Sep 12 22:07:13.009438 kernel: loop4: detected capacity change from 0 to 119368 Sep 12 22:07:13.021976 kernel: loop5: detected capacity change from 0 to 203944 Sep 12 22:07:13.050461 kernel: loop6: detected capacity change from 0 to 61264 Sep 12 22:07:13.063997 kernel: loop7: detected capacity change from 0 to 100632 Sep 12 22:07:13.073930 (sd-merge)[1596]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-ami'. Sep 12 22:07:13.074993 (sd-merge)[1596]: Merged extensions into '/usr'. Sep 12 22:07:13.083691 systemd[1]: Reload requested from client PID 1547 ('systemd-sysext') (unit systemd-sysext.service)... Sep 12 22:07:13.083878 systemd[1]: Reloading... Sep 12 22:07:13.268746 zram_generator::config[1630]: No configuration found. Sep 12 22:07:13.685400 systemd[1]: Reloading finished in 600 ms. 
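The (sd-merge) lines above show systemd-sysext overlaying the containerd-flatcar, docker-flatcar, kubernetes and oem-ami extension images onto /usr before the service manager reload. A small sketch of how one might list candidate sysext images on such a host; the search directories are an assumption about where systemd-sysext commonly looks, and only /etc/extensions is actually confirmed by this log (Ignition linked kubernetes.raw there):

from pathlib import Path

SEARCH_DIRS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]  # assumed

for d in SEARCH_DIRS:
    p = Path(d)
    if not p.is_dir():
        continue
    for image in sorted(p.glob("*.raw")):
        print(image)          # e.g. /etc/extensions/kubernetes.raw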
Sep 12 22:07:13.720079 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 12 22:07:13.737243 systemd[1]: Starting ensure-sysext.service... Sep 12 22:07:13.741249 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 22:07:13.804540 systemd-tmpfiles[1674]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 12 22:07:13.804626 systemd-tmpfiles[1674]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 12 22:07:13.805438 systemd-tmpfiles[1674]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 12 22:07:13.806186 systemd-tmpfiles[1674]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 12 22:07:13.808353 systemd-tmpfiles[1674]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 12 22:07:13.809211 systemd-tmpfiles[1674]: ACLs are not supported, ignoring. Sep 12 22:07:13.809409 systemd-tmpfiles[1674]: ACLs are not supported, ignoring. Sep 12 22:07:13.815996 systemd[1]: Reload requested from client PID 1673 ('systemctl') (unit ensure-sysext.service)... Sep 12 22:07:13.816036 systemd[1]: Reloading... Sep 12 22:07:13.840094 systemd-tmpfiles[1674]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 22:07:13.840126 systemd-tmpfiles[1674]: Skipping /boot Sep 12 22:07:13.859345 systemd-tmpfiles[1674]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 22:07:13.859374 systemd-tmpfiles[1674]: Skipping /boot Sep 12 22:07:13.973729 zram_generator::config[1702]: No configuration found. Sep 12 22:07:14.163348 ldconfig[1539]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 12 22:07:14.391258 systemd[1]: Reloading finished in 574 ms. Sep 12 22:07:14.424021 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 12 22:07:14.427616 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 12 22:07:14.447488 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 22:07:14.466646 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 22:07:14.472096 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 12 22:07:14.479432 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 12 22:07:14.491645 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 22:07:14.499477 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 22:07:14.507532 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 12 22:07:14.517064 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 22:07:14.522755 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 22:07:14.530701 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 22:07:14.539682 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 22:07:14.540966 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Sep 12 22:07:14.541238 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 22:07:14.548893 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 22:07:14.550451 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 22:07:14.550688 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 22:07:14.560039 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 22:07:14.564355 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 22:07:14.565026 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 22:07:14.565320 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 22:07:14.565710 systemd[1]: Reached target time-set.target - System Time Set. Sep 12 22:07:14.583095 systemd[1]: Finished ensure-sysext.service. Sep 12 22:07:14.593660 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 12 22:07:14.640019 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 22:07:14.642140 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 22:07:14.645655 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 22:07:14.647472 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 22:07:14.659421 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 12 22:07:14.663400 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 12 22:07:14.666817 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 22:07:14.667742 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 22:07:14.678329 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 22:07:14.681166 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 22:07:14.686446 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 22:07:14.686619 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 22:07:14.693242 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 12 22:07:14.740091 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 12 22:07:14.756549 systemd-udevd[1762]: Using default interface naming scheme 'v255'. Sep 12 22:07:14.806908 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 12 22:07:14.810758 augenrules[1798]: No rules Sep 12 22:07:14.816301 systemd[1]: audit-rules.service: Deactivated successfully. 
Sep 12 22:07:14.821070 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 22:07:14.825436 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 12 22:07:14.831397 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 12 22:07:14.842564 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 22:07:14.850143 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 22:07:15.104820 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 12 22:07:15.109202 (udev-worker)[1828]: Network interface NamePolicy= disabled on kernel command line. Sep 12 22:07:15.135882 systemd-resolved[1761]: Positive Trust Anchors: Sep 12 22:07:15.135921 systemd-resolved[1761]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 22:07:15.136009 systemd-resolved[1761]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 22:07:15.151372 systemd-resolved[1761]: Defaulting to hostname 'linux'. Sep 12 22:07:15.154563 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 22:07:15.157154 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 22:07:15.159804 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 22:07:15.162473 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 12 22:07:15.165409 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 12 22:07:15.168495 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 12 22:07:15.171100 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 12 22:07:15.173896 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 12 22:07:15.177012 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 12 22:07:15.177075 systemd[1]: Reached target paths.target - Path Units. Sep 12 22:07:15.179306 systemd[1]: Reached target timers.target - Timer Units. Sep 12 22:07:15.183831 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 12 22:07:15.189203 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 12 22:07:15.199299 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 12 22:07:15.203327 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 12 22:07:15.206281 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 12 22:07:15.217109 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. 
Sep 12 22:07:15.221687 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 12 22:07:15.225491 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 12 22:07:15.228469 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 22:07:15.230936 systemd[1]: Reached target basic.target - Basic System. Sep 12 22:07:15.233730 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 12 22:07:15.233789 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 12 22:07:15.240368 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 12 22:07:15.247382 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 12 22:07:15.254685 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 12 22:07:15.267278 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 12 22:07:15.275437 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 12 22:07:15.278111 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 12 22:07:15.283858 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 12 22:07:15.295389 systemd[1]: Started ntpd.service - Network Time Service. Sep 12 22:07:15.302712 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 12 22:07:15.317903 systemd[1]: Starting setup-oem.service - Setup OEM... Sep 12 22:07:15.332426 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 12 22:07:15.339493 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 12 22:07:15.358012 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 12 22:07:15.368071 jq[1852]: false Sep 12 22:07:15.364183 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 12 22:07:15.366743 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 12 22:07:15.380515 systemd[1]: Starting update-engine.service - Update Engine... Sep 12 22:07:15.394061 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 12 22:07:15.398630 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 12 22:07:15.401237 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 12 22:07:15.434067 jq[1865]: true Sep 12 22:07:15.436718 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 12 22:07:15.459564 jq[1870]: true Sep 12 22:07:15.553371 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 12 22:07:15.554430 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 12 22:07:15.574353 tar[1866]: linux-arm64/helm Sep 12 22:07:15.590504 bash[1889]: Updated "/home/core/.ssh/authorized_keys" Sep 12 22:07:15.595664 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 12 22:07:15.605847 systemd[1]: Starting sshkeys.service... 
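prepare-helm.service ("Unpack helm to /opt/bin") consumes the tarball Ignition downloaded to /opt/helm-v3.13.2-linux-arm64.tar.gz, and the lone tar[1866] line above shows the linux-arm64/helm member being extracted. The real unit most likely shells out to tar; a hedged Python equivalent of that unpack step, with the /opt/bin destination taken from the unit's description:

import tarfile

# Sketch of the effective prepare-helm step seen in the log above:
# pull the "linux-arm64/helm" member out of the downloaded tarball into /opt/bin.
with tarfile.open("/opt/helm-v3.13.2-linux-arm64.tar.gz") as tar:
    member = tar.getmember("linux-arm64/helm")
    member.name = "helm"                  # drop the leading directory component
    tar.extract(member, path="/opt/bin")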
Sep 12 22:07:15.633743 dbus-daemon[1850]: [system] SELinux support is enabled Sep 12 22:07:15.634155 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 12 22:07:15.642737 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 12 22:07:15.642827 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 12 22:07:15.645978 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 12 22:07:15.646023 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 12 22:07:15.658007 extend-filesystems[1853]: Found /dev/nvme0n1p6 Sep 12 22:07:15.699993 extend-filesystems[1853]: Found /dev/nvme0n1p9 Sep 12 22:07:15.699993 extend-filesystems[1853]: Checking size of /dev/nvme0n1p9 Sep 12 22:07:15.721892 coreos-metadata[1849]: Sep 12 22:07:15.721 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Sep 12 22:07:15.735348 update_engine[1863]: I20250912 22:07:15.731508 1863 main.cc:92] Flatcar Update Engine starting Sep 12 22:07:15.752773 systemd[1]: motdgen.service: Deactivated successfully. Sep 12 22:07:15.753461 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 12 22:07:15.764395 systemd[1]: Started update-engine.service - Update Engine. Sep 12 22:07:15.769654 update_engine[1863]: I20250912 22:07:15.768250 1863 update_check_scheduler.cc:74] Next update check in 5m7s Sep 12 22:07:15.770439 extend-filesystems[1853]: Resized partition /dev/nvme0n1p9 Sep 12 22:07:15.780992 extend-filesystems[1918]: resize2fs 1.47.3 (8-Jul-2025) Sep 12 22:07:15.805240 kernel: EXT4-fs (nvme0n1p9): resizing filesystem from 553472 to 1489915 blocks Sep 12 22:07:15.810002 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 12 22:07:15.825177 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 12 22:07:15.832426 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 12 22:07:15.836231 systemd[1]: Finished setup-oem.service - Setup OEM. Sep 12 22:07:15.845000 systemd-networkd[1811]: lo: Link UP Sep 12 22:07:15.845018 systemd-networkd[1811]: lo: Gained carrier Sep 12 22:07:15.867361 systemd-networkd[1811]: Enumeration completed Sep 12 22:07:15.867591 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 22:07:15.870803 systemd[1]: Reached target network.target - Network. Sep 12 22:07:15.882412 systemd[1]: Starting containerd.service - containerd container runtime... Sep 12 22:07:15.888558 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 12 22:07:15.894571 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 12 22:07:15.901067 systemd-networkd[1811]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 22:07:15.901079 systemd-networkd[1811]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
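The extend-filesystems step above grows the root ext4 filesystem on nvme0n1p9 from 553472 to 1489915 blocks of 4 KiB, i.e. from roughly 2.1 GiB to roughly 5.7 GiB. The arithmetic, for reference:

BLOCK = 4096                         # "(4k) blocks" per the kernel message above
old_blocks, new_blocks = 553472, 1489915
for label, blocks in (("before", old_blocks), ("after", new_blocks)):
    print(label, round(blocks * BLOCK / 2**30, 2), "GiB")
# before ~2.11 GiB, after ~5.68 GiB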
Sep 12 22:07:15.957617 kernel: EXT4-fs (nvme0n1p9): resized filesystem to 1489915 Sep 12 22:07:15.986766 extend-filesystems[1918]: Filesystem at /dev/nvme0n1p9 is mounted on /; on-line resizing required Sep 12 22:07:15.986766 extend-filesystems[1918]: old_desc_blocks = 1, new_desc_blocks = 1 Sep 12 22:07:15.986766 extend-filesystems[1918]: The filesystem on /dev/nvme0n1p9 is now 1489915 (4k) blocks long. Sep 12 22:07:16.003657 extend-filesystems[1853]: Resized filesystem in /dev/nvme0n1p9 Sep 12 22:07:15.994058 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 12 22:07:15.994534 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 12 22:07:16.017341 systemd-networkd[1811]: eth0: Link UP Sep 12 22:07:16.021132 systemd-networkd[1811]: eth0: Gained carrier Sep 12 22:07:16.021188 systemd-networkd[1811]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 22:07:16.053817 (ntainerd)[1931]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 12 22:07:16.067127 systemd-networkd[1811]: eth0: DHCPv4 address 172.31.25.121/20, gateway 172.31.16.1 acquired from 172.31.16.1 Sep 12 22:07:16.067277 dbus-daemon[1850]: [system] Activating via systemd: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.3' (uid=244 pid=1811 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Sep 12 22:07:16.076047 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Sep 12 22:07:16.083967 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 12 22:07:16.271020 coreos-metadata[1921]: Sep 12 22:07:16.268 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Sep 12 22:07:16.272330 coreos-metadata[1921]: Sep 12 22:07:16.271 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys: Attempt #1 Sep 12 22:07:16.274012 coreos-metadata[1921]: Sep 12 22:07:16.273 INFO Fetch successful Sep 12 22:07:16.274012 coreos-metadata[1921]: Sep 12 22:07:16.273 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys/0/openssh-key: Attempt #1 Sep 12 22:07:16.279463 coreos-metadata[1921]: Sep 12 22:07:16.278 INFO Fetch successful Sep 12 22:07:16.285645 unknown[1921]: wrote ssh authorized keys file for user: core Sep 12 22:07:16.426807 update-ssh-keys[1943]: Updated "/home/core/.ssh/authorized_keys" Sep 12 22:07:16.432521 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 12 22:07:16.439355 systemd[1]: Finished sshkeys.service. Sep 12 22:07:16.515291 ntpd[1855]: ntpd 4.2.8p18@1.4062-o Fri Sep 12 20:09:57 UTC 2025 (1): Starting Sep 12 22:07:16.518437 ntpd[1855]: 12 Sep 22:07:16 ntpd[1855]: ntpd 4.2.8p18@1.4062-o Fri Sep 12 20:09:57 UTC 2025 (1): Starting Sep 12 22:07:16.518437 ntpd[1855]: 12 Sep 22:07:16 ntpd[1855]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Sep 12 22:07:16.518437 ntpd[1855]: 12 Sep 22:07:16 ntpd[1855]: ---------------------------------------------------- Sep 12 22:07:16.518437 ntpd[1855]: 12 Sep 22:07:16 ntpd[1855]: ntp-4 is maintained by Network Time Foundation, Sep 12 22:07:16.518437 ntpd[1855]: 12 Sep 22:07:16 ntpd[1855]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Sep 12 22:07:16.518437 ntpd[1855]: 12 Sep 22:07:16 ntpd[1855]: corporation. 
Support and training for ntp-4 are Sep 12 22:07:16.518437 ntpd[1855]: 12 Sep 22:07:16 ntpd[1855]: available at https://www.nwtime.org/support Sep 12 22:07:16.518437 ntpd[1855]: 12 Sep 22:07:16 ntpd[1855]: ---------------------------------------------------- Sep 12 22:07:16.515426 ntpd[1855]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Sep 12 22:07:16.515447 ntpd[1855]: ---------------------------------------------------- Sep 12 22:07:16.515466 ntpd[1855]: ntp-4 is maintained by Network Time Foundation, Sep 12 22:07:16.515482 ntpd[1855]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Sep 12 22:07:16.515499 ntpd[1855]: corporation. Support and training for ntp-4 are Sep 12 22:07:16.515515 ntpd[1855]: available at https://www.nwtime.org/support Sep 12 22:07:16.515530 ntpd[1855]: ---------------------------------------------------- Sep 12 22:07:16.526624 ntpd[1855]: proto: precision = 0.096 usec (-23) Sep 12 22:07:16.528070 ntpd[1855]: 12 Sep 22:07:16 ntpd[1855]: proto: precision = 0.096 usec (-23) Sep 12 22:07:16.533364 ntpd[1855]: basedate set to 2025-08-31 Sep 12 22:07:16.535104 ntpd[1855]: 12 Sep 22:07:16 ntpd[1855]: basedate set to 2025-08-31 Sep 12 22:07:16.535104 ntpd[1855]: 12 Sep 22:07:16 ntpd[1855]: gps base set to 2025-08-31 (week 2382) Sep 12 22:07:16.535104 ntpd[1855]: 12 Sep 22:07:16 ntpd[1855]: Listen and drop on 0 v6wildcard [::]:123 Sep 12 22:07:16.535104 ntpd[1855]: 12 Sep 22:07:16 ntpd[1855]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Sep 12 22:07:16.533414 ntpd[1855]: gps base set to 2025-08-31 (week 2382) Sep 12 22:07:16.533630 ntpd[1855]: Listen and drop on 0 v6wildcard [::]:123 Sep 12 22:07:16.533686 ntpd[1855]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Sep 12 22:07:16.535637 ntpd[1855]: Listen normally on 2 lo 127.0.0.1:123 Sep 12 22:07:16.536603 ntpd[1855]: 12 Sep 22:07:16 ntpd[1855]: Listen normally on 2 lo 127.0.0.1:123 Sep 12 22:07:16.536603 ntpd[1855]: 12 Sep 22:07:16 ntpd[1855]: Listen normally on 3 eth0 172.31.25.121:123 Sep 12 22:07:16.536603 ntpd[1855]: 12 Sep 22:07:16 ntpd[1855]: Listen normally on 4 lo [::1]:123 Sep 12 22:07:16.536603 ntpd[1855]: 12 Sep 22:07:16 ntpd[1855]: bind(21) AF_INET6 [fe80::4af:a7ff:fe36:872d%2]:123 flags 0x811 failed: Cannot assign requested address Sep 12 22:07:16.536603 ntpd[1855]: 12 Sep 22:07:16 ntpd[1855]: unable to create socket on eth0 (5) for [fe80::4af:a7ff:fe36:872d%2]:123 Sep 12 22:07:16.535706 ntpd[1855]: Listen normally on 3 eth0 172.31.25.121:123 Sep 12 22:07:16.535758 ntpd[1855]: Listen normally on 4 lo [::1]:123 Sep 12 22:07:16.535809 ntpd[1855]: bind(21) AF_INET6 [fe80::4af:a7ff:fe36:872d%2]:123 flags 0x811 failed: Cannot assign requested address Sep 12 22:07:16.535849 ntpd[1855]: unable to create socket on eth0 (5) for [fe80::4af:a7ff:fe36:872d%2]:123 Sep 12 22:07:16.552392 systemd-coredump[1967]: Process 1855 (ntpd) of user 0 terminated abnormally with signal 11/SEGV, processing... Sep 12 22:07:16.560546 systemd[1]: Created slice system-systemd\x2dcoredump.slice - Slice /system/systemd-coredump. Sep 12 22:07:16.570412 systemd[1]: Started systemd-coredump@0-1967-0.service - Process Core Dump (PID 1967/UID 0). Sep 12 22:07:16.658280 systemd-logind[1862]: New seat seat0. 
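The DHCPv4 lease reported just above (172.31.25.121/20, gateway 172.31.16.1, the same address ntpd then listens on at port 123) sits in the 172.31.16.0/20 subnet. A quick check with Python's ipaddress module:

import ipaddress

iface = ipaddress.ip_interface("172.31.25.121/20")             # lease from the log above
print(iface.network)                                           # 172.31.16.0/20
print(iface.network.netmask)                                   # 255.255.240.0
print(ipaddress.ip_address("172.31.16.1") in iface.network)    # True: gateway is in-subnet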
Sep 12 22:07:16.792694 coreos-metadata[1849]: Sep 12 22:07:16.789 INFO Putting http://169.254.169.254/latest/api/token: Attempt #2 Sep 12 22:07:16.793561 coreos-metadata[1849]: Sep 12 22:07:16.793 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-id: Attempt #1 Sep 12 22:07:16.796196 coreos-metadata[1849]: Sep 12 22:07:16.795 INFO Fetch successful Sep 12 22:07:16.796196 coreos-metadata[1849]: Sep 12 22:07:16.796 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-type: Attempt #1 Sep 12 22:07:16.801244 coreos-metadata[1849]: Sep 12 22:07:16.801 INFO Fetch successful Sep 12 22:07:16.801244 coreos-metadata[1849]: Sep 12 22:07:16.801 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/local-ipv4: Attempt #1 Sep 12 22:07:16.802974 coreos-metadata[1849]: Sep 12 22:07:16.802 INFO Fetch successful Sep 12 22:07:16.802974 coreos-metadata[1849]: Sep 12 22:07:16.802 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-ipv4: Attempt #1 Sep 12 22:07:16.805354 coreos-metadata[1849]: Sep 12 22:07:16.804 INFO Fetch successful Sep 12 22:07:16.805354 coreos-metadata[1849]: Sep 12 22:07:16.804 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/ipv6: Attempt #1 Sep 12 22:07:16.808334 coreos-metadata[1849]: Sep 12 22:07:16.808 INFO Fetch failed with 404: resource not found Sep 12 22:07:16.808334 coreos-metadata[1849]: Sep 12 22:07:16.808 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone: Attempt #1 Sep 12 22:07:16.809559 coreos-metadata[1849]: Sep 12 22:07:16.809 INFO Fetch successful Sep 12 22:07:16.809783 coreos-metadata[1849]: Sep 12 22:07:16.809 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone-id: Attempt #1 Sep 12 22:07:16.816974 coreos-metadata[1849]: Sep 12 22:07:16.815 INFO Fetch successful Sep 12 22:07:16.816974 coreos-metadata[1849]: Sep 12 22:07:16.815 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/hostname: Attempt #1 Sep 12 22:07:16.818382 coreos-metadata[1849]: Sep 12 22:07:16.818 INFO Fetch successful Sep 12 22:07:16.818382 coreos-metadata[1849]: Sep 12 22:07:16.818 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-hostname: Attempt #1 Sep 12 22:07:16.819690 coreos-metadata[1849]: Sep 12 22:07:16.819 INFO Fetch successful Sep 12 22:07:16.819690 coreos-metadata[1849]: Sep 12 22:07:16.819 INFO Fetching http://169.254.169.254/2021-01-03/dynamic/instance-identity/document: Attempt #1 Sep 12 22:07:16.824974 coreos-metadata[1849]: Sep 12 22:07:16.823 INFO Fetch successful Sep 12 22:07:16.860840 systemd[1]: Started systemd-logind.service - User Login Management. Sep 12 22:07:16.869536 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 12 22:07:16.892724 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:07:17.021285 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 12 22:07:17.024098 locksmithd[1917]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 12 22:07:17.024766 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. 
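Both Ignition and coreos-metadata above use IMDSv2: a PUT to http://169.254.169.254/latest/api/token obtains a session token, which is then presented on the metadata GETs (the 2021-01-03 paths in the log). A minimal Python sketch of that flow using only the standard library; the hostname path and 6-hour token TTL are illustrative choices:

import urllib.request

IMDS = "http://169.254.169.254"

# Step 1: PUT for a session token (the "PUT ... /latest/api/token" lines above).
tok_req = urllib.request.Request(
    IMDS + "/latest/api/token", method="PUT",
    headers={"X-aws-ec2-metadata-token-ttl-seconds": "21600"})
token = urllib.request.urlopen(tok_req, timeout=2).read().decode()

# Step 2: GET metadata with the token attached, as coreos-metadata does above.
md_req = urllib.request.Request(
    IMDS + "/2021-01-03/meta-data/hostname",
    headers={"X-aws-ec2-metadata-token": token})
print(urllib.request.urlopen(md_req, timeout=2).read().decode())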
Sep 12 22:07:17.059584 containerd[1931]: time="2025-09-12T22:07:17Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 12 22:07:17.100002 containerd[1931]: time="2025-09-12T22:07:17.098864698Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 12 22:07:17.135215 systemd-networkd[1811]: eth0: Gained IPv6LL Sep 12 22:07:17.145313 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 12 22:07:17.149906 systemd[1]: Reached target network-online.target - Network is Online. Sep 12 22:07:17.159070 systemd[1]: Started amazon-ssm-agent.service - amazon-ssm-agent. Sep 12 22:07:17.167259 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:07:17.173084 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 12 22:07:17.236542 containerd[1931]: time="2025-09-12T22:07:17.236114303Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="15.156µs" Sep 12 22:07:17.241052 containerd[1931]: time="2025-09-12T22:07:17.239998727Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 12 22:07:17.241052 containerd[1931]: time="2025-09-12T22:07:17.240072755Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 12 22:07:17.241052 containerd[1931]: time="2025-09-12T22:07:17.240400655Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 12 22:07:17.241052 containerd[1931]: time="2025-09-12T22:07:17.240451967Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 12 22:07:17.241052 containerd[1931]: time="2025-09-12T22:07:17.240510983Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 22:07:17.241052 containerd[1931]: time="2025-09-12T22:07:17.240640271Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 22:07:17.241052 containerd[1931]: time="2025-09-12T22:07:17.240671051Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 22:07:17.252487 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. Sep 12 22:07:17.298838 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... 
Sep 12 22:07:17.308038 containerd[1931]: time="2025-09-12T22:07:17.306055055Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 22:07:17.308038 containerd[1931]: time="2025-09-12T22:07:17.306122759Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 22:07:17.308038 containerd[1931]: time="2025-09-12T22:07:17.306164231Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 22:07:17.308038 containerd[1931]: time="2025-09-12T22:07:17.306190343Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 12 22:07:17.308038 containerd[1931]: time="2025-09-12T22:07:17.306433187Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 12 22:07:17.308038 containerd[1931]: time="2025-09-12T22:07:17.306862547Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 22:07:17.308038 containerd[1931]: time="2025-09-12T22:07:17.306937415Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 22:07:17.308038 containerd[1931]: time="2025-09-12T22:07:17.306999647Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 12 22:07:17.308038 containerd[1931]: time="2025-09-12T22:07:17.307080131Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 12 22:07:17.308038 containerd[1931]: time="2025-09-12T22:07:17.307668095Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 12 22:07:17.308038 containerd[1931]: time="2025-09-12T22:07:17.307820819Z" level=info msg="metadata content store policy set" policy=shared Sep 12 22:07:17.324546 containerd[1931]: time="2025-09-12T22:07:17.322627223Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 12 22:07:17.324546 containerd[1931]: time="2025-09-12T22:07:17.322844639Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 12 22:07:17.324546 containerd[1931]: time="2025-09-12T22:07:17.322880927Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 12 22:07:17.324546 containerd[1931]: time="2025-09-12T22:07:17.322909259Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 12 22:07:17.324546 containerd[1931]: time="2025-09-12T22:07:17.322959995Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 12 22:07:17.324546 containerd[1931]: time="2025-09-12T22:07:17.322991555Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 12 22:07:17.324546 containerd[1931]: time="2025-09-12T22:07:17.323025899Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 12 
22:07:17.324546 containerd[1931]: time="2025-09-12T22:07:17.323056079Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 12 22:07:17.324546 containerd[1931]: time="2025-09-12T22:07:17.323085395Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 12 22:07:17.324546 containerd[1931]: time="2025-09-12T22:07:17.323116643Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 12 22:07:17.324546 containerd[1931]: time="2025-09-12T22:07:17.323143499Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 12 22:07:17.324546 containerd[1931]: time="2025-09-12T22:07:17.323176343Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 12 22:07:17.324546 containerd[1931]: time="2025-09-12T22:07:17.323433947Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 12 22:07:17.324546 containerd[1931]: time="2025-09-12T22:07:17.323476499Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 12 22:07:17.325235 containerd[1931]: time="2025-09-12T22:07:17.323511767Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 12 22:07:17.325235 containerd[1931]: time="2025-09-12T22:07:17.323539655Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 12 22:07:17.325235 containerd[1931]: time="2025-09-12T22:07:17.323566895Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 12 22:07:17.325235 containerd[1931]: time="2025-09-12T22:07:17.323595047Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 12 22:07:17.325235 containerd[1931]: time="2025-09-12T22:07:17.323626679Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 12 22:07:17.325235 containerd[1931]: time="2025-09-12T22:07:17.323655047Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 12 22:07:17.325235 containerd[1931]: time="2025-09-12T22:07:17.323685635Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 12 22:07:17.325235 containerd[1931]: time="2025-09-12T22:07:17.323712827Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 12 22:07:17.325235 containerd[1931]: time="2025-09-12T22:07:17.323739875Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 12 22:07:17.332569 containerd[1931]: time="2025-09-12T22:07:17.331471271Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 12 22:07:17.332569 containerd[1931]: time="2025-09-12T22:07:17.331553963Z" level=info msg="Start snapshots syncer" Sep 12 22:07:17.332569 containerd[1931]: time="2025-09-12T22:07:17.331644539Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 12 22:07:17.346822 containerd[1931]: time="2025-09-12T22:07:17.342774323Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 12 22:07:17.346822 containerd[1931]: time="2025-09-12T22:07:17.342891659Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 12 22:07:17.347181 containerd[1931]: time="2025-09-12T22:07:17.343272515Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 12 22:07:17.347181 containerd[1931]: time="2025-09-12T22:07:17.346088039Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 12 22:07:17.347181 containerd[1931]: time="2025-09-12T22:07:17.346163063Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 12 22:07:17.351003 containerd[1931]: time="2025-09-12T22:07:17.349023479Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 12 22:07:17.351003 containerd[1931]: time="2025-09-12T22:07:17.349098011Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 12 22:07:17.351003 containerd[1931]: time="2025-09-12T22:07:17.349146707Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 12 22:07:17.351003 containerd[1931]: time="2025-09-12T22:07:17.349178171Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 12 22:07:17.351003 containerd[1931]: time="2025-09-12T22:07:17.349208543Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 12 22:07:17.351003 containerd[1931]: time="2025-09-12T22:07:17.349305983Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 12 22:07:17.351003 containerd[1931]: 
time="2025-09-12T22:07:17.349342535Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 12 22:07:17.351003 containerd[1931]: time="2025-09-12T22:07:17.349388351Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 12 22:07:17.361132 containerd[1931]: time="2025-09-12T22:07:17.355580555Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 22:07:17.361132 containerd[1931]: time="2025-09-12T22:07:17.355662515Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 22:07:17.361132 containerd[1931]: time="2025-09-12T22:07:17.355698263Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 22:07:17.361132 containerd[1931]: time="2025-09-12T22:07:17.355727879Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 22:07:17.361132 containerd[1931]: time="2025-09-12T22:07:17.355752839Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 12 22:07:17.361132 containerd[1931]: time="2025-09-12T22:07:17.355780079Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 12 22:07:17.361132 containerd[1931]: time="2025-09-12T22:07:17.355809071Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 12 22:07:17.361132 containerd[1931]: time="2025-09-12T22:07:17.356014163Z" level=info msg="runtime interface created" Sep 12 22:07:17.361132 containerd[1931]: time="2025-09-12T22:07:17.356035991Z" level=info msg="created NRI interface" Sep 12 22:07:17.361132 containerd[1931]: time="2025-09-12T22:07:17.356060123Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 12 22:07:17.361132 containerd[1931]: time="2025-09-12T22:07:17.356110091Z" level=info msg="Connect containerd service" Sep 12 22:07:17.361132 containerd[1931]: time="2025-09-12T22:07:17.356460731Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 12 22:07:17.371980 containerd[1931]: time="2025-09-12T22:07:17.367672139Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 22:07:17.429039 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:07:17.463105 systemd-logind[1862]: Watching system buttons on /dev/input/event1 (Sleep Button) Sep 12 22:07:17.473130 systemd-logind[1862]: Watching system buttons on /dev/input/event0 (Power Button) Sep 12 22:07:17.491967 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 12 22:07:17.553897 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 12 22:07:17.578703 amazon-ssm-agent[2044]: Initializing new seelog logger Sep 12 22:07:17.579600 amazon-ssm-agent[2044]: New Seelog Logger Creation Complete Sep 12 22:07:17.579836 amazon-ssm-agent[2044]: 2025/09/12 22:07:17 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. 
Sep 12 22:07:17.579931 amazon-ssm-agent[2044]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 12 22:07:17.581069 amazon-ssm-agent[2044]: 2025/09/12 22:07:17 processing appconfig overrides Sep 12 22:07:17.581818 amazon-ssm-agent[2044]: 2025/09/12 22:07:17 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 12 22:07:17.582021 amazon-ssm-agent[2044]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 12 22:07:17.582309 amazon-ssm-agent[2044]: 2025/09/12 22:07:17 processing appconfig overrides Sep 12 22:07:17.582738 amazon-ssm-agent[2044]: 2025/09/12 22:07:17 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 12 22:07:17.583522 amazon-ssm-agent[2044]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 12 22:07:17.586295 amazon-ssm-agent[2044]: 2025/09/12 22:07:17 processing appconfig overrides Sep 12 22:07:17.586295 amazon-ssm-agent[2044]: 2025-09-12 22:07:17.5815 INFO Proxy environment variables: Sep 12 22:07:17.589670 amazon-ssm-agent[2044]: 2025/09/12 22:07:17 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 12 22:07:17.592043 amazon-ssm-agent[2044]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 12 22:07:17.592043 amazon-ssm-agent[2044]: 2025/09/12 22:07:17 processing appconfig overrides Sep 12 22:07:17.689997 amazon-ssm-agent[2044]: 2025-09-12 22:07:17.5817 INFO http_proxy: Sep 12 22:07:17.793811 amazon-ssm-agent[2044]: 2025-09-12 22:07:17.5817 INFO no_proxy: Sep 12 22:07:17.896012 amazon-ssm-agent[2044]: 2025-09-12 22:07:17.5817 INFO https_proxy: Sep 12 22:07:17.929153 containerd[1931]: time="2025-09-12T22:07:17.924096962Z" level=info msg="Start subscribing containerd event" Sep 12 22:07:17.929153 containerd[1931]: time="2025-09-12T22:07:17.928504982Z" level=info msg="Start recovering state" Sep 12 22:07:17.929153 containerd[1931]: time="2025-09-12T22:07:17.928665146Z" level=info msg="Start event monitor" Sep 12 22:07:17.929153 containerd[1931]: time="2025-09-12T22:07:17.928691198Z" level=info msg="Start cni network conf syncer for default" Sep 12 22:07:17.929153 containerd[1931]: time="2025-09-12T22:07:17.928712570Z" level=info msg="Start streaming server" Sep 12 22:07:17.929153 containerd[1931]: time="2025-09-12T22:07:17.928732334Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 12 22:07:17.929153 containerd[1931]: time="2025-09-12T22:07:17.928748918Z" level=info msg="runtime interface starting up..." Sep 12 22:07:17.929153 containerd[1931]: time="2025-09-12T22:07:17.928763330Z" level=info msg="starting plugins..." Sep 12 22:07:17.929153 containerd[1931]: time="2025-09-12T22:07:17.928795550Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 12 22:07:17.936537 containerd[1931]: time="2025-09-12T22:07:17.935831498Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 12 22:07:17.936537 containerd[1931]: time="2025-09-12T22:07:17.935984918Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 12 22:07:17.946968 systemd[1]: Started containerd.service - containerd container runtime. 
Sep 12 22:07:17.955989 containerd[1931]: time="2025-09-12T22:07:17.954840122Z" level=info msg="containerd successfully booted in 0.896553s" Sep 12 22:07:18.003933 amazon-ssm-agent[2044]: 2025-09-12 22:07:17.5824 INFO Checking if agent identity type OnPrem can be assumed Sep 12 22:07:18.106978 amazon-ssm-agent[2044]: 2025-09-12 22:07:17.5825 INFO Checking if agent identity type EC2 can be assumed Sep 12 22:07:18.204005 amazon-ssm-agent[2044]: 2025-09-12 22:07:17.9378 INFO Agent will take identity from EC2 Sep 12 22:07:18.303487 amazon-ssm-agent[2044]: 2025-09-12 22:07:17.9451 INFO [amazon-ssm-agent] amazon-ssm-agent - v3.3.0.0 Sep 12 22:07:18.331989 tar[1866]: linux-arm64/LICENSE Sep 12 22:07:18.331989 tar[1866]: linux-arm64/README.md Sep 12 22:07:18.406986 amazon-ssm-agent[2044]: 2025-09-12 22:07:17.9451 INFO [amazon-ssm-agent] OS: linux, Arch: arm64 Sep 12 22:07:18.422238 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 12 22:07:18.446588 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Sep 12 22:07:18.455219 dbus-daemon[1850]: [system] Successfully activated service 'org.freedesktop.hostname1' Sep 12 22:07:18.459333 dbus-daemon[1850]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.7' (uid=0 pid=1940 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Sep 12 22:07:18.469393 systemd[1]: Starting polkit.service - Authorization Manager... Sep 12 22:07:18.504249 amazon-ssm-agent[2044]: 2025-09-12 22:07:17.9451 INFO [amazon-ssm-agent] Starting Core Agent Sep 12 22:07:18.604562 amazon-ssm-agent[2044]: 2025-09-12 22:07:17.9451 INFO [amazon-ssm-agent] Registrar detected. Attempting registration Sep 12 22:07:18.677689 systemd-coredump[1972]: Process 1855 (ntpd) of user 0 dumped core. Module libnss_usrfiles.so.2 without build-id. Module libgcc_s.so.1 without build-id. Module libc.so.6 without build-id. Module libcrypto.so.3 without build-id. Module libm.so.6 without build-id. Module libcap.so.2 without build-id. Module ntpd without build-id. Stack trace of thread 1855: #0 0x0000aaaadf3a0b5c n/a (ntpd + 0x60b5c) #1 0x0000aaaadf34fe60 n/a (ntpd + 0xfe60) #2 0x0000aaaadf350240 n/a (ntpd + 0x10240) #3 0x0000aaaadf34be14 n/a (ntpd + 0xbe14) #4 0x0000aaaadf34d3ec n/a (ntpd + 0xd3ec) #5 0x0000aaaadf355a38 n/a (ntpd + 0x15a38) #6 0x0000aaaadf34738c n/a (ntpd + 0x738c) #7 0x0000ffffb3ed2034 n/a (libc.so.6 + 0x22034) #8 0x0000ffffb3ed2118 __libc_start_main (libc.so.6 + 0x22118) #9 0x0000aaaadf3473f0 n/a (ntpd + 0x73f0) ELF object binary architecture: AARCH64 Sep 12 22:07:18.685030 systemd[1]: ntpd.service: Main process exited, code=dumped, status=11/SEGV Sep 12 22:07:18.685344 systemd[1]: ntpd.service: Failed with result 'core-dump'. Sep 12 22:07:18.695026 systemd[1]: systemd-coredump@0-1967-0.service: Deactivated successfully. 
Sep 12 22:07:18.704362 amazon-ssm-agent[2044]: 2025-09-12 22:07:17.9451 INFO [Registrar] Starting registrar module Sep 12 22:07:18.785660 polkitd[2113]: Started polkitd version 126 Sep 12 22:07:18.804670 amazon-ssm-agent[2044]: 2025-09-12 22:07:17.9548 INFO [EC2Identity] Checking disk for registration info Sep 12 22:07:18.832412 polkitd[2113]: Loading rules from directory /etc/polkit-1/rules.d Sep 12 22:07:18.836484 polkitd[2113]: Loading rules from directory /run/polkit-1/rules.d Sep 12 22:07:18.836599 polkitd[2113]: Error opening rules directory: Error opening directory “/run/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Sep 12 22:07:18.837326 polkitd[2113]: Loading rules from directory /usr/local/share/polkit-1/rules.d Sep 12 22:07:18.837410 polkitd[2113]: Error opening rules directory: Error opening directory “/usr/local/share/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Sep 12 22:07:18.837499 polkitd[2113]: Loading rules from directory /usr/share/polkit-1/rules.d Sep 12 22:07:18.844083 polkitd[2113]: Finished loading, compiling and executing 2 rules Sep 12 22:07:18.845531 systemd[1]: Started polkit.service - Authorization Manager. Sep 12 22:07:18.849459 systemd[1]: ntpd.service: Scheduled restart job, restart counter is at 1. Sep 12 22:07:18.854547 systemd[1]: Started ntpd.service - Network Time Service. Sep 12 22:07:18.864296 dbus-daemon[1850]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Sep 12 22:07:18.870094 polkitd[2113]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Sep 12 22:07:18.905643 amazon-ssm-agent[2044]: 2025-09-12 22:07:17.9549 INFO [EC2Identity] No registration info found for ec2 instance, attempting registration Sep 12 22:07:18.937928 ntpd[2130]: ntpd 4.2.8p18@1.4062-o Fri Sep 12 20:09:57 UTC 2025 (1): Starting Sep 12 22:07:18.939438 systemd-hostnamed[1940]: Hostname set to (transient) Sep 12 22:07:18.939890 systemd-resolved[1761]: System hostname changed to 'ip-172-31-25-121'. Sep 12 22:07:18.940110 ntpd[2130]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp 
Sep 12 22:07:18.940130 ntpd[2130]: ---------------------------------------------------- Sep 12 22:07:18.940149 ntpd[2130]: ntp-4 is maintained by Network Time Foundation, Sep 12 22:07:18.940166 ntpd[2130]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Sep 12 22:07:18.940183 ntpd[2130]: corporation. Support and training for ntp-4 are Sep 12 22:07:18.940200 ntpd[2130]: available at https://www.nwtime.org/support Sep 12 22:07:18.940217 ntpd[2130]: ---------------------------------------------------- Sep 12 22:07:18.941338 ntpd[2130]: proto: precision = 0.096 usec (-23) Sep 12 22:07:18.941652 ntpd[2130]: basedate set to 2025-08-31 Sep 12 22:07:18.941673 ntpd[2130]: gps base set to 2025-08-31 (week 2382) Sep 12 22:07:18.941798 ntpd[2130]: Listen and drop on 0 v6wildcard [::]:123 Sep 12 22:07:18.941842 ntpd[2130]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Sep 12 22:07:18.944207 ntpd[2130]: Listen normally on 2 lo 127.0.0.1:123 Sep 12 22:07:18.944264 ntpd[2130]: Listen normally on 3 eth0 172.31.25.121:123 Sep 12 22:07:18.944316 ntpd[2130]: Listen normally on 4 lo [::1]:123 Sep 12 22:07:18.944364 ntpd[2130]: Listen normally on 5 eth0 [fe80::4af:a7ff:fe36:872d%2]:123 Sep 12 22:07:18.944411 ntpd[2130]: Listening on routing socket on fd #22 for interface updates Sep 12 22:07:18.959682 ntpd[2130]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 12 22:07:18.959749 ntpd[2130]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 12 22:07:19.006432 amazon-ssm-agent[2044]: 2025-09-12 22:07:17.9549 INFO [EC2Identity] Generating registration keypair Sep 12 22:07:19.421132 sshd_keygen[1875]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 12 22:07:19.494563 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 12 22:07:19.501592 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 12 22:07:19.508744 systemd[1]: Started sshd@0-172.31.25.121:22-139.178.89.65:44788.service - OpenSSH per-connection server daemon (139.178.89.65:44788). 
Sep 12 22:07:19.545486 systemd[1]: issuegen.service: Deactivated successfully. Sep 12 22:07:19.547370 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 12 22:07:19.557502 amazon-ssm-agent[2044]: 2025-09-12 22:07:19.5462 INFO [EC2Identity] Checking write access before registering Sep 12 22:07:19.558200 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 12 22:07:19.604110 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 12 22:07:19.614319 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 12 22:07:19.624729 amazon-ssm-agent[2044]: 2025/09/12 22:07:19 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 12 22:07:19.624729 amazon-ssm-agent[2044]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 12 22:07:19.626233 amazon-ssm-agent[2044]: 2025/09/12 22:07:19 processing appconfig overrides Sep 12 22:07:19.627063 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 12 22:07:19.630199 systemd[1]: Reached target getty.target - Login Prompts. Sep 12 22:07:19.652983 amazon-ssm-agent[2044]: 2025-09-12 22:07:19.5501 INFO [EC2Identity] Registering EC2 instance with Systems Manager Sep 12 22:07:19.674828 amazon-ssm-agent[2044]: 2025-09-12 22:07:19.6222 INFO [EC2Identity] EC2 registration was successful. Sep 12 22:07:19.674828 amazon-ssm-agent[2044]: 2025-09-12 22:07:19.6223 INFO [amazon-ssm-agent] Registration attempted. Resuming core agent startup. Sep 12 22:07:19.674828 amazon-ssm-agent[2044]: 2025-09-12 22:07:19.6226 INFO [CredentialRefresher] Starting credentials refresher loop Sep 12 22:07:19.675052 amazon-ssm-agent[2044]: 2025-09-12 22:07:19.6315 INFO [CredentialRefresher] credentialRefresher has started Sep 12 22:07:19.675052 amazon-ssm-agent[2044]: 2025-09-12 22:07:19.6728 INFO EC2RoleProvider Successfully connected with instance profile role credentials Sep 12 22:07:19.675052 amazon-ssm-agent[2044]: 2025-09-12 22:07:19.6733 INFO [CredentialRefresher] Credentials ready Sep 12 22:07:19.753621 amazon-ssm-agent[2044]: 2025-09-12 22:07:19.6749 INFO [CredentialRefresher] Next credential rotation will be in 29.9999659768 minutes Sep 12 22:07:19.779177 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:07:19.784045 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 12 22:07:19.786799 systemd[1]: Startup finished in 3.691s (kernel) + 9.534s (initrd) + 9.621s (userspace) = 22.847s. Sep 12 22:07:19.797654 (kubelet)[2159]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 22:07:19.857737 sshd[2143]: Accepted publickey for core from 139.178.89.65 port 44788 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:07:19.862736 sshd-session[2143]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:07:19.879460 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 12 22:07:19.881769 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 12 22:07:19.911362 systemd-logind[1862]: New session 1 of user core. Sep 12 22:07:19.927064 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 12 22:07:19.933871 systemd[1]: Starting user@500.service - User Manager for UID 500... 
Sep 12 22:07:19.957910 (systemd)[2166]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 12 22:07:19.965007 systemd-logind[1862]: New session c1 of user core. Sep 12 22:07:20.270388 systemd[2166]: Queued start job for default target default.target. Sep 12 22:07:20.289847 systemd[2166]: Created slice app.slice - User Application Slice. Sep 12 22:07:20.289913 systemd[2166]: Reached target paths.target - Paths. Sep 12 22:07:20.290037 systemd[2166]: Reached target timers.target - Timers. Sep 12 22:07:20.295136 systemd[2166]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 12 22:07:20.323014 systemd[2166]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 12 22:07:20.323478 systemd[2166]: Reached target sockets.target - Sockets. Sep 12 22:07:20.323715 systemd[2166]: Reached target basic.target - Basic System. Sep 12 22:07:20.323918 systemd[2166]: Reached target default.target - Main User Target. Sep 12 22:07:20.324137 systemd[2166]: Startup finished in 342ms. Sep 12 22:07:20.324541 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 12 22:07:20.339281 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 12 22:07:20.506150 systemd[1]: Started sshd@1-172.31.25.121:22-139.178.89.65:37728.service - OpenSSH per-connection server daemon (139.178.89.65:37728). Sep 12 22:07:20.705872 amazon-ssm-agent[2044]: 2025-09-12 22:07:20.7040 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker is not running, starting worker process Sep 12 22:07:20.741757 sshd[2182]: Accepted publickey for core from 139.178.89.65 port 37728 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:07:20.744987 sshd-session[2182]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:07:20.758057 systemd-logind[1862]: New session 2 of user core. Sep 12 22:07:20.764398 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 12 22:07:20.806556 amazon-ssm-agent[2044]: 2025-09-12 22:07:20.7106 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker (pid:2187) started Sep 12 22:07:20.867648 kubelet[2159]: E0912 22:07:20.867461 2159 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 22:07:20.873237 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 22:07:20.873566 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 22:07:20.876335 systemd[1]: kubelet.service: Consumed 1.481s CPU time, 255.1M memory peak. Sep 12 22:07:20.907086 amazon-ssm-agent[2044]: 2025-09-12 22:07:20.7107 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] Monitor long running worker health every 60 seconds Sep 12 22:07:20.916653 sshd[2191]: Connection closed by 139.178.89.65 port 37728 Sep 12 22:07:20.918279 sshd-session[2182]: pam_unix(sshd:session): session closed for user core Sep 12 22:07:20.931026 systemd[1]: sshd@1-172.31.25.121:22-139.178.89.65:37728.service: Deactivated successfully. Sep 12 22:07:20.938922 systemd[1]: session-2.scope: Deactivated successfully. Sep 12 22:07:20.946099 systemd-logind[1862]: Session 2 logged out. Waiting for processes to exit. 
Sep 12 22:07:20.969573 systemd[1]: Started sshd@2-172.31.25.121:22-139.178.89.65:37734.service - OpenSSH per-connection server daemon (139.178.89.65:37734). Sep 12 22:07:20.975280 systemd-logind[1862]: Removed session 2. Sep 12 22:07:21.195552 sshd[2200]: Accepted publickey for core from 139.178.89.65 port 37734 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:07:21.198220 sshd-session[2200]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:07:21.209057 systemd-logind[1862]: New session 3 of user core. Sep 12 22:07:21.216298 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 12 22:07:21.339317 sshd[2208]: Connection closed by 139.178.89.65 port 37734 Sep 12 22:07:21.339083 sshd-session[2200]: pam_unix(sshd:session): session closed for user core Sep 12 22:07:21.347556 systemd[1]: sshd@2-172.31.25.121:22-139.178.89.65:37734.service: Deactivated successfully. Sep 12 22:07:21.351546 systemd[1]: session-3.scope: Deactivated successfully. Sep 12 22:07:21.353613 systemd-logind[1862]: Session 3 logged out. Waiting for processes to exit. Sep 12 22:07:21.357540 systemd-logind[1862]: Removed session 3. Sep 12 22:07:21.376433 systemd[1]: Started sshd@3-172.31.25.121:22-139.178.89.65:37748.service - OpenSSH per-connection server daemon (139.178.89.65:37748). Sep 12 22:07:21.576687 sshd[2214]: Accepted publickey for core from 139.178.89.65 port 37748 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:07:21.579320 sshd-session[2214]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:07:21.589351 systemd-logind[1862]: New session 4 of user core. Sep 12 22:07:21.599302 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 12 22:07:21.724989 sshd[2217]: Connection closed by 139.178.89.65 port 37748 Sep 12 22:07:21.725276 sshd-session[2214]: pam_unix(sshd:session): session closed for user core Sep 12 22:07:21.731922 systemd[1]: sshd@3-172.31.25.121:22-139.178.89.65:37748.service: Deactivated successfully. Sep 12 22:07:21.735542 systemd[1]: session-4.scope: Deactivated successfully. Sep 12 22:07:21.741333 systemd-logind[1862]: Session 4 logged out. Waiting for processes to exit. Sep 12 22:07:21.744082 systemd-logind[1862]: Removed session 4. Sep 12 22:07:21.762336 systemd[1]: Started sshd@4-172.31.25.121:22-139.178.89.65:37758.service - OpenSSH per-connection server daemon (139.178.89.65:37758). Sep 12 22:07:21.970988 sshd[2223]: Accepted publickey for core from 139.178.89.65 port 37758 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:07:21.973751 sshd-session[2223]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:07:21.984058 systemd-logind[1862]: New session 5 of user core. Sep 12 22:07:21.991306 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 12 22:07:22.195137 sudo[2227]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 12 22:07:22.196384 sudo[2227]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 22:07:22.213119 sudo[2227]: pam_unix(sudo:session): session closed for user root Sep 12 22:07:22.237300 sshd[2226]: Connection closed by 139.178.89.65 port 37758 Sep 12 22:07:22.238417 sshd-session[2223]: pam_unix(sshd:session): session closed for user core Sep 12 22:07:22.247215 systemd[1]: sshd@4-172.31.25.121:22-139.178.89.65:37758.service: Deactivated successfully. 
Sep 12 22:07:22.250526 systemd[1]: session-5.scope: Deactivated successfully. Sep 12 22:07:22.252667 systemd-logind[1862]: Session 5 logged out. Waiting for processes to exit. Sep 12 22:07:22.255824 systemd-logind[1862]: Removed session 5. Sep 12 22:07:22.276272 systemd[1]: Started sshd@5-172.31.25.121:22-139.178.89.65:37764.service - OpenSSH per-connection server daemon (139.178.89.65:37764). Sep 12 22:07:22.482196 sshd[2233]: Accepted publickey for core from 139.178.89.65 port 37764 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:07:22.484794 sshd-session[2233]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:07:22.494026 systemd-logind[1862]: New session 6 of user core. Sep 12 22:07:22.514738 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 12 22:07:22.623291 sudo[2238]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 12 22:07:22.624651 sudo[2238]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 22:07:22.637806 sudo[2238]: pam_unix(sudo:session): session closed for user root Sep 12 22:07:22.649636 sudo[2237]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 12 22:07:22.650434 sudo[2237]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 22:07:22.669120 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 22:07:22.754935 augenrules[2260]: No rules Sep 12 22:07:22.757446 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 22:07:22.758258 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 22:07:22.761540 sudo[2237]: pam_unix(sudo:session): session closed for user root Sep 12 22:07:22.786126 sshd[2236]: Connection closed by 139.178.89.65 port 37764 Sep 12 22:07:22.787010 sshd-session[2233]: pam_unix(sshd:session): session closed for user core Sep 12 22:07:22.794267 systemd-logind[1862]: Session 6 logged out. Waiting for processes to exit. Sep 12 22:07:22.794794 systemd[1]: sshd@5-172.31.25.121:22-139.178.89.65:37764.service: Deactivated successfully. Sep 12 22:07:22.799655 systemd[1]: session-6.scope: Deactivated successfully. Sep 12 22:07:22.805843 systemd-logind[1862]: Removed session 6. Sep 12 22:07:22.823882 systemd[1]: Started sshd@6-172.31.25.121:22-139.178.89.65:37778.service - OpenSSH per-connection server daemon (139.178.89.65:37778). Sep 12 22:07:23.023272 sshd[2269]: Accepted publickey for core from 139.178.89.65 port 37778 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:07:23.025840 sshd-session[2269]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:07:23.036059 systemd-logind[1862]: New session 7 of user core. Sep 12 22:07:23.045287 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 12 22:07:23.152648 sudo[2273]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 12 22:07:23.154179 sudo[2273]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 22:07:23.976526 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Sep 12 22:07:23.993631 (dockerd)[2290]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 12 22:07:24.545439 dockerd[2290]: time="2025-09-12T22:07:24.545323195Z" level=info msg="Starting up" Sep 12 22:07:24.548499 dockerd[2290]: time="2025-09-12T22:07:24.548421535Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 12 22:07:24.574118 dockerd[2290]: time="2025-09-12T22:07:24.574022467Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 12 22:07:24.687311 systemd[1]: var-lib-docker-metacopy\x2dcheck2993917199-merged.mount: Deactivated successfully. Sep 12 22:07:24.695454 dockerd[2290]: time="2025-09-12T22:07:24.694999628Z" level=info msg="Loading containers: start." Sep 12 22:07:24.719027 kernel: Initializing XFRM netlink socket Sep 12 22:07:25.171685 (udev-worker)[2313]: Network interface NamePolicy= disabled on kernel command line. Sep 12 22:07:25.262141 systemd-networkd[1811]: docker0: Link UP Sep 12 22:07:25.268172 dockerd[2290]: time="2025-09-12T22:07:25.268083475Z" level=info msg="Loading containers: done." Sep 12 22:07:25.300063 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck26983070-merged.mount: Deactivated successfully. Sep 12 22:07:25.304749 dockerd[2290]: time="2025-09-12T22:07:25.304682803Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 12 22:07:25.305090 dockerd[2290]: time="2025-09-12T22:07:25.305040895Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 12 22:07:25.305455 dockerd[2290]: time="2025-09-12T22:07:25.305410531Z" level=info msg="Initializing buildkit" Sep 12 22:07:25.363499 dockerd[2290]: time="2025-09-12T22:07:25.363411667Z" level=info msg="Completed buildkit initialization" Sep 12 22:07:25.380288 dockerd[2290]: time="2025-09-12T22:07:25.380198791Z" level=info msg="Daemon has completed initialization" Sep 12 22:07:25.380700 dockerd[2290]: time="2025-09-12T22:07:25.380501275Z" level=info msg="API listen on /run/docker.sock" Sep 12 22:07:25.382273 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 12 22:07:25.494969 systemd-resolved[1761]: Clock change detected. Flushing caches. Sep 12 22:07:26.067092 containerd[1931]: time="2025-09-12T22:07:26.066527886Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\"" Sep 12 22:07:26.699688 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2616666301.mount: Deactivated successfully. 
Sep 12 22:07:28.069830 containerd[1931]: time="2025-09-12T22:07:28.069748232Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:07:28.071764 containerd[1931]: time="2025-09-12T22:07:28.071691572Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.13: active requests=0, bytes read=25687325" Sep 12 22:07:28.074182 containerd[1931]: time="2025-09-12T22:07:28.074078144Z" level=info msg="ImageCreate event name:\"sha256:0b1c07d8fd4a3526d5c44502e682df3627a3b01c1e608e5e24c3519c8fb337b6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:07:28.080743 containerd[1931]: time="2025-09-12T22:07:28.079851428Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:07:28.081880 containerd[1931]: time="2025-09-12T22:07:28.081819860Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.13\" with image id \"sha256:0b1c07d8fd4a3526d5c44502e682df3627a3b01c1e608e5e24c3519c8fb337b6\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.13\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\", size \"25683924\" in 2.015228002s" Sep 12 22:07:28.081991 containerd[1931]: time="2025-09-12T22:07:28.081883964Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\" returns image reference \"sha256:0b1c07d8fd4a3526d5c44502e682df3627a3b01c1e608e5e24c3519c8fb337b6\"" Sep 12 22:07:28.084768 containerd[1931]: time="2025-09-12T22:07:28.084700100Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\"" Sep 12 22:07:29.769211 containerd[1931]: time="2025-09-12T22:07:29.769104661Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:07:29.771593 containerd[1931]: time="2025-09-12T22:07:29.771147229Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.13: active requests=0, bytes read=22459767" Sep 12 22:07:29.773620 containerd[1931]: time="2025-09-12T22:07:29.773563297Z" level=info msg="ImageCreate event name:\"sha256:c359cb88f3d2147f2cb4c5ada4fbdeadc4b1c009d66c8f33f3856efaf04ee6ef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:07:29.782572 containerd[1931]: time="2025-09-12T22:07:29.782454901Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:07:29.784463 containerd[1931]: time="2025-09-12T22:07:29.784390765Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.13\" with image id \"sha256:c359cb88f3d2147f2cb4c5ada4fbdeadc4b1c009d66c8f33f3856efaf04ee6ef\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.13\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\", size \"24028542\" in 1.699626297s" Sep 12 22:07:29.784463 containerd[1931]: time="2025-09-12T22:07:29.784457257Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\" returns image reference \"sha256:c359cb88f3d2147f2cb4c5ada4fbdeadc4b1c009d66c8f33f3856efaf04ee6ef\"" Sep 12 
22:07:29.785799 containerd[1931]: time="2025-09-12T22:07:29.785098177Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\"" Sep 12 22:07:30.474032 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 12 22:07:30.476924 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:07:30.840619 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:07:30.854688 (kubelet)[2571]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 22:07:30.955948 kubelet[2571]: E0912 22:07:30.955852 2571 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 22:07:30.967722 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 22:07:30.968622 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 22:07:30.969394 systemd[1]: kubelet.service: Consumed 335ms CPU time, 105.4M memory peak. Sep 12 22:07:31.532099 containerd[1931]: time="2025-09-12T22:07:31.532037293Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:07:31.535662 containerd[1931]: time="2025-09-12T22:07:31.535583846Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.13: active requests=0, bytes read=17127506" Sep 12 22:07:31.538492 containerd[1931]: time="2025-09-12T22:07:31.538323794Z" level=info msg="ImageCreate event name:\"sha256:5e3cbe2ba7db787c6aebfcf4484156dd4ebd7ede811ef72e8929593e59a5fa27\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:07:31.545260 containerd[1931]: time="2025-09-12T22:07:31.545161562Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:07:31.548102 containerd[1931]: time="2025-09-12T22:07:31.547299530Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.13\" with image id \"sha256:5e3cbe2ba7db787c6aebfcf4484156dd4ebd7ede811ef72e8929593e59a5fa27\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.13\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\", size \"18696299\" in 1.761350433s" Sep 12 22:07:31.548102 containerd[1931]: time="2025-09-12T22:07:31.547391054Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\" returns image reference \"sha256:5e3cbe2ba7db787c6aebfcf4484156dd4ebd7ede811ef72e8929593e59a5fa27\"" Sep 12 22:07:31.548517 containerd[1931]: time="2025-09-12T22:07:31.548450210Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\"" Sep 12 22:07:33.029067 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1050293998.mount: Deactivated successfully. 
Sep 12 22:07:33.589331 containerd[1931]: time="2025-09-12T22:07:33.588218704Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:07:33.589854 containerd[1931]: time="2025-09-12T22:07:33.589481740Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.13: active requests=0, bytes read=26954907" Sep 12 22:07:33.590640 containerd[1931]: time="2025-09-12T22:07:33.590596900Z" level=info msg="ImageCreate event name:\"sha256:c15699f0b7002450249485b10f20211982dfd2bec4d61c86c35acebc659e794e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:07:33.594210 containerd[1931]: time="2025-09-12T22:07:33.594163540Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:07:33.595272 containerd[1931]: time="2025-09-12T22:07:33.595213480Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.13\" with image id \"sha256:c15699f0b7002450249485b10f20211982dfd2bec4d61c86c35acebc659e794e\", repo tag \"registry.k8s.io/kube-proxy:v1.31.13\", repo digest \"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\", size \"26953926\" in 2.046692038s" Sep 12 22:07:33.595574 containerd[1931]: time="2025-09-12T22:07:33.595271188Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\" returns image reference \"sha256:c15699f0b7002450249485b10f20211982dfd2bec4d61c86c35acebc659e794e\"" Sep 12 22:07:33.596257 containerd[1931]: time="2025-09-12T22:07:33.596169340Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 12 22:07:34.103370 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2325241785.mount: Deactivated successfully. 
Sep 12 22:07:35.400142 containerd[1931]: time="2025-09-12T22:07:35.399075605Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:07:35.401307 containerd[1931]: time="2025-09-12T22:07:35.401245025Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951622" Sep 12 22:07:35.401652 containerd[1931]: time="2025-09-12T22:07:35.401619209Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:07:35.407711 containerd[1931]: time="2025-09-12T22:07:35.407660417Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:07:35.409620 containerd[1931]: time="2025-09-12T22:07:35.409533533Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.813269993s" Sep 12 22:07:35.409780 containerd[1931]: time="2025-09-12T22:07:35.409752341Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" Sep 12 22:07:35.411287 containerd[1931]: time="2025-09-12T22:07:35.411130373Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 12 22:07:35.856037 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2116051784.mount: Deactivated successfully. 
Sep 12 22:07:35.866777 containerd[1931]: time="2025-09-12T22:07:35.866631043Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 22:07:35.867894 containerd[1931]: time="2025-09-12T22:07:35.867827227Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703" Sep 12 22:07:35.868894 containerd[1931]: time="2025-09-12T22:07:35.868806355Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 22:07:35.873368 containerd[1931]: time="2025-09-12T22:07:35.873310795Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 22:07:35.874214 containerd[1931]: time="2025-09-12T22:07:35.874158355Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 462.681254ms" Sep 12 22:07:35.874298 containerd[1931]: time="2025-09-12T22:07:35.874209835Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Sep 12 22:07:35.874996 containerd[1931]: time="2025-09-12T22:07:35.874953163Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Sep 12 22:07:36.414761 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4178190511.mount: Deactivated successfully. 
Sep 12 22:07:38.659325 containerd[1931]: time="2025-09-12T22:07:38.659232717Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:07:38.661243 containerd[1931]: time="2025-09-12T22:07:38.661160661Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66537161" Sep 12 22:07:38.665958 containerd[1931]: time="2025-09-12T22:07:38.665873169Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:07:38.672656 containerd[1931]: time="2025-09-12T22:07:38.672583377Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:07:38.675001 containerd[1931]: time="2025-09-12T22:07:38.674942541Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"66535646\" in 2.799929102s" Sep 12 22:07:38.675349 containerd[1931]: time="2025-09-12T22:07:38.674998941Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\"" Sep 12 22:07:40.974103 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 12 22:07:40.977432 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:07:41.300383 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:07:41.313832 (kubelet)[2729]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 22:07:41.389130 kubelet[2729]: E0912 22:07:41.388040 2729 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 22:07:41.393795 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 22:07:41.394140 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 22:07:41.394983 systemd[1]: kubelet.service: Consumed 293ms CPU time, 105.5M memory peak. Sep 12 22:07:47.813166 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:07:47.813500 systemd[1]: kubelet.service: Consumed 293ms CPU time, 105.5M memory peak. Sep 12 22:07:47.818685 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:07:47.869763 systemd[1]: Reload requested from client PID 2744 ('systemctl') (unit session-7.scope)... Sep 12 22:07:47.869798 systemd[1]: Reloading... Sep 12 22:07:48.112764 zram_generator::config[2792]: No configuration found. Sep 12 22:07:48.557026 systemd[1]: Reloading finished in 686 ms. Sep 12 22:07:48.604637 systemd[1]: systemd-hostnamed.service: Deactivated successfully. Sep 12 22:07:48.674517 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... 
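
The first kubelet start above exits because /var/lib/kubelet/config.yaml does not exist yet; on a kubeadm-style node that file is normally written during init/join rather than by hand. As a hedged illustration of the file's shape only (not a complete or sufficient configuration), the sketch below writes a minimal KubeletConfiguration using the systemd cgroup driver and the /etc/kubernetes/manifests static pod path that appear elsewhere in this log.

    package main

    import (
    	"log"
    	"os"
    )

    // Minimal KubeletConfiguration (kubelet.config.k8s.io/v1beta1). Real
    // deployments generate a much larger file; this only shows the shape of
    // the YAML expected at the path from the error above.
    const kubeletConfig = `apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd
    staticPodPath: /etc/kubernetes/manifests
    `

    func main() {
    	const path = "/var/lib/kubelet/config.yaml"
    	if err := os.MkdirAll("/var/lib/kubelet", 0o755); err != nil {
    		log.Fatal(err)
    	}
    	if err := os.WriteFile(path, []byte(kubeletConfig), 0o644); err != nil {
    		log.Fatal(err)
    	}
    	log.Printf("wrote %s", path)
    }
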
Sep 12 22:07:48.681635 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 22:07:48.682067 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:07:48.682188 systemd[1]: kubelet.service: Consumed 233ms CPU time, 95M memory peak. Sep 12 22:07:48.685654 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:07:49.066774 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:07:49.082673 (kubelet)[2857]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 22:07:49.155148 kubelet[2857]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 22:07:49.155148 kubelet[2857]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 12 22:07:49.155148 kubelet[2857]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 22:07:49.155148 kubelet[2857]: I0912 22:07:49.154097 2857 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 22:07:49.939786 kubelet[2857]: I0912 22:07:49.939720 2857 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 12 22:07:49.939786 kubelet[2857]: I0912 22:07:49.939776 2857 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 22:07:49.941147 kubelet[2857]: I0912 22:07:49.940659 2857 server.go:934] "Client rotation is on, will bootstrap in background" Sep 12 22:07:49.994208 kubelet[2857]: E0912 22:07:49.994160 2857 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://172.31.25.121:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 172.31.25.121:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:07:49.995906 kubelet[2857]: I0912 22:07:49.995872 2857 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 22:07:50.009402 kubelet[2857]: I0912 22:07:50.009371 2857 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 22:07:50.016661 kubelet[2857]: I0912 22:07:50.016600 2857 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 22:07:50.018537 kubelet[2857]: I0912 22:07:50.018509 2857 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 12 22:07:50.019128 kubelet[2857]: I0912 22:07:50.018968 2857 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 22:07:50.019410 kubelet[2857]: I0912 22:07:50.019017 2857 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-25-121","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 22:07:50.019811 kubelet[2857]: I0912 22:07:50.019789 2857 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 22:07:50.020667 kubelet[2857]: I0912 22:07:50.019893 2857 container_manager_linux.go:300] "Creating device plugin manager" Sep 12 22:07:50.020667 kubelet[2857]: I0912 22:07:50.020349 2857 state_mem.go:36] "Initialized new in-memory state store" Sep 12 22:07:50.025847 kubelet[2857]: I0912 22:07:50.025812 2857 kubelet.go:408] "Attempting to sync node with API server" Sep 12 22:07:50.026011 kubelet[2857]: I0912 22:07:50.025993 2857 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 22:07:50.026151 kubelet[2857]: I0912 22:07:50.026133 2857 kubelet.go:314] "Adding apiserver pod source" Sep 12 22:07:50.026274 kubelet[2857]: I0912 22:07:50.026255 2857 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 22:07:50.033408 kubelet[2857]: W0912 22:07:50.033321 2857 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.31.25.121:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-25-121&limit=500&resourceVersion=0": dial tcp 172.31.25.121:6443: connect: connection refused Sep 12 22:07:50.033529 kubelet[2857]: E0912 22:07:50.033416 2857 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://172.31.25.121:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-25-121&limit=500&resourceVersion=0\": dial tcp 172.31.25.121:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:07:50.034612 kubelet[2857]: W0912 22:07:50.034536 2857 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.31.25.121:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.31.25.121:6443: connect: connection refused Sep 12 22:07:50.034715 kubelet[2857]: E0912 22:07:50.034628 2857 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.31.25.121:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.31.25.121:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:07:50.035099 kubelet[2857]: I0912 22:07:50.035063 2857 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 22:07:50.036428 kubelet[2857]: I0912 22:07:50.036383 2857 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 22:07:50.036763 kubelet[2857]: W0912 22:07:50.036728 2857 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 12 22:07:50.040645 kubelet[2857]: I0912 22:07:50.040581 2857 server.go:1274] "Started kubelet" Sep 12 22:07:50.052148 kubelet[2857]: I0912 22:07:50.049749 2857 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 22:07:50.053101 kubelet[2857]: I0912 22:07:50.053070 2857 server.go:449] "Adding debug handlers to kubelet server" Sep 12 22:07:50.053737 kubelet[2857]: I0912 22:07:50.053690 2857 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 22:07:50.057276 kubelet[2857]: I0912 22:07:50.057192 2857 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 22:07:50.057806 kubelet[2857]: I0912 22:07:50.057776 2857 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 22:07:50.065364 kubelet[2857]: I0912 22:07:50.065320 2857 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 22:07:50.066238 kubelet[2857]: E0912 22:07:50.062202 2857 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.31.25.121:6443/api/v1/namespaces/default/events\": dial tcp 172.31.25.121:6443: connect: connection refused" event="&Event{ObjectMeta:{ip-172-31-25-121.1864a85c8d304801 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-172-31-25-121,UID:ip-172-31-25-121,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-172-31-25-121,},FirstTimestamp:2025-09-12 22:07:50.040545281 +0000 UTC m=+0.951562961,LastTimestamp:2025-09-12 22:07:50.040545281 +0000 UTC m=+0.951562961,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-172-31-25-121,}" Sep 12 22:07:50.069760 kubelet[2857]: E0912 22:07:50.069701 2857 kubelet_node_status.go:453] "Error getting the 
current node from lister" err="node \"ip-172-31-25-121\" not found" Sep 12 22:07:50.069984 kubelet[2857]: I0912 22:07:50.069965 2857 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 12 22:07:50.070454 kubelet[2857]: I0912 22:07:50.070430 2857 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 12 22:07:50.070668 kubelet[2857]: I0912 22:07:50.070648 2857 reconciler.go:26] "Reconciler: start to sync state" Sep 12 22:07:50.071629 kubelet[2857]: W0912 22:07:50.071538 2857 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.31.25.121:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.25.121:6443: connect: connection refused Sep 12 22:07:50.071918 kubelet[2857]: E0912 22:07:50.071880 2857 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.31.25.121:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.31.25.121:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:07:50.072221 kubelet[2857]: E0912 22:07:50.072178 2857 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.25.121:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-25-121?timeout=10s\": dial tcp 172.31.25.121:6443: connect: connection refused" interval="200ms" Sep 12 22:07:50.072608 kubelet[2857]: I0912 22:07:50.072579 2857 factory.go:221] Registration of the systemd container factory successfully Sep 12 22:07:50.072835 kubelet[2857]: I0912 22:07:50.072806 2857 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 22:07:50.074662 kubelet[2857]: E0912 22:07:50.074602 2857 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 22:07:50.076291 kubelet[2857]: I0912 22:07:50.075754 2857 factory.go:221] Registration of the containerd container factory successfully Sep 12 22:07:50.091516 kubelet[2857]: I0912 22:07:50.091440 2857 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 22:07:50.094900 kubelet[2857]: I0912 22:07:50.094834 2857 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 12 22:07:50.094900 kubelet[2857]: I0912 22:07:50.094885 2857 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 12 22:07:50.095080 kubelet[2857]: I0912 22:07:50.094920 2857 kubelet.go:2321] "Starting kubelet main sync loop" Sep 12 22:07:50.095080 kubelet[2857]: E0912 22:07:50.094995 2857 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 22:07:50.112156 kubelet[2857]: W0912 22:07:50.111934 2857 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.31.25.121:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.25.121:6443: connect: connection refused Sep 12 22:07:50.112156 kubelet[2857]: E0912 22:07:50.112057 2857 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.31.25.121:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.31.25.121:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:07:50.120447 kubelet[2857]: I0912 22:07:50.120406 2857 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 12 22:07:50.120447 kubelet[2857]: I0912 22:07:50.120440 2857 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 12 22:07:50.120639 kubelet[2857]: I0912 22:07:50.120474 2857 state_mem.go:36] "Initialized new in-memory state store" Sep 12 22:07:50.126175 kubelet[2857]: I0912 22:07:50.126094 2857 policy_none.go:49] "None policy: Start" Sep 12 22:07:50.127568 kubelet[2857]: I0912 22:07:50.127506 2857 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 12 22:07:50.127568 kubelet[2857]: I0912 22:07:50.127579 2857 state_mem.go:35] "Initializing new in-memory state store" Sep 12 22:07:50.140497 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 12 22:07:50.163129 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 12 22:07:50.170741 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 12 22:07:50.171333 kubelet[2857]: E0912 22:07:50.170978 2857 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ip-172-31-25-121\" not found" Sep 12 22:07:50.184181 kubelet[2857]: I0912 22:07:50.183742 2857 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 22:07:50.184181 kubelet[2857]: I0912 22:07:50.184045 2857 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 22:07:50.184181 kubelet[2857]: I0912 22:07:50.184066 2857 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 22:07:50.185014 kubelet[2857]: I0912 22:07:50.184983 2857 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 22:07:50.188753 kubelet[2857]: E0912 22:07:50.188693 2857 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-172-31-25-121\" not found" Sep 12 22:07:50.215959 systemd[1]: Created slice kubepods-burstable-poddff8e9a77d1371a67531f3d8bd4eebcf.slice - libcontainer container kubepods-burstable-poddff8e9a77d1371a67531f3d8bd4eebcf.slice. 
Sep 12 22:07:50.243685 systemd[1]: Created slice kubepods-burstable-pod7efbf3b4b891e6abb0c6a52f319b1ecd.slice - libcontainer container kubepods-burstable-pod7efbf3b4b891e6abb0c6a52f319b1ecd.slice. Sep 12 22:07:50.261817 systemd[1]: Created slice kubepods-burstable-podcf7297cebdba29ede2190343dcd1358e.slice - libcontainer container kubepods-burstable-podcf7297cebdba29ede2190343dcd1358e.slice. Sep 12 22:07:50.272820 kubelet[2857]: E0912 22:07:50.272760 2857 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.25.121:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-25-121?timeout=10s\": dial tcp 172.31.25.121:6443: connect: connection refused" interval="400ms" Sep 12 22:07:50.290284 kubelet[2857]: I0912 22:07:50.290173 2857 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-25-121" Sep 12 22:07:50.291003 kubelet[2857]: E0912 22:07:50.290958 2857 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.31.25.121:6443/api/v1/nodes\": dial tcp 172.31.25.121:6443: connect: connection refused" node="ip-172-31-25-121" Sep 12 22:07:50.372452 kubelet[2857]: I0912 22:07:50.372345 2857 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7efbf3b4b891e6abb0c6a52f319b1ecd-ca-certs\") pod \"kube-controller-manager-ip-172-31-25-121\" (UID: \"7efbf3b4b891e6abb0c6a52f319b1ecd\") " pod="kube-system/kube-controller-manager-ip-172-31-25-121" Sep 12 22:07:50.372452 kubelet[2857]: I0912 22:07:50.372402 2857 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7efbf3b4b891e6abb0c6a52f319b1ecd-kubeconfig\") pod \"kube-controller-manager-ip-172-31-25-121\" (UID: \"7efbf3b4b891e6abb0c6a52f319b1ecd\") " pod="kube-system/kube-controller-manager-ip-172-31-25-121" Sep 12 22:07:50.372616 kubelet[2857]: I0912 22:07:50.372458 2857 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7efbf3b4b891e6abb0c6a52f319b1ecd-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-25-121\" (UID: \"7efbf3b4b891e6abb0c6a52f319b1ecd\") " pod="kube-system/kube-controller-manager-ip-172-31-25-121" Sep 12 22:07:50.372616 kubelet[2857]: I0912 22:07:50.372518 2857 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/cf7297cebdba29ede2190343dcd1358e-kubeconfig\") pod \"kube-scheduler-ip-172-31-25-121\" (UID: \"cf7297cebdba29ede2190343dcd1358e\") " pod="kube-system/kube-scheduler-ip-172-31-25-121" Sep 12 22:07:50.372616 kubelet[2857]: I0912 22:07:50.372568 2857 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/dff8e9a77d1371a67531f3d8bd4eebcf-ca-certs\") pod \"kube-apiserver-ip-172-31-25-121\" (UID: \"dff8e9a77d1371a67531f3d8bd4eebcf\") " pod="kube-system/kube-apiserver-ip-172-31-25-121" Sep 12 22:07:50.372753 kubelet[2857]: I0912 22:07:50.372618 2857 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/dff8e9a77d1371a67531f3d8bd4eebcf-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-25-121\" (UID: 
\"dff8e9a77d1371a67531f3d8bd4eebcf\") " pod="kube-system/kube-apiserver-ip-172-31-25-121" Sep 12 22:07:50.372753 kubelet[2857]: I0912 22:07:50.372654 2857 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7efbf3b4b891e6abb0c6a52f319b1ecd-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-25-121\" (UID: \"7efbf3b4b891e6abb0c6a52f319b1ecd\") " pod="kube-system/kube-controller-manager-ip-172-31-25-121" Sep 12 22:07:50.372753 kubelet[2857]: I0912 22:07:50.372703 2857 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7efbf3b4b891e6abb0c6a52f319b1ecd-k8s-certs\") pod \"kube-controller-manager-ip-172-31-25-121\" (UID: \"7efbf3b4b891e6abb0c6a52f319b1ecd\") " pod="kube-system/kube-controller-manager-ip-172-31-25-121" Sep 12 22:07:50.372753 kubelet[2857]: I0912 22:07:50.372741 2857 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/dff8e9a77d1371a67531f3d8bd4eebcf-k8s-certs\") pod \"kube-apiserver-ip-172-31-25-121\" (UID: \"dff8e9a77d1371a67531f3d8bd4eebcf\") " pod="kube-system/kube-apiserver-ip-172-31-25-121" Sep 12 22:07:50.494297 kubelet[2857]: I0912 22:07:50.494238 2857 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-25-121" Sep 12 22:07:50.494818 kubelet[2857]: E0912 22:07:50.494768 2857 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.31.25.121:6443/api/v1/nodes\": dial tcp 172.31.25.121:6443: connect: connection refused" node="ip-172-31-25-121" Sep 12 22:07:50.537914 containerd[1931]: time="2025-09-12T22:07:50.537838772Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-25-121,Uid:dff8e9a77d1371a67531f3d8bd4eebcf,Namespace:kube-system,Attempt:0,}" Sep 12 22:07:50.549786 containerd[1931]: time="2025-09-12T22:07:50.549451484Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-25-121,Uid:7efbf3b4b891e6abb0c6a52f319b1ecd,Namespace:kube-system,Attempt:0,}" Sep 12 22:07:50.568266 containerd[1931]: time="2025-09-12T22:07:50.568215920Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-25-121,Uid:cf7297cebdba29ede2190343dcd1358e,Namespace:kube-system,Attempt:0,}" Sep 12 22:07:50.596729 containerd[1931]: time="2025-09-12T22:07:50.596674388Z" level=info msg="connecting to shim 3f1f39ff66b53ed05fe7b1af2c6e3d115e6b9cd78e94a218c4fc27713133fedf" address="unix:///run/containerd/s/ac60f74721d8e77ab7176ae23cbbbe7471970966fb967676582a1835857ce486" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:07:50.652193 containerd[1931]: time="2025-09-12T22:07:50.651786284Z" level=info msg="connecting to shim b0553b04fda66e8e678dcffdf01d83402724e6391275702f9388d25ef9b60a9c" address="unix:///run/containerd/s/47ec793ac8680dcac9feb0343f991dff5fa7204b5912548efc7b9e40270c6999" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:07:50.654836 containerd[1931]: time="2025-09-12T22:07:50.654779852Z" level=info msg="connecting to shim a76cb676c060a0d167722b5f5f1c58e174ae9b36336099b3f9cff3c60d0ab5c1" address="unix:///run/containerd/s/08f1beb3d528403a618e40a25a938851ebfcba1061ca6e924a6fa52e29ce3014" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:07:50.675298 kubelet[2857]: E0912 22:07:50.674033 2857 controller.go:145] "Failed to ensure lease exists, 
will retry" err="Get \"https://172.31.25.121:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-25-121?timeout=10s\": dial tcp 172.31.25.121:6443: connect: connection refused" interval="800ms" Sep 12 22:07:50.683780 systemd[1]: Started cri-containerd-3f1f39ff66b53ed05fe7b1af2c6e3d115e6b9cd78e94a218c4fc27713133fedf.scope - libcontainer container 3f1f39ff66b53ed05fe7b1af2c6e3d115e6b9cd78e94a218c4fc27713133fedf. Sep 12 22:07:50.739428 systemd[1]: Started cri-containerd-b0553b04fda66e8e678dcffdf01d83402724e6391275702f9388d25ef9b60a9c.scope - libcontainer container b0553b04fda66e8e678dcffdf01d83402724e6391275702f9388d25ef9b60a9c. Sep 12 22:07:50.763474 systemd[1]: Started cri-containerd-a76cb676c060a0d167722b5f5f1c58e174ae9b36336099b3f9cff3c60d0ab5c1.scope - libcontainer container a76cb676c060a0d167722b5f5f1c58e174ae9b36336099b3f9cff3c60d0ab5c1. Sep 12 22:07:50.837757 containerd[1931]: time="2025-09-12T22:07:50.837649713Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-25-121,Uid:dff8e9a77d1371a67531f3d8bd4eebcf,Namespace:kube-system,Attempt:0,} returns sandbox id \"3f1f39ff66b53ed05fe7b1af2c6e3d115e6b9cd78e94a218c4fc27713133fedf\"" Sep 12 22:07:50.849149 containerd[1931]: time="2025-09-12T22:07:50.848775669Z" level=info msg="CreateContainer within sandbox \"3f1f39ff66b53ed05fe7b1af2c6e3d115e6b9cd78e94a218c4fc27713133fedf\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 22:07:50.878673 containerd[1931]: time="2025-09-12T22:07:50.878622958Z" level=info msg="Container c53dddd78697289d0ca288657c355c1e01ff011d6e831e4f8ee4d2c2d0247967: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:07:50.899450 kubelet[2857]: I0912 22:07:50.899413 2857 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-25-121" Sep 12 22:07:50.900389 kubelet[2857]: E0912 22:07:50.900325 2857 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.31.25.121:6443/api/v1/nodes\": dial tcp 172.31.25.121:6443: connect: connection refused" node="ip-172-31-25-121" Sep 12 22:07:50.912259 containerd[1931]: time="2025-09-12T22:07:50.912178078Z" level=info msg="CreateContainer within sandbox \"3f1f39ff66b53ed05fe7b1af2c6e3d115e6b9cd78e94a218c4fc27713133fedf\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"c53dddd78697289d0ca288657c355c1e01ff011d6e831e4f8ee4d2c2d0247967\"" Sep 12 22:07:50.916587 containerd[1931]: time="2025-09-12T22:07:50.916447738Z" level=info msg="StartContainer for \"c53dddd78697289d0ca288657c355c1e01ff011d6e831e4f8ee4d2c2d0247967\"" Sep 12 22:07:50.920794 containerd[1931]: time="2025-09-12T22:07:50.920734126Z" level=info msg="connecting to shim c53dddd78697289d0ca288657c355c1e01ff011d6e831e4f8ee4d2c2d0247967" address="unix:///run/containerd/s/ac60f74721d8e77ab7176ae23cbbbe7471970966fb967676582a1835857ce486" protocol=ttrpc version=3 Sep 12 22:07:50.921763 containerd[1931]: time="2025-09-12T22:07:50.921668734Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-25-121,Uid:cf7297cebdba29ede2190343dcd1358e,Namespace:kube-system,Attempt:0,} returns sandbox id \"a76cb676c060a0d167722b5f5f1c58e174ae9b36336099b3f9cff3c60d0ab5c1\"" Sep 12 22:07:50.927672 containerd[1931]: time="2025-09-12T22:07:50.927588346Z" level=info msg="CreateContainer within sandbox \"a76cb676c060a0d167722b5f5f1c58e174ae9b36336099b3f9cff3c60d0ab5c1\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 22:07:50.928370 
containerd[1931]: time="2025-09-12T22:07:50.928212406Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-25-121,Uid:7efbf3b4b891e6abb0c6a52f319b1ecd,Namespace:kube-system,Attempt:0,} returns sandbox id \"b0553b04fda66e8e678dcffdf01d83402724e6391275702f9388d25ef9b60a9c\"" Sep 12 22:07:50.936167 containerd[1931]: time="2025-09-12T22:07:50.935630206Z" level=info msg="CreateContainer within sandbox \"b0553b04fda66e8e678dcffdf01d83402724e6391275702f9388d25ef9b60a9c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 22:07:50.958615 containerd[1931]: time="2025-09-12T22:07:50.954995374Z" level=info msg="Container 8d6eb268da56d1deb120d9ef0b598724d5e15d593cccf4c7a467c17198cb19a6: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:07:50.972462 containerd[1931]: time="2025-09-12T22:07:50.972368170Z" level=info msg="Container cc70c8b6134b67c7c5e7057b525c59b1b625b09af874ba592bc44bcf6f9fe2d6: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:07:50.977435 systemd[1]: Started cri-containerd-c53dddd78697289d0ca288657c355c1e01ff011d6e831e4f8ee4d2c2d0247967.scope - libcontainer container c53dddd78697289d0ca288657c355c1e01ff011d6e831e4f8ee4d2c2d0247967. Sep 12 22:07:50.990397 kubelet[2857]: W0912 22:07:50.990260 2857 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.31.25.121:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.31.25.121:6443: connect: connection refused Sep 12 22:07:50.991830 kubelet[2857]: E0912 22:07:50.990536 2857 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.31.25.121:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.31.25.121:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:07:50.997821 containerd[1931]: time="2025-09-12T22:07:50.996080194Z" level=info msg="CreateContainer within sandbox \"a76cb676c060a0d167722b5f5f1c58e174ae9b36336099b3f9cff3c60d0ab5c1\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"8d6eb268da56d1deb120d9ef0b598724d5e15d593cccf4c7a467c17198cb19a6\"" Sep 12 22:07:50.998654 containerd[1931]: time="2025-09-12T22:07:50.998595838Z" level=info msg="StartContainer for \"8d6eb268da56d1deb120d9ef0b598724d5e15d593cccf4c7a467c17198cb19a6\"" Sep 12 22:07:51.003141 containerd[1931]: time="2025-09-12T22:07:51.002179506Z" level=info msg="connecting to shim 8d6eb268da56d1deb120d9ef0b598724d5e15d593cccf4c7a467c17198cb19a6" address="unix:///run/containerd/s/08f1beb3d528403a618e40a25a938851ebfcba1061ca6e924a6fa52e29ce3014" protocol=ttrpc version=3 Sep 12 22:07:51.003514 containerd[1931]: time="2025-09-12T22:07:51.003141078Z" level=info msg="CreateContainer within sandbox \"b0553b04fda66e8e678dcffdf01d83402724e6391275702f9388d25ef9b60a9c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"cc70c8b6134b67c7c5e7057b525c59b1b625b09af874ba592bc44bcf6f9fe2d6\"" Sep 12 22:07:51.004623 containerd[1931]: time="2025-09-12T22:07:51.004559334Z" level=info msg="StartContainer for \"cc70c8b6134b67c7c5e7057b525c59b1b625b09af874ba592bc44bcf6f9fe2d6\"" Sep 12 22:07:51.013148 containerd[1931]: time="2025-09-12T22:07:51.011803590Z" level=info msg="connecting to shim cc70c8b6134b67c7c5e7057b525c59b1b625b09af874ba592bc44bcf6f9fe2d6" 
address="unix:///run/containerd/s/47ec793ac8680dcac9feb0343f991dff5fa7204b5912548efc7b9e40270c6999" protocol=ttrpc version=3 Sep 12 22:07:51.042188 kubelet[2857]: W0912 22:07:51.041200 2857 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.31.25.121:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.25.121:6443: connect: connection refused Sep 12 22:07:51.042188 kubelet[2857]: E0912 22:07:51.041428 2857 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.31.25.121:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.31.25.121:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:07:51.074438 systemd[1]: Started cri-containerd-cc70c8b6134b67c7c5e7057b525c59b1b625b09af874ba592bc44bcf6f9fe2d6.scope - libcontainer container cc70c8b6134b67c7c5e7057b525c59b1b625b09af874ba592bc44bcf6f9fe2d6. Sep 12 22:07:51.097444 systemd[1]: Started cri-containerd-8d6eb268da56d1deb120d9ef0b598724d5e15d593cccf4c7a467c17198cb19a6.scope - libcontainer container 8d6eb268da56d1deb120d9ef0b598724d5e15d593cccf4c7a467c17198cb19a6. Sep 12 22:07:51.140908 containerd[1931]: time="2025-09-12T22:07:51.140844655Z" level=info msg="StartContainer for \"c53dddd78697289d0ca288657c355c1e01ff011d6e831e4f8ee4d2c2d0247967\" returns successfully" Sep 12 22:07:51.226594 containerd[1931]: time="2025-09-12T22:07:51.226474075Z" level=info msg="StartContainer for \"cc70c8b6134b67c7c5e7057b525c59b1b625b09af874ba592bc44bcf6f9fe2d6\" returns successfully" Sep 12 22:07:51.322013 containerd[1931]: time="2025-09-12T22:07:51.321843656Z" level=info msg="StartContainer for \"8d6eb268da56d1deb120d9ef0b598724d5e15d593cccf4c7a467c17198cb19a6\" returns successfully" Sep 12 22:07:51.384702 kubelet[2857]: W0912 22:07:51.384533 2857 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.31.25.121:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.25.121:6443: connect: connection refused Sep 12 22:07:51.385394 kubelet[2857]: E0912 22:07:51.384663 2857 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.31.25.121:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.31.25.121:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:07:51.702852 kubelet[2857]: I0912 22:07:51.702674 2857 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-25-121" Sep 12 22:07:54.675929 kubelet[2857]: I0912 22:07:54.675860 2857 kubelet_node_status.go:75] "Successfully registered node" node="ip-172-31-25-121" Sep 12 22:07:55.035613 kubelet[2857]: I0912 22:07:55.035549 2857 apiserver.go:52] "Watching apiserver" Sep 12 22:07:55.071136 kubelet[2857]: I0912 22:07:55.071046 2857 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 12 22:07:56.561014 systemd[1]: Reload requested from client PID 3131 ('systemctl') (unit session-7.scope)... Sep 12 22:07:56.561045 systemd[1]: Reloading... Sep 12 22:07:56.782161 zram_generator::config[3175]: No configuration found. Sep 12 22:07:57.377525 systemd[1]: Reloading finished in 815 ms. Sep 12 22:07:57.423721 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... 
Sep 12 22:07:57.450694 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 22:07:57.452242 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:07:57.452470 systemd[1]: kubelet.service: Consumed 1.640s CPU time, 125.3M memory peak. Sep 12 22:07:57.456043 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:07:57.826917 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:07:57.843713 (kubelet)[3235]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 22:07:57.943299 kubelet[3235]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 22:07:57.943733 kubelet[3235]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 12 22:07:57.943733 kubelet[3235]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 22:07:57.943733 kubelet[3235]: I0912 22:07:57.943525 3235 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 22:07:57.958503 kubelet[3235]: I0912 22:07:57.958448 3235 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 12 22:07:57.958503 kubelet[3235]: I0912 22:07:57.958495 3235 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 22:07:57.959258 kubelet[3235]: I0912 22:07:57.959163 3235 server.go:934] "Client rotation is on, will bootstrap in background" Sep 12 22:07:57.971949 kubelet[3235]: I0912 22:07:57.971713 3235 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 12 22:07:57.984647 kubelet[3235]: I0912 22:07:57.984399 3235 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 22:07:57.996037 kubelet[3235]: I0912 22:07:57.995988 3235 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 22:07:58.001532 kubelet[3235]: I0912 22:07:58.001485 3235 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 22:07:58.001797 kubelet[3235]: I0912 22:07:58.001744 3235 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 12 22:07:58.002028 kubelet[3235]: I0912 22:07:58.001956 3235 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 22:07:58.003048 kubelet[3235]: I0912 22:07:58.002087 3235 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-25-121","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 22:07:58.003307 kubelet[3235]: I0912 22:07:58.003064 3235 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 22:07:58.003307 kubelet[3235]: I0912 22:07:58.003086 3235 container_manager_linux.go:300] "Creating device plugin manager" Sep 12 22:07:58.003307 kubelet[3235]: I0912 22:07:58.003220 3235 state_mem.go:36] "Initialized new in-memory state store" Sep 12 22:07:58.004366 kubelet[3235]: I0912 22:07:58.003404 3235 kubelet.go:408] "Attempting to sync node with API server" Sep 12 22:07:58.004366 kubelet[3235]: I0912 22:07:58.003427 3235 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 22:07:58.004366 kubelet[3235]: I0912 22:07:58.003462 3235 kubelet.go:314] "Adding apiserver pod source" Sep 12 22:07:58.004366 kubelet[3235]: I0912 22:07:58.003488 3235 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 22:07:58.008537 kubelet[3235]: I0912 22:07:58.008404 3235 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 22:07:58.011168 kubelet[3235]: I0912 22:07:58.010497 3235 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 22:07:58.012012 kubelet[3235]: I0912 22:07:58.011922 3235 server.go:1274] "Started kubelet" Sep 12 22:07:58.016702 kubelet[3235]: I0912 22:07:58.016637 3235 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 22:07:58.020641 kubelet[3235]: 
I0912 22:07:58.020157 3235 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 22:07:58.023223 kubelet[3235]: I0912 22:07:58.021811 3235 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 22:07:58.023223 kubelet[3235]: I0912 22:07:58.023084 3235 server.go:449] "Adding debug handlers to kubelet server" Sep 12 22:07:58.035132 kubelet[3235]: I0912 22:07:58.033750 3235 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 22:07:58.047481 kubelet[3235]: I0912 22:07:58.047425 3235 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 22:07:58.047713 kubelet[3235]: I0912 22:07:58.047692 3235 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 12 22:07:58.049126 kubelet[3235]: E0912 22:07:58.049071 3235 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ip-172-31-25-121\" not found" Sep 12 22:07:58.057189 kubelet[3235]: I0912 22:07:58.056641 3235 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 12 22:07:58.059548 kubelet[3235]: I0912 22:07:58.059493 3235 factory.go:221] Registration of the systemd container factory successfully Sep 12 22:07:58.062715 kubelet[3235]: I0912 22:07:58.059690 3235 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 22:07:58.062715 kubelet[3235]: I0912 22:07:58.060938 3235 reconciler.go:26] "Reconciler: start to sync state" Sep 12 22:07:58.069213 kubelet[3235]: I0912 22:07:58.068444 3235 factory.go:221] Registration of the containerd container factory successfully Sep 12 22:07:58.111528 kubelet[3235]: E0912 22:07:58.111322 3235 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 22:07:58.143700 kubelet[3235]: I0912 22:07:58.143595 3235 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 22:07:58.151810 kubelet[3235]: I0912 22:07:58.151742 3235 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 12 22:07:58.151810 kubelet[3235]: I0912 22:07:58.151791 3235 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 12 22:07:58.152004 kubelet[3235]: I0912 22:07:58.151824 3235 kubelet.go:2321] "Starting kubelet main sync loop" Sep 12 22:07:58.152004 kubelet[3235]: E0912 22:07:58.151895 3235 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 22:07:58.245442 kubelet[3235]: I0912 22:07:58.245390 3235 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 12 22:07:58.246100 kubelet[3235]: I0912 22:07:58.245635 3235 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 12 22:07:58.246100 kubelet[3235]: I0912 22:07:58.245675 3235 state_mem.go:36] "Initialized new in-memory state store" Sep 12 22:07:58.246100 kubelet[3235]: I0912 22:07:58.245921 3235 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 12 22:07:58.246100 kubelet[3235]: I0912 22:07:58.245941 3235 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 12 22:07:58.246100 kubelet[3235]: I0912 22:07:58.245974 3235 policy_none.go:49] "None policy: Start" Sep 12 22:07:58.249101 kubelet[3235]: I0912 22:07:58.248688 3235 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 12 22:07:58.249101 kubelet[3235]: I0912 22:07:58.248730 3235 state_mem.go:35] "Initializing new in-memory state store" Sep 12 22:07:58.249101 kubelet[3235]: I0912 22:07:58.248983 3235 state_mem.go:75] "Updated machine memory state" Sep 12 22:07:58.252081 kubelet[3235]: E0912 22:07:58.252020 3235 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 12 22:07:58.261068 kubelet[3235]: I0912 22:07:58.261024 3235 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 22:07:58.261649 kubelet[3235]: I0912 22:07:58.261537 3235 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 22:07:58.261649 kubelet[3235]: I0912 22:07:58.261583 3235 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 22:07:58.262618 kubelet[3235]: I0912 22:07:58.262563 3235 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 22:07:58.407374 kubelet[3235]: I0912 22:07:58.407062 3235 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-25-121" Sep 12 22:07:58.427146 kubelet[3235]: I0912 22:07:58.426746 3235 kubelet_node_status.go:111] "Node was previously registered" node="ip-172-31-25-121" Sep 12 22:07:58.427146 kubelet[3235]: I0912 22:07:58.426857 3235 kubelet_node_status.go:75] "Successfully registered node" node="ip-172-31-25-121" Sep 12 22:07:58.462497 kubelet[3235]: I0912 22:07:58.462427 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7efbf3b4b891e6abb0c6a52f319b1ecd-ca-certs\") pod \"kube-controller-manager-ip-172-31-25-121\" (UID: \"7efbf3b4b891e6abb0c6a52f319b1ecd\") " pod="kube-system/kube-controller-manager-ip-172-31-25-121" Sep 12 22:07:58.462497 kubelet[3235]: I0912 22:07:58.462496 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7efbf3b4b891e6abb0c6a52f319b1ecd-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-25-121\" (UID: 
\"7efbf3b4b891e6abb0c6a52f319b1ecd\") " pod="kube-system/kube-controller-manager-ip-172-31-25-121" Sep 12 22:07:58.462705 kubelet[3235]: I0912 22:07:58.462541 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/cf7297cebdba29ede2190343dcd1358e-kubeconfig\") pod \"kube-scheduler-ip-172-31-25-121\" (UID: \"cf7297cebdba29ede2190343dcd1358e\") " pod="kube-system/kube-scheduler-ip-172-31-25-121" Sep 12 22:07:58.462705 kubelet[3235]: I0912 22:07:58.462580 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/dff8e9a77d1371a67531f3d8bd4eebcf-ca-certs\") pod \"kube-apiserver-ip-172-31-25-121\" (UID: \"dff8e9a77d1371a67531f3d8bd4eebcf\") " pod="kube-system/kube-apiserver-ip-172-31-25-121" Sep 12 22:07:58.462705 kubelet[3235]: I0912 22:07:58.462613 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/dff8e9a77d1371a67531f3d8bd4eebcf-k8s-certs\") pod \"kube-apiserver-ip-172-31-25-121\" (UID: \"dff8e9a77d1371a67531f3d8bd4eebcf\") " pod="kube-system/kube-apiserver-ip-172-31-25-121" Sep 12 22:07:58.462705 kubelet[3235]: I0912 22:07:58.462647 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/dff8e9a77d1371a67531f3d8bd4eebcf-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-25-121\" (UID: \"dff8e9a77d1371a67531f3d8bd4eebcf\") " pod="kube-system/kube-apiserver-ip-172-31-25-121" Sep 12 22:07:58.462705 kubelet[3235]: I0912 22:07:58.462686 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7efbf3b4b891e6abb0c6a52f319b1ecd-k8s-certs\") pod \"kube-controller-manager-ip-172-31-25-121\" (UID: \"7efbf3b4b891e6abb0c6a52f319b1ecd\") " pod="kube-system/kube-controller-manager-ip-172-31-25-121" Sep 12 22:07:58.462964 kubelet[3235]: I0912 22:07:58.462720 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7efbf3b4b891e6abb0c6a52f319b1ecd-kubeconfig\") pod \"kube-controller-manager-ip-172-31-25-121\" (UID: \"7efbf3b4b891e6abb0c6a52f319b1ecd\") " pod="kube-system/kube-controller-manager-ip-172-31-25-121" Sep 12 22:07:58.462964 kubelet[3235]: I0912 22:07:58.462756 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7efbf3b4b891e6abb0c6a52f319b1ecd-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-25-121\" (UID: \"7efbf3b4b891e6abb0c6a52f319b1ecd\") " pod="kube-system/kube-controller-manager-ip-172-31-25-121" Sep 12 22:07:59.016984 kubelet[3235]: I0912 22:07:59.016565 3235 apiserver.go:52] "Watching apiserver" Sep 12 22:07:59.057873 kubelet[3235]: I0912 22:07:59.057805 3235 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 12 22:07:59.263368 kubelet[3235]: I0912 22:07:59.262559 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ip-172-31-25-121" podStartSLOduration=1.262539651 podStartE2EDuration="1.262539651s" podCreationTimestamp="2025-09-12 22:07:58 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:07:59.261062055 +0000 UTC m=+1.406725520" watchObservedRunningTime="2025-09-12 22:07:59.262539651 +0000 UTC m=+1.408203104" Sep 12 22:07:59.327389 kubelet[3235]: I0912 22:07:59.327210 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ip-172-31-25-121" podStartSLOduration=1.327169192 podStartE2EDuration="1.327169192s" podCreationTimestamp="2025-09-12 22:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:07:59.288551091 +0000 UTC m=+1.434214580" watchObservedRunningTime="2025-09-12 22:07:59.327169192 +0000 UTC m=+1.472832657" Sep 12 22:07:59.373943 kubelet[3235]: I0912 22:07:59.373852 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ip-172-31-25-121" podStartSLOduration=1.373832572 podStartE2EDuration="1.373832572s" podCreationTimestamp="2025-09-12 22:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:07:59.329512072 +0000 UTC m=+1.475175549" watchObservedRunningTime="2025-09-12 22:07:59.373832572 +0000 UTC m=+1.519496037" Sep 12 22:08:00.495409 update_engine[1863]: I20250912 22:08:00.494586 1863 update_attempter.cc:509] Updating boot flags... Sep 12 22:08:01.531085 kubelet[3235]: I0912 22:08:01.530366 3235 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 12 22:08:01.537717 containerd[1931]: time="2025-09-12T22:08:01.530835174Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 12 22:08:01.541533 kubelet[3235]: I0912 22:08:01.534426 3235 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 12 22:08:02.429329 systemd[1]: Created slice kubepods-besteffort-podcac70605_e506_4716_b918_14dc91886709.slice - libcontainer container kubepods-besteffort-podcac70605_e506_4716_b918_14dc91886709.slice. 
Sep 12 22:08:02.496132 kubelet[3235]: I0912 22:08:02.495972 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/cac70605-e506-4716-b918-14dc91886709-kube-proxy\") pod \"kube-proxy-k292z\" (UID: \"cac70605-e506-4716-b918-14dc91886709\") " pod="kube-system/kube-proxy-k292z" Sep 12 22:08:02.496132 kubelet[3235]: I0912 22:08:02.496035 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cac70605-e506-4716-b918-14dc91886709-lib-modules\") pod \"kube-proxy-k292z\" (UID: \"cac70605-e506-4716-b918-14dc91886709\") " pod="kube-system/kube-proxy-k292z" Sep 12 22:08:02.496132 kubelet[3235]: I0912 22:08:02.496082 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/cac70605-e506-4716-b918-14dc91886709-xtables-lock\") pod \"kube-proxy-k292z\" (UID: \"cac70605-e506-4716-b918-14dc91886709\") " pod="kube-system/kube-proxy-k292z" Sep 12 22:08:02.497489 kubelet[3235]: I0912 22:08:02.497413 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z9rw\" (UniqueName: \"kubernetes.io/projected/cac70605-e506-4716-b918-14dc91886709-kube-api-access-6z9rw\") pod \"kube-proxy-k292z\" (UID: \"cac70605-e506-4716-b918-14dc91886709\") " pod="kube-system/kube-proxy-k292z" Sep 12 22:08:02.575619 systemd[1]: Created slice kubepods-besteffort-pod92db8785_e136_43a0_a22e_2e85da07b066.slice - libcontainer container kubepods-besteffort-pod92db8785_e136_43a0_a22e_2e85da07b066.slice. Sep 12 22:08:02.598560 kubelet[3235]: I0912 22:08:02.598501 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6vjf\" (UniqueName: \"kubernetes.io/projected/92db8785-e136-43a0-a22e-2e85da07b066-kube-api-access-h6vjf\") pod \"tigera-operator-58fc44c59b-bmdjt\" (UID: \"92db8785-e136-43a0-a22e-2e85da07b066\") " pod="tigera-operator/tigera-operator-58fc44c59b-bmdjt" Sep 12 22:08:02.600540 kubelet[3235]: I0912 22:08:02.598632 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/92db8785-e136-43a0-a22e-2e85da07b066-var-lib-calico\") pod \"tigera-operator-58fc44c59b-bmdjt\" (UID: \"92db8785-e136-43a0-a22e-2e85da07b066\") " pod="tigera-operator/tigera-operator-58fc44c59b-bmdjt" Sep 12 22:08:02.745512 containerd[1931]: time="2025-09-12T22:08:02.745451517Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-k292z,Uid:cac70605-e506-4716-b918-14dc91886709,Namespace:kube-system,Attempt:0,}" Sep 12 22:08:02.787733 containerd[1931]: time="2025-09-12T22:08:02.787647453Z" level=info msg="connecting to shim 70fb4c4d6c06fa8f3e65ac93ca648567b0571532c12eb4eeaa81a7fe7f645454" address="unix:///run/containerd/s/9ad8e0cf281bc6c430d2133571fc8c249a1fb8974437f91b34063eb1e9da1faf" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:08:02.842701 systemd[1]: Started cri-containerd-70fb4c4d6c06fa8f3e65ac93ca648567b0571532c12eb4eeaa81a7fe7f645454.scope - libcontainer container 70fb4c4d6c06fa8f3e65ac93ca648567b0571532c12eb4eeaa81a7fe7f645454. 
Sep 12 22:08:02.885736 containerd[1931]: time="2025-09-12T22:08:02.885657165Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-bmdjt,Uid:92db8785-e136-43a0-a22e-2e85da07b066,Namespace:tigera-operator,Attempt:0,}" Sep 12 22:08:02.900472 containerd[1931]: time="2025-09-12T22:08:02.900401061Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-k292z,Uid:cac70605-e506-4716-b918-14dc91886709,Namespace:kube-system,Attempt:0,} returns sandbox id \"70fb4c4d6c06fa8f3e65ac93ca648567b0571532c12eb4eeaa81a7fe7f645454\"" Sep 12 22:08:02.908881 containerd[1931]: time="2025-09-12T22:08:02.908078205Z" level=info msg="CreateContainer within sandbox \"70fb4c4d6c06fa8f3e65ac93ca648567b0571532c12eb4eeaa81a7fe7f645454\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 12 22:08:02.938330 containerd[1931]: time="2025-09-12T22:08:02.938265069Z" level=info msg="connecting to shim 56e4823f66197ddbda6b97bbfb7c6402df603daf68475dedb57cca5523c99a56" address="unix:///run/containerd/s/e820addc2a39748336c63a9a7a6dd025ede3711630398f4343c20392dcb69e12" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:08:02.944081 containerd[1931]: time="2025-09-12T22:08:02.944032846Z" level=info msg="Container 8a325846903b48a6ed6a4791479fa23e7453d3c12600065898ae15d145d2f91b: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:08:02.966494 containerd[1931]: time="2025-09-12T22:08:02.966425518Z" level=info msg="CreateContainer within sandbox \"70fb4c4d6c06fa8f3e65ac93ca648567b0571532c12eb4eeaa81a7fe7f645454\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"8a325846903b48a6ed6a4791479fa23e7453d3c12600065898ae15d145d2f91b\"" Sep 12 22:08:02.968187 containerd[1931]: time="2025-09-12T22:08:02.967487122Z" level=info msg="StartContainer for \"8a325846903b48a6ed6a4791479fa23e7453d3c12600065898ae15d145d2f91b\"" Sep 12 22:08:02.970984 containerd[1931]: time="2025-09-12T22:08:02.970592110Z" level=info msg="connecting to shim 8a325846903b48a6ed6a4791479fa23e7453d3c12600065898ae15d145d2f91b" address="unix:///run/containerd/s/9ad8e0cf281bc6c430d2133571fc8c249a1fb8974437f91b34063eb1e9da1faf" protocol=ttrpc version=3 Sep 12 22:08:02.991756 systemd[1]: Started cri-containerd-56e4823f66197ddbda6b97bbfb7c6402df603daf68475dedb57cca5523c99a56.scope - libcontainer container 56e4823f66197ddbda6b97bbfb7c6402df603daf68475dedb57cca5523c99a56. Sep 12 22:08:03.024452 systemd[1]: Started cri-containerd-8a325846903b48a6ed6a4791479fa23e7453d3c12600065898ae15d145d2f91b.scope - libcontainer container 8a325846903b48a6ed6a4791479fa23e7453d3c12600065898ae15d145d2f91b. 
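Editor's note: the RunPodSandbox, CreateContainer, and StartContainer entries around kube-proxy-k292z above are the standard CRI flow the kubelet drives through containerd; the "connecting to shim" lines are containerd starting the per-sandbox shim over ttrpc. The sketch below shows the same three calls against the CRI v1 API; the pod metadata, image tag, and socket path are illustrative placeholders rather than the IDs from this log.

package main

import (
	"context"
	"log"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

// runPod mirrors the sandbox-then-container sequence recorded above.
func runPod(ctx context.Context, rt runtimeapi.RuntimeServiceClient) error {
	sandboxCfg := &runtimeapi.PodSandboxConfig{
		Metadata: &runtimeapi.PodSandboxMetadata{
			Name:      "kube-proxy-example", // placeholder, not the pod from the log
			Namespace: "kube-system",
			Uid:       "00000000-0000-0000-0000-000000000000",
			Attempt:   0,
		},
	}

	// 1. RunPodSandbox: the runtime creates the sandbox and starts its shim.
	sb, err := rt.RunPodSandbox(ctx, &runtimeapi.RunPodSandboxRequest{Config: sandboxCfg})
	if err != nil {
		return err
	}

	// 2. CreateContainer within that sandbox.
	cc, err := rt.CreateContainer(ctx, &runtimeapi.CreateContainerRequest{
		PodSandboxId: sb.PodSandboxId,
		Config: &runtimeapi.ContainerConfig{
			Metadata: &runtimeapi.ContainerMetadata{Name: "kube-proxy", Attempt: 0},
			Image:    &runtimeapi.ImageSpec{Image: "registry.k8s.io/kube-proxy:v1.33.0"}, // placeholder tag
		},
		SandboxConfig: sandboxCfg,
	})
	if err != nil {
		return err
	}

	// 3. StartContainer, matching the "StartContainer ... returns successfully" entries.
	_, err = rt.StartContainer(ctx, &runtimeapi.StartContainerRequest{ContainerId: cc.ContainerId})
	return err
}

func main() {
	// Assumption: containerd's CRI endpoint at its default location, as in the earlier sketch.
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	if err := runPod(context.Background(), runtimeapi.NewRuntimeServiceClient(conn)); err != nil {
		log.Fatal(err)
	}
}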
Sep 12 22:08:03.105949 containerd[1931]: time="2025-09-12T22:08:03.105858666Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-bmdjt,Uid:92db8785-e136-43a0-a22e-2e85da07b066,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"56e4823f66197ddbda6b97bbfb7c6402df603daf68475dedb57cca5523c99a56\"" Sep 12 22:08:03.118147 containerd[1931]: time="2025-09-12T22:08:03.116576370Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 12 22:08:03.153763 containerd[1931]: time="2025-09-12T22:08:03.152770411Z" level=info msg="StartContainer for \"8a325846903b48a6ed6a4791479fa23e7453d3c12600065898ae15d145d2f91b\" returns successfully" Sep 12 22:08:03.842644 kubelet[3235]: I0912 22:08:03.842394 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-k292z" podStartSLOduration=1.842250202 podStartE2EDuration="1.842250202s" podCreationTimestamp="2025-09-12 22:08:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:08:03.255367003 +0000 UTC m=+5.401030456" watchObservedRunningTime="2025-09-12 22:08:03.842250202 +0000 UTC m=+5.987913667" Sep 12 22:08:04.423740 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3734029418.mount: Deactivated successfully. Sep 12 22:08:05.422697 containerd[1931]: time="2025-09-12T22:08:05.422617990Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:05.424079 containerd[1931]: time="2025-09-12T22:08:05.424026718Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365" Sep 12 22:08:05.426163 containerd[1931]: time="2025-09-12T22:08:05.424998586Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:05.428905 containerd[1931]: time="2025-09-12T22:08:05.428858086Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:05.430368 containerd[1931]: time="2025-09-12T22:08:05.430286050Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 2.313647616s" Sep 12 22:08:05.430481 containerd[1931]: time="2025-09-12T22:08:05.430344898Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\"" Sep 12 22:08:05.435876 containerd[1931]: time="2025-09-12T22:08:05.435816250Z" level=info msg="CreateContainer within sandbox \"56e4823f66197ddbda6b97bbfb7c6402df603daf68475dedb57cca5523c99a56\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 12 22:08:05.449928 containerd[1931]: time="2025-09-12T22:08:05.449864062Z" level=info msg="Container 115d954cd4cfdf90462c071508b6436b287edebddc4db71127de933cab1a1ff2: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:08:05.460594 containerd[1931]: time="2025-09-12T22:08:05.460515730Z" level=info 
msg="CreateContainer within sandbox \"56e4823f66197ddbda6b97bbfb7c6402df603daf68475dedb57cca5523c99a56\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"115d954cd4cfdf90462c071508b6436b287edebddc4db71127de933cab1a1ff2\"" Sep 12 22:08:05.463313 containerd[1931]: time="2025-09-12T22:08:05.463229950Z" level=info msg="StartContainer for \"115d954cd4cfdf90462c071508b6436b287edebddc4db71127de933cab1a1ff2\"" Sep 12 22:08:05.465576 containerd[1931]: time="2025-09-12T22:08:05.465228814Z" level=info msg="connecting to shim 115d954cd4cfdf90462c071508b6436b287edebddc4db71127de933cab1a1ff2" address="unix:///run/containerd/s/e820addc2a39748336c63a9a7a6dd025ede3711630398f4343c20392dcb69e12" protocol=ttrpc version=3 Sep 12 22:08:05.506510 systemd[1]: Started cri-containerd-115d954cd4cfdf90462c071508b6436b287edebddc4db71127de933cab1a1ff2.scope - libcontainer container 115d954cd4cfdf90462c071508b6436b287edebddc4db71127de933cab1a1ff2. Sep 12 22:08:05.600174 containerd[1931]: time="2025-09-12T22:08:05.599695451Z" level=info msg="StartContainer for \"115d954cd4cfdf90462c071508b6436b287edebddc4db71127de933cab1a1ff2\" returns successfully" Sep 12 22:08:08.004467 kubelet[3235]: I0912 22:08:08.004253 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-bmdjt" podStartSLOduration=3.687956327 podStartE2EDuration="6.004229819s" podCreationTimestamp="2025-09-12 22:08:02 +0000 UTC" firstStartedPulling="2025-09-12 22:08:03.11587131 +0000 UTC m=+5.261534763" lastFinishedPulling="2025-09-12 22:08:05.432144802 +0000 UTC m=+7.577808255" observedRunningTime="2025-09-12 22:08:06.274273078 +0000 UTC m=+8.419936555" watchObservedRunningTime="2025-09-12 22:08:08.004229819 +0000 UTC m=+10.149893260" Sep 12 22:08:14.303724 sudo[2273]: pam_unix(sudo:session): session closed for user root Sep 12 22:08:14.330908 sshd[2272]: Connection closed by 139.178.89.65 port 37778 Sep 12 22:08:14.331743 sshd-session[2269]: pam_unix(sshd:session): session closed for user core Sep 12 22:08:14.340928 systemd[1]: sshd@6-172.31.25.121:22-139.178.89.65:37778.service: Deactivated successfully. Sep 12 22:08:14.349635 systemd[1]: session-7.scope: Deactivated successfully. Sep 12 22:08:14.350778 systemd[1]: session-7.scope: Consumed 12.675s CPU time, 219.8M memory peak. Sep 12 22:08:14.356598 systemd-logind[1862]: Session 7 logged out. Waiting for processes to exit. Sep 12 22:08:14.365201 systemd-logind[1862]: Removed session 7. Sep 12 22:08:24.998127 kubelet[3235]: W0912 22:08:24.998050 3235 reflector.go:561] object-"calico-system"/"typha-certs": failed to list *v1.Secret: secrets "typha-certs" is forbidden: User "system:node:ip-172-31-25-121" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ip-172-31-25-121' and this object Sep 12 22:08:24.998978 kubelet[3235]: E0912 22:08:24.998301 3235 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"typha-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"typha-certs\" is forbidden: User \"system:node:ip-172-31-25-121\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ip-172-31-25-121' and this object" logger="UnhandledError" Sep 12 22:08:25.000606 systemd[1]: Created slice kubepods-besteffort-pod9ef7ce93_3db8_4309_9718_589cf8396dca.slice - libcontainer container kubepods-besteffort-pod9ef7ce93_3db8_4309_9718_589cf8396dca.slice. 
Sep 12 22:08:25.003691 kubelet[3235]: W0912 22:08:25.002613 3235 reflector.go:561] object-"calico-system"/"tigera-ca-bundle": failed to list *v1.ConfigMap: configmaps "tigera-ca-bundle" is forbidden: User "system:node:ip-172-31-25-121" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ip-172-31-25-121' and this object Sep 12 22:08:25.003691 kubelet[3235]: E0912 22:08:25.002679 3235 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"tigera-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"tigera-ca-bundle\" is forbidden: User \"system:node:ip-172-31-25-121\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ip-172-31-25-121' and this object" logger="UnhandledError" Sep 12 22:08:25.003691 kubelet[3235]: W0912 22:08:25.002773 3235 reflector.go:561] object-"calico-system"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ip-172-31-25-121" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ip-172-31-25-121' and this object Sep 12 22:08:25.003691 kubelet[3235]: E0912 22:08:25.002800 3235 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ip-172-31-25-121\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ip-172-31-25-121' and this object" logger="UnhandledError" Sep 12 22:08:25.053602 kubelet[3235]: I0912 22:08:25.053520 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9zfv\" (UniqueName: \"kubernetes.io/projected/9ef7ce93-3db8-4309-9718-589cf8396dca-kube-api-access-m9zfv\") pod \"calico-typha-585bb46969-jdpjr\" (UID: \"9ef7ce93-3db8-4309-9718-589cf8396dca\") " pod="calico-system/calico-typha-585bb46969-jdpjr" Sep 12 22:08:25.053602 kubelet[3235]: I0912 22:08:25.053589 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/9ef7ce93-3db8-4309-9718-589cf8396dca-typha-certs\") pod \"calico-typha-585bb46969-jdpjr\" (UID: \"9ef7ce93-3db8-4309-9718-589cf8396dca\") " pod="calico-system/calico-typha-585bb46969-jdpjr" Sep 12 22:08:25.053901 kubelet[3235]: I0912 22:08:25.053657 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ef7ce93-3db8-4309-9718-589cf8396dca-tigera-ca-bundle\") pod \"calico-typha-585bb46969-jdpjr\" (UID: \"9ef7ce93-3db8-4309-9718-589cf8396dca\") " pod="calico-system/calico-typha-585bb46969-jdpjr" Sep 12 22:08:25.247511 systemd[1]: Created slice kubepods-besteffort-pode334e072_c69d_4c1e_aa37_4e31e49dfbe6.slice - libcontainer container kubepods-besteffort-pode334e072_c69d_4c1e_aa37_4e31e49dfbe6.slice. 
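Editor's note: further down, the kubelet repeatedly logs FlexVolume probe failures for the nodeagent~uds driver. The uds binary is not present on this image ("executable file not found in $PATH"), so each driver call produces no output, and decoding the expected init JSON then fails with "unexpected end of JSON input". That error string is exactly what Go's encoding/json returns for empty input, as the small sketch below shows; DriverStatus here is a simplified stand-in for the structure a FlexVolume init call is expected to print, not the kubelet's actual type.

package main

import (
	"encoding/json"
	"fmt"
)

// DriverStatus is a simplified stand-in for the JSON a FlexVolume driver
// is expected to print in response to "init", e.g.
// {"status":"Success","capabilities":{"attach":false}}.
type DriverStatus struct {
	Status       string          `json:"status"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	// The driver binary was never found, so its "output" is the empty string.
	output := ""

	var st DriverStatus
	if err := json.Unmarshal([]byte(output), &st); err != nil {
		// Prints: unmarshal failed: unexpected end of JSON input
		fmt.Println("unmarshal failed:", err)
	}
}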
Sep 12 22:08:25.256984 kubelet[3235]: I0912 22:08:25.256176 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/e334e072-c69d-4c1e-aa37-4e31e49dfbe6-cni-bin-dir\") pod \"calico-node-qpcp8\" (UID: \"e334e072-c69d-4c1e-aa37-4e31e49dfbe6\") " pod="calico-system/calico-node-qpcp8" Sep 12 22:08:25.256984 kubelet[3235]: I0912 22:08:25.256346 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e334e072-c69d-4c1e-aa37-4e31e49dfbe6-lib-modules\") pod \"calico-node-qpcp8\" (UID: \"e334e072-c69d-4c1e-aa37-4e31e49dfbe6\") " pod="calico-system/calico-node-qpcp8" Sep 12 22:08:25.256984 kubelet[3235]: I0912 22:08:25.256386 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e334e072-c69d-4c1e-aa37-4e31e49dfbe6-tigera-ca-bundle\") pod \"calico-node-qpcp8\" (UID: \"e334e072-c69d-4c1e-aa37-4e31e49dfbe6\") " pod="calico-system/calico-node-qpcp8" Sep 12 22:08:25.256984 kubelet[3235]: I0912 22:08:25.256424 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfwxk\" (UniqueName: \"kubernetes.io/projected/e334e072-c69d-4c1e-aa37-4e31e49dfbe6-kube-api-access-tfwxk\") pod \"calico-node-qpcp8\" (UID: \"e334e072-c69d-4c1e-aa37-4e31e49dfbe6\") " pod="calico-system/calico-node-qpcp8" Sep 12 22:08:25.256984 kubelet[3235]: I0912 22:08:25.256473 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e334e072-c69d-4c1e-aa37-4e31e49dfbe6-var-lib-calico\") pod \"calico-node-qpcp8\" (UID: \"e334e072-c69d-4c1e-aa37-4e31e49dfbe6\") " pod="calico-system/calico-node-qpcp8" Sep 12 22:08:25.257387 kubelet[3235]: I0912 22:08:25.256509 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/e334e072-c69d-4c1e-aa37-4e31e49dfbe6-var-run-calico\") pod \"calico-node-qpcp8\" (UID: \"e334e072-c69d-4c1e-aa37-4e31e49dfbe6\") " pod="calico-system/calico-node-qpcp8" Sep 12 22:08:25.257387 kubelet[3235]: I0912 22:08:25.256555 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/e334e072-c69d-4c1e-aa37-4e31e49dfbe6-cni-net-dir\") pod \"calico-node-qpcp8\" (UID: \"e334e072-c69d-4c1e-aa37-4e31e49dfbe6\") " pod="calico-system/calico-node-qpcp8" Sep 12 22:08:25.257387 kubelet[3235]: I0912 22:08:25.256598 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/e334e072-c69d-4c1e-aa37-4e31e49dfbe6-node-certs\") pod \"calico-node-qpcp8\" (UID: \"e334e072-c69d-4c1e-aa37-4e31e49dfbe6\") " pod="calico-system/calico-node-qpcp8" Sep 12 22:08:25.257387 kubelet[3235]: I0912 22:08:25.256663 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e334e072-c69d-4c1e-aa37-4e31e49dfbe6-xtables-lock\") pod \"calico-node-qpcp8\" (UID: \"e334e072-c69d-4c1e-aa37-4e31e49dfbe6\") " pod="calico-system/calico-node-qpcp8" Sep 12 22:08:25.257387 kubelet[3235]: I0912 22:08:25.256719 3235 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/e334e072-c69d-4c1e-aa37-4e31e49dfbe6-cni-log-dir\") pod \"calico-node-qpcp8\" (UID: \"e334e072-c69d-4c1e-aa37-4e31e49dfbe6\") " pod="calico-system/calico-node-qpcp8" Sep 12 22:08:25.257627 kubelet[3235]: I0912 22:08:25.256756 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/e334e072-c69d-4c1e-aa37-4e31e49dfbe6-flexvol-driver-host\") pod \"calico-node-qpcp8\" (UID: \"e334e072-c69d-4c1e-aa37-4e31e49dfbe6\") " pod="calico-system/calico-node-qpcp8" Sep 12 22:08:25.257627 kubelet[3235]: I0912 22:08:25.256794 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/e334e072-c69d-4c1e-aa37-4e31e49dfbe6-policysync\") pod \"calico-node-qpcp8\" (UID: \"e334e072-c69d-4c1e-aa37-4e31e49dfbe6\") " pod="calico-system/calico-node-qpcp8" Sep 12 22:08:25.374333 kubelet[3235]: E0912 22:08:25.372461 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.374333 kubelet[3235]: W0912 22:08:25.374178 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.374333 kubelet[3235]: E0912 22:08:25.374250 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.459371 kubelet[3235]: E0912 22:08:25.459318 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.459371 kubelet[3235]: W0912 22:08:25.459359 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.459565 kubelet[3235]: E0912 22:08:25.459391 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.459798 kubelet[3235]: E0912 22:08:25.459758 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.459798 kubelet[3235]: W0912 22:08:25.459789 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.459927 kubelet[3235]: E0912 22:08:25.459812 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:25.461589 kubelet[3235]: E0912 22:08:25.461532 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.461589 kubelet[3235]: W0912 22:08:25.461574 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.461789 kubelet[3235]: E0912 22:08:25.461607 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.462011 kubelet[3235]: E0912 22:08:25.461971 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.462011 kubelet[3235]: W0912 22:08:25.462003 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.462161 kubelet[3235]: E0912 22:08:25.462026 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.464379 kubelet[3235]: E0912 22:08:25.463392 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.464379 kubelet[3235]: W0912 22:08:25.463427 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.464379 kubelet[3235]: E0912 22:08:25.463460 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.562762 kubelet[3235]: E0912 22:08:25.562380 3235 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-npwnh" podUID="746510a2-33f4-43d1-a7e4-66416d6cce62" Sep 12 22:08:25.564747 kubelet[3235]: E0912 22:08:25.564701 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.564747 kubelet[3235]: W0912 22:08:25.564738 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.564939 kubelet[3235]: E0912 22:08:25.564769 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:25.565781 kubelet[3235]: E0912 22:08:25.565191 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.565781 kubelet[3235]: W0912 22:08:25.565220 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.565781 kubelet[3235]: E0912 22:08:25.565244 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.566909 kubelet[3235]: E0912 22:08:25.566840 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.566909 kubelet[3235]: W0912 22:08:25.566879 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.567082 kubelet[3235]: E0912 22:08:25.566912 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.568426 kubelet[3235]: E0912 22:08:25.568374 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.568426 kubelet[3235]: W0912 22:08:25.568414 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.569153 kubelet[3235]: E0912 22:08:25.568446 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.569911 kubelet[3235]: E0912 22:08:25.569864 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.569911 kubelet[3235]: W0912 22:08:25.569901 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.570745 kubelet[3235]: E0912 22:08:25.569933 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.641087 kubelet[3235]: E0912 22:08:25.641048 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.641367 kubelet[3235]: W0912 22:08:25.641335 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.641723 kubelet[3235]: E0912 22:08:25.641478 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:25.644312 kubelet[3235]: E0912 22:08:25.644133 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.644312 kubelet[3235]: W0912 22:08:25.644174 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.644312 kubelet[3235]: E0912 22:08:25.644208 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.644972 kubelet[3235]: E0912 22:08:25.644937 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.645284 kubelet[3235]: W0912 22:08:25.645149 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.645284 kubelet[3235]: E0912 22:08:25.645188 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.645710 kubelet[3235]: E0912 22:08:25.645683 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.645841 kubelet[3235]: W0912 22:08:25.645816 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.645947 kubelet[3235]: E0912 22:08:25.645924 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.646422 kubelet[3235]: E0912 22:08:25.646398 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.646678 kubelet[3235]: W0912 22:08:25.646568 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.646678 kubelet[3235]: E0912 22:08:25.646602 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.647103 kubelet[3235]: E0912 22:08:25.647077 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.647367 kubelet[3235]: W0912 22:08:25.647247 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.647367 kubelet[3235]: E0912 22:08:25.647283 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:25.648872 kubelet[3235]: E0912 22:08:25.648088 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.649372 kubelet[3235]: W0912 22:08:25.649077 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.649372 kubelet[3235]: E0912 22:08:25.649176 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.649682 kubelet[3235]: E0912 22:08:25.649658 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.649795 kubelet[3235]: W0912 22:08:25.649770 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.650019 kubelet[3235]: E0912 22:08:25.649905 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.650612 kubelet[3235]: E0912 22:08:25.650477 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.650612 kubelet[3235]: W0912 22:08:25.650505 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.650612 kubelet[3235]: E0912 22:08:25.650531 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.651930 kubelet[3235]: E0912 22:08:25.651773 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.651930 kubelet[3235]: W0912 22:08:25.651804 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.651930 kubelet[3235]: E0912 22:08:25.651842 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.652700 kubelet[3235]: E0912 22:08:25.652560 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.652700 kubelet[3235]: W0912 22:08:25.652588 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.652700 kubelet[3235]: E0912 22:08:25.652616 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:25.654165 kubelet[3235]: E0912 22:08:25.654088 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.654504 kubelet[3235]: W0912 22:08:25.654339 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.654504 kubelet[3235]: E0912 22:08:25.654385 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.655101 kubelet[3235]: E0912 22:08:25.654950 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.655101 kubelet[3235]: W0912 22:08:25.654975 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.655101 kubelet[3235]: E0912 22:08:25.655000 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.655808 kubelet[3235]: E0912 22:08:25.655658 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.655808 kubelet[3235]: W0912 22:08:25.655686 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.655808 kubelet[3235]: E0912 22:08:25.655713 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.656833 kubelet[3235]: E0912 22:08:25.656659 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.656833 kubelet[3235]: W0912 22:08:25.656697 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.656833 kubelet[3235]: E0912 22:08:25.656729 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.658740 kubelet[3235]: E0912 22:08:25.658686 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.659059 kubelet[3235]: W0912 22:08:25.658881 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.659059 kubelet[3235]: E0912 22:08:25.658921 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:25.660318 kubelet[3235]: E0912 22:08:25.660286 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.660629 kubelet[3235]: W0912 22:08:25.660494 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.660629 kubelet[3235]: E0912 22:08:25.660535 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.661157 kubelet[3235]: E0912 22:08:25.661089 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.661743 kubelet[3235]: W0912 22:08:25.661600 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.661743 kubelet[3235]: E0912 22:08:25.661644 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.662387 kubelet[3235]: E0912 22:08:25.662242 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.662387 kubelet[3235]: W0912 22:08:25.662270 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.662387 kubelet[3235]: E0912 22:08:25.662298 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.663150 kubelet[3235]: E0912 22:08:25.662946 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.663150 kubelet[3235]: W0912 22:08:25.662974 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.663150 kubelet[3235]: E0912 22:08:25.663001 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.671329 kubelet[3235]: E0912 22:08:25.671226 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.671329 kubelet[3235]: W0912 22:08:25.671259 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.671329 kubelet[3235]: E0912 22:08:25.671292 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:25.671980 kubelet[3235]: I0912 22:08:25.671648 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/746510a2-33f4-43d1-a7e4-66416d6cce62-registration-dir\") pod \"csi-node-driver-npwnh\" (UID: \"746510a2-33f4-43d1-a7e4-66416d6cce62\") " pod="calico-system/csi-node-driver-npwnh" Sep 12 22:08:25.672248 kubelet[3235]: E0912 22:08:25.672223 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.672358 kubelet[3235]: W0912 22:08:25.672334 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.672485 kubelet[3235]: E0912 22:08:25.672462 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.673274 kubelet[3235]: E0912 22:08:25.673094 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.673274 kubelet[3235]: W0912 22:08:25.673148 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.673274 kubelet[3235]: E0912 22:08:25.673193 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.673274 kubelet[3235]: I0912 22:08:25.673235 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/746510a2-33f4-43d1-a7e4-66416d6cce62-socket-dir\") pod \"csi-node-driver-npwnh\" (UID: \"746510a2-33f4-43d1-a7e4-66416d6cce62\") " pod="calico-system/csi-node-driver-npwnh" Sep 12 22:08:25.674366 kubelet[3235]: E0912 22:08:25.674317 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.674366 kubelet[3235]: W0912 22:08:25.674355 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.674688 kubelet[3235]: E0912 22:08:25.674389 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.675416 kubelet[3235]: E0912 22:08:25.675349 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.675416 kubelet[3235]: W0912 22:08:25.675380 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.676179 kubelet[3235]: E0912 22:08:25.676021 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:25.676711 kubelet[3235]: E0912 22:08:25.676648 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.676711 kubelet[3235]: W0912 22:08:25.676677 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.678189 kubelet[3235]: E0912 22:08:25.677767 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.678459 kubelet[3235]: E0912 22:08:25.678431 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.678577 kubelet[3235]: W0912 22:08:25.678552 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.679608 kubelet[3235]: E0912 22:08:25.679350 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.679608 kubelet[3235]: I0912 22:08:25.679417 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/746510a2-33f4-43d1-a7e4-66416d6cce62-kubelet-dir\") pod \"csi-node-driver-npwnh\" (UID: \"746510a2-33f4-43d1-a7e4-66416d6cce62\") " pod="calico-system/csi-node-driver-npwnh" Sep 12 22:08:25.680604 kubelet[3235]: E0912 22:08:25.680354 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.680604 kubelet[3235]: W0912 22:08:25.680387 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.680604 kubelet[3235]: E0912 22:08:25.680418 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.681467 kubelet[3235]: E0912 22:08:25.681235 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.681467 kubelet[3235]: W0912 22:08:25.681263 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.681467 kubelet[3235]: E0912 22:08:25.681289 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:25.682857 kubelet[3235]: E0912 22:08:25.682681 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.683676 kubelet[3235]: W0912 22:08:25.683354 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.683676 kubelet[3235]: E0912 22:08:25.683404 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.684242 kubelet[3235]: E0912 22:08:25.684218 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.684816 kubelet[3235]: W0912 22:08:25.684518 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.684816 kubelet[3235]: E0912 22:08:25.684550 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.684816 kubelet[3235]: I0912 22:08:25.684588 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/746510a2-33f4-43d1-a7e4-66416d6cce62-varrun\") pod \"csi-node-driver-npwnh\" (UID: \"746510a2-33f4-43d1-a7e4-66416d6cce62\") " pod="calico-system/csi-node-driver-npwnh" Sep 12 22:08:25.685861 kubelet[3235]: E0912 22:08:25.685799 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.685861 kubelet[3235]: W0912 22:08:25.685829 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.686092 kubelet[3235]: E0912 22:08:25.686069 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.686590 kubelet[3235]: E0912 22:08:25.686540 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.686590 kubelet[3235]: W0912 22:08:25.686562 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.686871 kubelet[3235]: E0912 22:08:25.686826 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:25.687313 kubelet[3235]: E0912 22:08:25.687261 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.687313 kubelet[3235]: W0912 22:08:25.687284 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.687628 kubelet[3235]: E0912 22:08:25.687580 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.687992 kubelet[3235]: E0912 22:08:25.687970 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.688141 kubelet[3235]: W0912 22:08:25.688075 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.688450 kubelet[3235]: E0912 22:08:25.688406 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.688918 kubelet[3235]: E0912 22:08:25.688819 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.688918 kubelet[3235]: W0912 22:08:25.688847 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.688918 kubelet[3235]: E0912 22:08:25.688874 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.690032 kubelet[3235]: E0912 22:08:25.689937 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.690032 kubelet[3235]: W0912 22:08:25.689968 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.690032 kubelet[3235]: E0912 22:08:25.689998 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.691430 kubelet[3235]: E0912 22:08:25.691325 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.691430 kubelet[3235]: W0912 22:08:25.691359 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.691430 kubelet[3235]: E0912 22:08:25.691389 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:25.692188 kubelet[3235]: I0912 22:08:25.691791 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k95f9\" (UniqueName: \"kubernetes.io/projected/746510a2-33f4-43d1-a7e4-66416d6cce62-kube-api-access-k95f9\") pod \"csi-node-driver-npwnh\" (UID: \"746510a2-33f4-43d1-a7e4-66416d6cce62\") " pod="calico-system/csi-node-driver-npwnh" Sep 12 22:08:25.692456 kubelet[3235]: E0912 22:08:25.692431 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.692722 kubelet[3235]: W0912 22:08:25.692594 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.692722 kubelet[3235]: E0912 22:08:25.692634 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.693440 kubelet[3235]: E0912 22:08:25.693338 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.693440 kubelet[3235]: W0912 22:08:25.693368 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.693440 kubelet[3235]: E0912 22:08:25.693396 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.794867 kubelet[3235]: E0912 22:08:25.794760 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.794867 kubelet[3235]: W0912 22:08:25.794799 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.794867 kubelet[3235]: E0912 22:08:25.794830 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.796274 kubelet[3235]: E0912 22:08:25.796226 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.796541 kubelet[3235]: W0912 22:08:25.796419 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.796541 kubelet[3235]: E0912 22:08:25.796480 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:25.798973 kubelet[3235]: E0912 22:08:25.798904 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.798973 kubelet[3235]: W0912 22:08:25.798936 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.799397 kubelet[3235]: E0912 22:08:25.799274 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.799883 kubelet[3235]: E0912 22:08:25.799820 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.799883 kubelet[3235]: W0912 22:08:25.799849 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.800214 kubelet[3235]: E0912 22:08:25.800069 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.800693 kubelet[3235]: E0912 22:08:25.800636 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.800693 kubelet[3235]: W0912 22:08:25.800663 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.801135 kubelet[3235]: E0912 22:08:25.800834 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.802301 kubelet[3235]: E0912 22:08:25.802228 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.802301 kubelet[3235]: W0912 22:08:25.802261 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.802922 kubelet[3235]: E0912 22:08:25.802647 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.804187 kubelet[3235]: E0912 22:08:25.804143 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.804507 kubelet[3235]: W0912 22:08:25.804389 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.805022 kubelet[3235]: E0912 22:08:25.804626 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:25.806786 kubelet[3235]: E0912 22:08:25.806717 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.806786 kubelet[3235]: W0912 22:08:25.806748 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.807281 kubelet[3235]: E0912 22:08:25.807164 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.808162 kubelet[3235]: E0912 22:08:25.807886 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.808162 kubelet[3235]: W0912 22:08:25.807952 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.808997 kubelet[3235]: E0912 22:08:25.808925 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.810159 kubelet[3235]: E0912 22:08:25.809549 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.810561 kubelet[3235]: W0912 22:08:25.810313 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.810561 kubelet[3235]: E0912 22:08:25.810400 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.810907 kubelet[3235]: E0912 22:08:25.810861 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.811194 kubelet[3235]: W0912 22:08:25.811162 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.811394 kubelet[3235]: E0912 22:08:25.811351 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.812137 kubelet[3235]: E0912 22:08:25.812034 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.812137 kubelet[3235]: W0912 22:08:25.812067 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.812325 kubelet[3235]: E0912 22:08:25.812181 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:25.812910 kubelet[3235]: E0912 22:08:25.812795 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.812910 kubelet[3235]: W0912 22:08:25.812829 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.813669 kubelet[3235]: E0912 22:08:25.813622 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.817272 kubelet[3235]: E0912 22:08:25.817231 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.817272 kubelet[3235]: W0912 22:08:25.817267 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.817613 kubelet[3235]: E0912 22:08:25.817484 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.817869 kubelet[3235]: E0912 22:08:25.817840 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.817939 kubelet[3235]: W0912 22:08:25.817868 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.817989 kubelet[3235]: E0912 22:08:25.817976 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.819079 kubelet[3235]: E0912 22:08:25.819048 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.819079 kubelet[3235]: W0912 22:08:25.819076 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.819261 kubelet[3235]: E0912 22:08:25.819203 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.821287 kubelet[3235]: E0912 22:08:25.821245 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.821287 kubelet[3235]: W0912 22:08:25.821282 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.821475 kubelet[3235]: E0912 22:08:25.821330 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:25.822001 kubelet[3235]: E0912 22:08:25.821970 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.822085 kubelet[3235]: W0912 22:08:25.821999 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.822085 kubelet[3235]: E0912 22:08:25.822189 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.824414 kubelet[3235]: E0912 22:08:25.824341 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.824414 kubelet[3235]: W0912 22:08:25.824379 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.824414 kubelet[3235]: E0912 22:08:25.824412 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.825406 kubelet[3235]: E0912 22:08:25.824750 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.825406 kubelet[3235]: W0912 22:08:25.824784 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.825406 kubelet[3235]: E0912 22:08:25.824807 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.825406 kubelet[3235]: E0912 22:08:25.825307 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.825406 kubelet[3235]: W0912 22:08:25.825327 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.825406 kubelet[3235]: E0912 22:08:25.825364 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.826744 kubelet[3235]: E0912 22:08:25.826363 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.826744 kubelet[3235]: W0912 22:08:25.826390 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.826744 kubelet[3235]: E0912 22:08:25.826434 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:25.828019 kubelet[3235]: E0912 22:08:25.827965 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.828019 kubelet[3235]: W0912 22:08:25.828003 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.828335 kubelet[3235]: E0912 22:08:25.828187 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.828388 kubelet[3235]: E0912 22:08:25.828371 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.828474 kubelet[3235]: W0912 22:08:25.828385 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.828474 kubelet[3235]: E0912 22:08:25.828422 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.828889 kubelet[3235]: E0912 22:08:25.828860 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.828970 kubelet[3235]: W0912 22:08:25.828889 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.828970 kubelet[3235]: E0912 22:08:25.828929 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.830607 kubelet[3235]: E0912 22:08:25.830567 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.830607 kubelet[3235]: W0912 22:08:25.830603 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.831104 kubelet[3235]: E0912 22:08:25.830650 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.831440 kubelet[3235]: E0912 22:08:25.831387 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.831440 kubelet[3235]: W0912 22:08:25.831427 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.831754 kubelet[3235]: E0912 22:08:25.831491 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:25.832474 kubelet[3235]: E0912 22:08:25.832428 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.832474 kubelet[3235]: W0912 22:08:25.832461 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.832612 kubelet[3235]: E0912 22:08:25.832492 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.834156 kubelet[3235]: E0912 22:08:25.834061 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.834156 kubelet[3235]: W0912 22:08:25.834102 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.834319 kubelet[3235]: E0912 22:08:25.834182 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.835622 kubelet[3235]: E0912 22:08:25.835555 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.835622 kubelet[3235]: W0912 22:08:25.835592 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.835622 kubelet[3235]: E0912 22:08:25.835624 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.930432 kubelet[3235]: E0912 22:08:25.929387 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.930432 kubelet[3235]: W0912 22:08:25.929427 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.930432 kubelet[3235]: E0912 22:08:25.929467 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.930432 kubelet[3235]: E0912 22:08:25.929817 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.930432 kubelet[3235]: W0912 22:08:25.929836 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.930432 kubelet[3235]: E0912 22:08:25.929857 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:25.930432 kubelet[3235]: E0912 22:08:25.930198 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.930432 kubelet[3235]: W0912 22:08:25.930217 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.930432 kubelet[3235]: E0912 22:08:25.930238 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.930958 kubelet[3235]: E0912 22:08:25.930579 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.930958 kubelet[3235]: W0912 22:08:25.930597 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.930958 kubelet[3235]: E0912 22:08:25.930618 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.930958 kubelet[3235]: E0912 22:08:25.930881 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.930958 kubelet[3235]: W0912 22:08:25.930896 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.930958 kubelet[3235]: E0912 22:08:25.930913 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.931861 kubelet[3235]: E0912 22:08:25.931187 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.931861 kubelet[3235]: W0912 22:08:25.931205 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.931861 kubelet[3235]: E0912 22:08:25.931229 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:25.939311 kubelet[3235]: E0912 22:08:25.939262 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:25.939311 kubelet[3235]: W0912 22:08:25.939301 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:25.939496 kubelet[3235]: E0912 22:08:25.939341 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:26.032948 kubelet[3235]: E0912 22:08:26.032904 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.032948 kubelet[3235]: W0912 22:08:26.032940 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.033593 kubelet[3235]: E0912 22:08:26.032972 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.034215 kubelet[3235]: E0912 22:08:26.034164 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.034215 kubelet[3235]: W0912 22:08:26.034202 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.034381 kubelet[3235]: E0912 22:08:26.034235 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.036064 kubelet[3235]: E0912 22:08:26.036020 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.036064 kubelet[3235]: W0912 22:08:26.036059 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.036579 kubelet[3235]: E0912 22:08:26.036092 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.036929 kubelet[3235]: E0912 22:08:26.036860 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.036929 kubelet[3235]: W0912 22:08:26.036894 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.037071 kubelet[3235]: E0912 22:08:26.036938 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.037496 kubelet[3235]: E0912 22:08:26.037465 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.037656 kubelet[3235]: W0912 22:08:26.037495 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.037656 kubelet[3235]: E0912 22:08:26.037519 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:26.140245 kubelet[3235]: E0912 22:08:26.138704 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.140245 kubelet[3235]: W0912 22:08:26.138741 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.140245 kubelet[3235]: E0912 22:08:26.138771 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.140245 kubelet[3235]: E0912 22:08:26.139516 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.140245 kubelet[3235]: W0912 22:08:26.139545 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.140245 kubelet[3235]: E0912 22:08:26.139574 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.140245 kubelet[3235]: E0912 22:08:26.140192 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.140245 kubelet[3235]: W0912 22:08:26.140216 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.140245 kubelet[3235]: E0912 22:08:26.140242 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.142613 kubelet[3235]: E0912 22:08:26.140932 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.142613 kubelet[3235]: W0912 22:08:26.140966 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.142613 kubelet[3235]: E0912 22:08:26.140992 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.142613 kubelet[3235]: E0912 22:08:26.141560 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.142613 kubelet[3235]: W0912 22:08:26.141579 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.142613 kubelet[3235]: E0912 22:08:26.141600 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:26.156038 kubelet[3235]: E0912 22:08:26.155718 3235 configmap.go:193] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Sep 12 22:08:26.158276 kubelet[3235]: E0912 22:08:26.156331 3235 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9ef7ce93-3db8-4309-9718-589cf8396dca-tigera-ca-bundle podName:9ef7ce93-3db8-4309-9718-589cf8396dca nodeName:}" failed. No retries permitted until 2025-09-12 22:08:26.656291629 +0000 UTC m=+28.801955070 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/9ef7ce93-3db8-4309-9718-589cf8396dca-tigera-ca-bundle") pod "calico-typha-585bb46969-jdpjr" (UID: "9ef7ce93-3db8-4309-9718-589cf8396dca") : failed to sync configmap cache: timed out waiting for the condition Sep 12 22:08:26.169486 kubelet[3235]: E0912 22:08:26.169437 3235 projected.go:288] Couldn't get configMap calico-system/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Sep 12 22:08:26.170455 kubelet[3235]: E0912 22:08:26.170373 3235 projected.go:194] Error preparing data for projected volume kube-api-access-m9zfv for pod calico-system/calico-typha-585bb46969-jdpjr: failed to sync configmap cache: timed out waiting for the condition Sep 12 22:08:26.170983 kubelet[3235]: E0912 22:08:26.170954 3235 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ef7ce93-3db8-4309-9718-589cf8396dca-kube-api-access-m9zfv podName:9ef7ce93-3db8-4309-9718-589cf8396dca nodeName:}" failed. No retries permitted until 2025-09-12 22:08:26.670646845 +0000 UTC m=+28.816310298 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-m9zfv" (UniqueName: "kubernetes.io/projected/9ef7ce93-3db8-4309-9718-589cf8396dca-kube-api-access-m9zfv") pod "calico-typha-585bb46969-jdpjr" (UID: "9ef7ce93-3db8-4309-9718-589cf8396dca") : failed to sync configmap cache: timed out waiting for the condition Sep 12 22:08:26.180586 kubelet[3235]: E0912 22:08:26.180547 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.180869 kubelet[3235]: W0912 22:08:26.180836 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.181069 kubelet[3235]: E0912 22:08:26.180999 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.243394 kubelet[3235]: E0912 22:08:26.243332 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.243394 kubelet[3235]: W0912 22:08:26.243374 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.243586 kubelet[3235]: E0912 22:08:26.243409 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:26.244036 kubelet[3235]: E0912 22:08:26.243958 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.244036 kubelet[3235]: W0912 22:08:26.244016 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.244308 kubelet[3235]: E0912 22:08:26.244044 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.244630 kubelet[3235]: E0912 22:08:26.244556 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.244630 kubelet[3235]: W0912 22:08:26.244612 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.244749 kubelet[3235]: E0912 22:08:26.244638 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.245153 kubelet[3235]: E0912 22:08:26.245083 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.245230 kubelet[3235]: W0912 22:08:26.245152 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.245230 kubelet[3235]: E0912 22:08:26.245180 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.346885 kubelet[3235]: E0912 22:08:26.346661 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.346885 kubelet[3235]: W0912 22:08:26.346693 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.346885 kubelet[3235]: E0912 22:08:26.346721 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.347264 kubelet[3235]: E0912 22:08:26.347241 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.347576 kubelet[3235]: W0912 22:08:26.347367 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.347576 kubelet[3235]: E0912 22:08:26.347397 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:26.347807 kubelet[3235]: E0912 22:08:26.347788 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.348061 kubelet[3235]: W0912 22:08:26.347881 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.348061 kubelet[3235]: E0912 22:08:26.347907 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.348330 kubelet[3235]: E0912 22:08:26.348275 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.348424 kubelet[3235]: W0912 22:08:26.348404 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.348588 kubelet[3235]: E0912 22:08:26.348514 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.370699 kubelet[3235]: E0912 22:08:26.369209 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.370699 kubelet[3235]: W0912 22:08:26.369249 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.370699 kubelet[3235]: E0912 22:08:26.369281 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.379277 kubelet[3235]: E0912 22:08:26.379229 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.379277 kubelet[3235]: W0912 22:08:26.379267 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.379494 kubelet[3235]: E0912 22:08:26.379298 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.450643 kubelet[3235]: E0912 22:08:26.449322 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.450643 kubelet[3235]: W0912 22:08:26.449358 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.450643 kubelet[3235]: E0912 22:08:26.449388 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:26.450643 kubelet[3235]: E0912 22:08:26.449751 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.450643 kubelet[3235]: W0912 22:08:26.449767 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.450643 kubelet[3235]: E0912 22:08:26.449786 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.457874 containerd[1931]: time="2025-09-12T22:08:26.457807290Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-qpcp8,Uid:e334e072-c69d-4c1e-aa37-4e31e49dfbe6,Namespace:calico-system,Attempt:0,}" Sep 12 22:08:26.501799 containerd[1931]: time="2025-09-12T22:08:26.501542215Z" level=info msg="connecting to shim 9af60fc08358bb67bdd0ea67254e158c8faf2282c76167129b87984fb5baba41" address="unix:///run/containerd/s/9c2cdc7feefd01211528a8c678cf3242f1231636352a568be93eedd957648abe" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:08:26.552710 kubelet[3235]: E0912 22:08:26.551861 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.552710 kubelet[3235]: W0912 22:08:26.552324 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.552710 kubelet[3235]: E0912 22:08:26.552361 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.554389 kubelet[3235]: E0912 22:08:26.554355 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.555000 kubelet[3235]: W0912 22:08:26.554819 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.555000 kubelet[3235]: E0912 22:08:26.554870 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.585434 systemd[1]: Started cri-containerd-9af60fc08358bb67bdd0ea67254e158c8faf2282c76167129b87984fb5baba41.scope - libcontainer container 9af60fc08358bb67bdd0ea67254e158c8faf2282c76167129b87984fb5baba41. Sep 12 22:08:26.656364 kubelet[3235]: E0912 22:08:26.656258 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.656364 kubelet[3235]: W0912 22:08:26.656290 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.656364 kubelet[3235]: E0912 22:08:26.656319 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:26.657271 kubelet[3235]: E0912 22:08:26.657189 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.657271 kubelet[3235]: W0912 22:08:26.657238 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.657625 kubelet[3235]: E0912 22:08:26.657484 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.658136 kubelet[3235]: E0912 22:08:26.657990 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.658136 kubelet[3235]: W0912 22:08:26.658042 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.658136 kubelet[3235]: E0912 22:08:26.658068 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.658973 kubelet[3235]: E0912 22:08:26.658766 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.658973 kubelet[3235]: W0912 22:08:26.658806 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.658973 kubelet[3235]: E0912 22:08:26.658833 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.659412 kubelet[3235]: E0912 22:08:26.659350 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.659412 kubelet[3235]: W0912 22:08:26.659374 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.659656 kubelet[3235]: E0912 22:08:26.659578 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.660255 kubelet[3235]: E0912 22:08:26.660210 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.660524 kubelet[3235]: W0912 22:08:26.660452 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.660524 kubelet[3235]: E0912 22:08:26.660487 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:26.662594 kubelet[3235]: E0912 22:08:26.662556 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.662847 kubelet[3235]: W0912 22:08:26.662757 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.662847 kubelet[3235]: E0912 22:08:26.662797 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.668400 containerd[1931]: time="2025-09-12T22:08:26.668263555Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-qpcp8,Uid:e334e072-c69d-4c1e-aa37-4e31e49dfbe6,Namespace:calico-system,Attempt:0,} returns sandbox id \"9af60fc08358bb67bdd0ea67254e158c8faf2282c76167129b87984fb5baba41\"" Sep 12 22:08:26.673256 containerd[1931]: time="2025-09-12T22:08:26.673191343Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 12 22:08:26.760482 kubelet[3235]: E0912 22:08:26.760078 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.760482 kubelet[3235]: W0912 22:08:26.760147 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.760482 kubelet[3235]: E0912 22:08:26.760198 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.761048 kubelet[3235]: E0912 22:08:26.761017 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.761276 kubelet[3235]: W0912 22:08:26.761203 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.761276 kubelet[3235]: E0912 22:08:26.761241 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.762170 kubelet[3235]: E0912 22:08:26.762078 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.762420 kubelet[3235]: W0912 22:08:26.762269 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.762420 kubelet[3235]: E0912 22:08:26.762305 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:08:26.763076 kubelet[3235]: E0912 22:08:26.763052 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.763323 kubelet[3235]: W0912 22:08:26.763249 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.763323 kubelet[3235]: E0912 22:08:26.763279 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.763830 kubelet[3235]: E0912 22:08:26.763810 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.764015 kubelet[3235]: W0912 22:08:26.763940 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.764015 kubelet[3235]: E0912 22:08:26.763969 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.780339 kubelet[3235]: E0912 22:08:26.780289 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:08:26.780339 kubelet[3235]: W0912 22:08:26.780328 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:08:26.780563 kubelet[3235]: E0912 22:08:26.780362 3235 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:08:26.813201 containerd[1931]: time="2025-09-12T22:08:26.813131672Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-585bb46969-jdpjr,Uid:9ef7ce93-3db8-4309-9718-589cf8396dca,Namespace:calico-system,Attempt:0,}" Sep 12 22:08:26.858709 containerd[1931]: time="2025-09-12T22:08:26.858608828Z" level=info msg="connecting to shim 8824108e34411c45f1dd102672363179beecc8f64b957c5efef8e0dd107bcb89" address="unix:///run/containerd/s/8f335e7ba54780aff10136481f357f27b32cf62e5b0b37eb044fa5754ef733a9" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:08:26.918915 systemd[1]: Started cri-containerd-8824108e34411c45f1dd102672363179beecc8f64b957c5efef8e0dd107bcb89.scope - libcontainer container 8824108e34411c45f1dd102672363179beecc8f64b957c5efef8e0dd107bcb89. 
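The run of kubelet messages above is a single failure repeating: the FlexVolume prober execs /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the init argument, the binary is not present on this node, so the call produces no output, and decoding that empty output as JSON fails. The following is an illustrative Go sketch, not kubelet source; it uses only the standard library to show how an absent driver leads to the "unexpected end of JSON input" error seen in every one of those lines (the exec error text differs slightly because the kubelet resolves the driver through its own exec wrapper).

// Illustrative sketch of the FlexVolume driver-call failure above; not kubelet code.
package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

func main() {
	driver := "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"

	// The prober invokes the driver with "init" and captures its output.
	// With the binary missing, the exec fails and out stays empty.
	out, err := exec.Command(driver, "init").CombinedOutput()
	if err != nil {
		fmt.Printf("driver call failed: %v, output: %q\n", err, out)
	}

	// Decoding the empty output is what produces the repeated kubelet error.
	var status map[string]interface{}
	if err := json.Unmarshal(out, &status); err != nil {
		fmt.Println(err) // unexpected end of JSON input
	}
}

A conforming driver would print a JSON status object for init (roughly {"status":"Success","capabilities":{"attach":false}}) and exit 0, at which point the prober would stop emitting these errors.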
Sep 12 22:08:27.070249 containerd[1931]: time="2025-09-12T22:08:27.069731513Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-585bb46969-jdpjr,Uid:9ef7ce93-3db8-4309-9718-589cf8396dca,Namespace:calico-system,Attempt:0,} returns sandbox id \"8824108e34411c45f1dd102672363179beecc8f64b957c5efef8e0dd107bcb89\"" Sep 12 22:08:27.152513 kubelet[3235]: E0912 22:08:27.152449 3235 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-npwnh" podUID="746510a2-33f4-43d1-a7e4-66416d6cce62" Sep 12 22:08:28.051746 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1171069677.mount: Deactivated successfully. Sep 12 22:08:28.196207 containerd[1931]: time="2025-09-12T22:08:28.195244435Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:28.197452 containerd[1931]: time="2025-09-12T22:08:28.197407987Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=5636193" Sep 12 22:08:28.199711 containerd[1931]: time="2025-09-12T22:08:28.199662331Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:28.204892 containerd[1931]: time="2025-09-12T22:08:28.204807235Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:28.206647 containerd[1931]: time="2025-09-12T22:08:28.206379199Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.5330935s" Sep 12 22:08:28.206647 containerd[1931]: time="2025-09-12T22:08:28.206440231Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 12 22:08:28.208541 containerd[1931]: time="2025-09-12T22:08:28.208491787Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 12 22:08:28.213205 containerd[1931]: time="2025-09-12T22:08:28.211886935Z" level=info msg="CreateContainer within sandbox \"9af60fc08358bb67bdd0ea67254e158c8faf2282c76167129b87984fb5baba41\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 22:08:28.237459 containerd[1931]: time="2025-09-12T22:08:28.237347515Z" level=info msg="Container 77535d7646ee50ec0785ec2377ddf6741df16835daeaf8e062f31229399302d3: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:08:28.247317 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1705549312.mount: Deactivated successfully. 
Sep 12 22:08:28.269086 containerd[1931]: time="2025-09-12T22:08:28.269006251Z" level=info msg="CreateContainer within sandbox \"9af60fc08358bb67bdd0ea67254e158c8faf2282c76167129b87984fb5baba41\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"77535d7646ee50ec0785ec2377ddf6741df16835daeaf8e062f31229399302d3\"" Sep 12 22:08:28.270407 containerd[1931]: time="2025-09-12T22:08:28.270339079Z" level=info msg="StartContainer for \"77535d7646ee50ec0785ec2377ddf6741df16835daeaf8e062f31229399302d3\"" Sep 12 22:08:28.274259 containerd[1931]: time="2025-09-12T22:08:28.274199323Z" level=info msg="connecting to shim 77535d7646ee50ec0785ec2377ddf6741df16835daeaf8e062f31229399302d3" address="unix:///run/containerd/s/9c2cdc7feefd01211528a8c678cf3242f1231636352a568be93eedd957648abe" protocol=ttrpc version=3 Sep 12 22:08:28.314469 systemd[1]: Started cri-containerd-77535d7646ee50ec0785ec2377ddf6741df16835daeaf8e062f31229399302d3.scope - libcontainer container 77535d7646ee50ec0785ec2377ddf6741df16835daeaf8e062f31229399302d3. Sep 12 22:08:28.408159 containerd[1931]: time="2025-09-12T22:08:28.408004700Z" level=info msg="StartContainer for \"77535d7646ee50ec0785ec2377ddf6741df16835daeaf8e062f31229399302d3\" returns successfully" Sep 12 22:08:28.433488 systemd[1]: cri-containerd-77535d7646ee50ec0785ec2377ddf6741df16835daeaf8e062f31229399302d3.scope: Deactivated successfully. Sep 12 22:08:28.442619 containerd[1931]: time="2025-09-12T22:08:28.442023020Z" level=info msg="received exit event container_id:\"77535d7646ee50ec0785ec2377ddf6741df16835daeaf8e062f31229399302d3\" id:\"77535d7646ee50ec0785ec2377ddf6741df16835daeaf8e062f31229399302d3\" pid:4156 exited_at:{seconds:1757714908 nanos:441489236}" Sep 12 22:08:28.443339 containerd[1931]: time="2025-09-12T22:08:28.442450424Z" level=info msg="TaskExit event in podsandbox handler container_id:\"77535d7646ee50ec0785ec2377ddf6741df16835daeaf8e062f31229399302d3\" id:\"77535d7646ee50ec0785ec2377ddf6741df16835daeaf8e062f31229399302d3\" pid:4156 exited_at:{seconds:1757714908 nanos:441489236}" Sep 12 22:08:28.511805 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-77535d7646ee50ec0785ec2377ddf6741df16835daeaf8e062f31229399302d3-rootfs.mount: Deactivated successfully. 
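For reference, the exited_at field in the TaskExit record above is a plain Unix seconds/nanoseconds pair; converting it (a small assumed helper, not containerd code) gives the same wall-clock instant as the surrounding journal timestamps.

// Assumed helper for reading the exited_at timestamp above; not containerd code.
package main

import (
	"fmt"
	"time"
)

func main() {
	// seconds/nanos copied from the TaskExit event for the flexvol-driver container.
	exitedAt := time.Unix(1757714908, 441489236).UTC()
	fmt.Println(exitedAt) // 2025-09-12 22:08:28.441489236 +0000 UTC
}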
Sep 12 22:08:29.152677 kubelet[3235]: E0912 22:08:29.152590 3235 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-npwnh" podUID="746510a2-33f4-43d1-a7e4-66416d6cce62" Sep 12 22:08:30.829576 containerd[1931]: time="2025-09-12T22:08:30.829441260Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:30.830946 containerd[1931]: time="2025-09-12T22:08:30.830812284Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=31736396" Sep 12 22:08:30.831871 containerd[1931]: time="2025-09-12T22:08:30.831815748Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:30.836311 containerd[1931]: time="2025-09-12T22:08:30.836229240Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:30.837595 containerd[1931]: time="2025-09-12T22:08:30.837413568Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 2.628864973s" Sep 12 22:08:30.837595 containerd[1931]: time="2025-09-12T22:08:30.837467796Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\"" Sep 12 22:08:30.839882 containerd[1931]: time="2025-09-12T22:08:30.839547720Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 22:08:30.866327 containerd[1931]: time="2025-09-12T22:08:30.866278548Z" level=info msg="CreateContainer within sandbox \"8824108e34411c45f1dd102672363179beecc8f64b957c5efef8e0dd107bcb89\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 12 22:08:30.889060 containerd[1931]: time="2025-09-12T22:08:30.888993216Z" level=info msg="Container a98e4fbdeb5aaa86e95d64d3fe56e11ea26ba05edb2bfd54b91c6b5e768e5204: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:08:30.908472 containerd[1931]: time="2025-09-12T22:08:30.908392872Z" level=info msg="CreateContainer within sandbox \"8824108e34411c45f1dd102672363179beecc8f64b957c5efef8e0dd107bcb89\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"a98e4fbdeb5aaa86e95d64d3fe56e11ea26ba05edb2bfd54b91c6b5e768e5204\"" Sep 12 22:08:30.909820 containerd[1931]: time="2025-09-12T22:08:30.909689292Z" level=info msg="StartContainer for \"a98e4fbdeb5aaa86e95d64d3fe56e11ea26ba05edb2bfd54b91c6b5e768e5204\"" Sep 12 22:08:30.912569 containerd[1931]: time="2025-09-12T22:08:30.912494220Z" level=info msg="connecting to shim a98e4fbdeb5aaa86e95d64d3fe56e11ea26ba05edb2bfd54b91c6b5e768e5204" address="unix:///run/containerd/s/8f335e7ba54780aff10136481f357f27b32cf62e5b0b37eb044fa5754ef733a9" protocol=ttrpc version=3 Sep 12 22:08:30.956434 systemd[1]: Started 
cri-containerd-a98e4fbdeb5aaa86e95d64d3fe56e11ea26ba05edb2bfd54b91c6b5e768e5204.scope - libcontainer container a98e4fbdeb5aaa86e95d64d3fe56e11ea26ba05edb2bfd54b91c6b5e768e5204. Sep 12 22:08:31.043742 containerd[1931]: time="2025-09-12T22:08:31.043676769Z" level=info msg="StartContainer for \"a98e4fbdeb5aaa86e95d64d3fe56e11ea26ba05edb2bfd54b91c6b5e768e5204\" returns successfully" Sep 12 22:08:31.153440 kubelet[3235]: E0912 22:08:31.152901 3235 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-npwnh" podUID="746510a2-33f4-43d1-a7e4-66416d6cce62" Sep 12 22:08:32.347752 kubelet[3235]: I0912 22:08:32.347401 3235 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:08:33.153404 kubelet[3235]: E0912 22:08:33.153084 3235 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-npwnh" podUID="746510a2-33f4-43d1-a7e4-66416d6cce62" Sep 12 22:08:34.435969 containerd[1931]: time="2025-09-12T22:08:34.435905510Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 12 22:08:34.437653 containerd[1931]: time="2025-09-12T22:08:34.437555642Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:34.441426 containerd[1931]: time="2025-09-12T22:08:34.441222626Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:34.443574 containerd[1931]: time="2025-09-12T22:08:34.442797878Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:34.444413 containerd[1931]: time="2025-09-12T22:08:34.444364826Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 3.604757418s" Sep 12 22:08:34.444544 containerd[1931]: time="2025-09-12T22:08:34.444514658Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 12 22:08:34.449430 containerd[1931]: time="2025-09-12T22:08:34.449207570Z" level=info msg="CreateContainer within sandbox \"9af60fc08358bb67bdd0ea67254e158c8faf2282c76167129b87984fb5baba41\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 22:08:34.467139 containerd[1931]: time="2025-09-12T22:08:34.464405582Z" level=info msg="Container 5095c672ff66ac5c3fbac5cf5a0e9c8fcb89df2a00297ea2bc8db08d7d978892: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:08:34.484499 containerd[1931]: time="2025-09-12T22:08:34.484425002Z" level=info msg="CreateContainer within sandbox 
\"9af60fc08358bb67bdd0ea67254e158c8faf2282c76167129b87984fb5baba41\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"5095c672ff66ac5c3fbac5cf5a0e9c8fcb89df2a00297ea2bc8db08d7d978892\"" Sep 12 22:08:34.485769 containerd[1931]: time="2025-09-12T22:08:34.485701682Z" level=info msg="StartContainer for \"5095c672ff66ac5c3fbac5cf5a0e9c8fcb89df2a00297ea2bc8db08d7d978892\"" Sep 12 22:08:34.490830 containerd[1931]: time="2025-09-12T22:08:34.490771250Z" level=info msg="connecting to shim 5095c672ff66ac5c3fbac5cf5a0e9c8fcb89df2a00297ea2bc8db08d7d978892" address="unix:///run/containerd/s/9c2cdc7feefd01211528a8c678cf3242f1231636352a568be93eedd957648abe" protocol=ttrpc version=3 Sep 12 22:08:34.531414 systemd[1]: Started cri-containerd-5095c672ff66ac5c3fbac5cf5a0e9c8fcb89df2a00297ea2bc8db08d7d978892.scope - libcontainer container 5095c672ff66ac5c3fbac5cf5a0e9c8fcb89df2a00297ea2bc8db08d7d978892. Sep 12 22:08:34.621365 containerd[1931]: time="2025-09-12T22:08:34.621210891Z" level=info msg="StartContainer for \"5095c672ff66ac5c3fbac5cf5a0e9c8fcb89df2a00297ea2bc8db08d7d978892\" returns successfully" Sep 12 22:08:35.153054 kubelet[3235]: E0912 22:08:35.152977 3235 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-npwnh" podUID="746510a2-33f4-43d1-a7e4-66416d6cce62" Sep 12 22:08:35.407081 kubelet[3235]: I0912 22:08:35.406876 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-585bb46969-jdpjr" podStartSLOduration=7.6413098680000004 podStartE2EDuration="11.406853031s" podCreationTimestamp="2025-09-12 22:08:24 +0000 UTC" firstStartedPulling="2025-09-12 22:08:27.073467305 +0000 UTC m=+29.219130758" lastFinishedPulling="2025-09-12 22:08:30.839010384 +0000 UTC m=+32.984673921" observedRunningTime="2025-09-12 22:08:31.440547743 +0000 UTC m=+33.586211220" watchObservedRunningTime="2025-09-12 22:08:35.406853031 +0000 UTC m=+37.552516472" Sep 12 22:08:35.529284 containerd[1931]: time="2025-09-12T22:08:35.528934239Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 22:08:35.534566 systemd[1]: cri-containerd-5095c672ff66ac5c3fbac5cf5a0e9c8fcb89df2a00297ea2bc8db08d7d978892.scope: Deactivated successfully. Sep 12 22:08:35.535269 systemd[1]: cri-containerd-5095c672ff66ac5c3fbac5cf5a0e9c8fcb89df2a00297ea2bc8db08d7d978892.scope: Consumed 888ms CPU time, 185.6M memory peak, 165.8M written to disk. 
Sep 12 22:08:35.541334 containerd[1931]: time="2025-09-12T22:08:35.541273347Z" level=info msg="received exit event container_id:\"5095c672ff66ac5c3fbac5cf5a0e9c8fcb89df2a00297ea2bc8db08d7d978892\" id:\"5095c672ff66ac5c3fbac5cf5a0e9c8fcb89df2a00297ea2bc8db08d7d978892\" pid:4260 exited_at:{seconds:1757714915 nanos:539636895}" Sep 12 22:08:35.542657 containerd[1931]: time="2025-09-12T22:08:35.542565603Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5095c672ff66ac5c3fbac5cf5a0e9c8fcb89df2a00297ea2bc8db08d7d978892\" id:\"5095c672ff66ac5c3fbac5cf5a0e9c8fcb89df2a00297ea2bc8db08d7d978892\" pid:4260 exited_at:{seconds:1757714915 nanos:539636895}" Sep 12 22:08:35.552434 kubelet[3235]: I0912 22:08:35.552370 3235 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 12 22:08:35.609621 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5095c672ff66ac5c3fbac5cf5a0e9c8fcb89df2a00297ea2bc8db08d7d978892-rootfs.mount: Deactivated successfully. Sep 12 22:08:35.665984 systemd[1]: Created slice kubepods-burstable-pod73e2ed8d_89f7_42c2_9717_a3369e2dbf0b.slice - libcontainer container kubepods-burstable-pod73e2ed8d_89f7_42c2_9717_a3369e2dbf0b.slice. Sep 12 22:08:35.680890 kubelet[3235]: W0912 22:08:35.679489 3235 reflector.go:561] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:ip-172-31-25-121" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ip-172-31-25-121' and this object Sep 12 22:08:35.687355 kubelet[3235]: E0912 22:08:35.687046 3235 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:ip-172-31-25-121\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ip-172-31-25-121' and this object" logger="UnhandledError" Sep 12 22:08:35.687355 kubelet[3235]: W0912 22:08:35.682089 3235 reflector.go:561] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ip-172-31-25-121" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ip-172-31-25-121' and this object Sep 12 22:08:35.687355 kubelet[3235]: E0912 22:08:35.687163 3235 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ip-172-31-25-121\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ip-172-31-25-121' and this object" logger="UnhandledError" Sep 12 22:08:35.702554 systemd[1]: Created slice kubepods-burstable-pod79d77081_eb0e_4e25_a57d_0f78364fbfa6.slice - libcontainer container kubepods-burstable-pod79d77081_eb0e_4e25_a57d_0f78364fbfa6.slice. 
Sep 12 22:08:35.726298 kubelet[3235]: I0912 22:08:35.725839 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79d77081-eb0e-4e25-a57d-0f78364fbfa6-config-volume\") pod \"coredns-7c65d6cfc9-lbgt4\" (UID: \"79d77081-eb0e-4e25-a57d-0f78364fbfa6\") " pod="kube-system/coredns-7c65d6cfc9-lbgt4" Sep 12 22:08:35.726298 kubelet[3235]: I0912 22:08:35.726053 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25vfx\" (UniqueName: \"kubernetes.io/projected/79d77081-eb0e-4e25-a57d-0f78364fbfa6-kube-api-access-25vfx\") pod \"coredns-7c65d6cfc9-lbgt4\" (UID: \"79d77081-eb0e-4e25-a57d-0f78364fbfa6\") " pod="kube-system/coredns-7c65d6cfc9-lbgt4" Sep 12 22:08:35.726298 kubelet[3235]: I0912 22:08:35.726101 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73e2ed8d-89f7-42c2-9717-a3369e2dbf0b-config-volume\") pod \"coredns-7c65d6cfc9-vl2mt\" (UID: \"73e2ed8d-89f7-42c2-9717-a3369e2dbf0b\") " pod="kube-system/coredns-7c65d6cfc9-vl2mt" Sep 12 22:08:35.726298 kubelet[3235]: I0912 22:08:35.726244 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w86vj\" (UniqueName: \"kubernetes.io/projected/73e2ed8d-89f7-42c2-9717-a3369e2dbf0b-kube-api-access-w86vj\") pod \"coredns-7c65d6cfc9-vl2mt\" (UID: \"73e2ed8d-89f7-42c2-9717-a3369e2dbf0b\") " pod="kube-system/coredns-7c65d6cfc9-vl2mt" Sep 12 22:08:35.739909 systemd[1]: Created slice kubepods-besteffort-pod6945a7e6_ff59_498b_a0da_e2121447b792.slice - libcontainer container kubepods-besteffort-pod6945a7e6_ff59_498b_a0da_e2121447b792.slice. Sep 12 22:08:35.764476 systemd[1]: Created slice kubepods-besteffort-pod39f9ed23_d9e9_4acb_bfad_64b584e9c682.slice - libcontainer container kubepods-besteffort-pod39f9ed23_d9e9_4acb_bfad_64b584e9c682.slice. Sep 12 22:08:35.790454 systemd[1]: Created slice kubepods-besteffort-pod2c50c08b_629a_439f_b53f_ab344aac6ceb.slice - libcontainer container kubepods-besteffort-pod2c50c08b_629a_439f_b53f_ab344aac6ceb.slice. Sep 12 22:08:35.810473 systemd[1]: Created slice kubepods-besteffort-podee3e10d8_3f92_4d84_802c_035178599eb9.slice - libcontainer container kubepods-besteffort-podee3e10d8_3f92_4d84_802c_035178599eb9.slice. 
Sep 12 22:08:35.826705 kubelet[3235]: I0912 22:08:35.826566 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7gzv\" (UniqueName: \"kubernetes.io/projected/ee3e10d8-3f92-4d84-802c-035178599eb9-kube-api-access-n7gzv\") pod \"goldmane-7988f88666-htbqv\" (UID: \"ee3e10d8-3f92-4d84-802c-035178599eb9\") " pod="calico-system/goldmane-7988f88666-htbqv" Sep 12 22:08:35.827523 kubelet[3235]: I0912 22:08:35.827189 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39f9ed23-d9e9-4acb-bfad-64b584e9c682-tigera-ca-bundle\") pod \"calico-kube-controllers-68c6947c97-2lbtl\" (UID: \"39f9ed23-d9e9-4acb-bfad-64b584e9c682\") " pod="calico-system/calico-kube-controllers-68c6947c97-2lbtl" Sep 12 22:08:35.828339 kubelet[3235]: I0912 22:08:35.828301 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee3e10d8-3f92-4d84-802c-035178599eb9-config\") pod \"goldmane-7988f88666-htbqv\" (UID: \"ee3e10d8-3f92-4d84-802c-035178599eb9\") " pod="calico-system/goldmane-7988f88666-htbqv" Sep 12 22:08:35.828645 systemd[1]: Created slice kubepods-besteffort-pod30c5d4d2_8264_4587_95b9_fdd641501e94.slice - libcontainer container kubepods-besteffort-pod30c5d4d2_8264_4587_95b9_fdd641501e94.slice. Sep 12 22:08:35.831775 kubelet[3235]: I0912 22:08:35.831698 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c50c08b-629a-439f-b53f-ab344aac6ceb-whisker-ca-bundle\") pod \"whisker-76858f948b-g6mc2\" (UID: \"2c50c08b-629a-439f-b53f-ab344aac6ceb\") " pod="calico-system/whisker-76858f948b-g6mc2" Sep 12 22:08:35.831923 kubelet[3235]: I0912 22:08:35.831841 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bq6f\" (UniqueName: \"kubernetes.io/projected/2c50c08b-629a-439f-b53f-ab344aac6ceb-kube-api-access-2bq6f\") pod \"whisker-76858f948b-g6mc2\" (UID: \"2c50c08b-629a-439f-b53f-ab344aac6ceb\") " pod="calico-system/whisker-76858f948b-g6mc2" Sep 12 22:08:35.831986 kubelet[3235]: I0912 22:08:35.831947 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee3e10d8-3f92-4d84-802c-035178599eb9-goldmane-ca-bundle\") pod \"goldmane-7988f88666-htbqv\" (UID: \"ee3e10d8-3f92-4d84-802c-035178599eb9\") " pod="calico-system/goldmane-7988f88666-htbqv" Sep 12 22:08:35.832044 kubelet[3235]: I0912 22:08:35.832023 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2c50c08b-629a-439f-b53f-ab344aac6ceb-whisker-backend-key-pair\") pod \"whisker-76858f948b-g6mc2\" (UID: \"2c50c08b-629a-439f-b53f-ab344aac6ceb\") " pod="calico-system/whisker-76858f948b-g6mc2" Sep 12 22:08:35.835181 kubelet[3235]: I0912 22:08:35.832063 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6945a7e6-ff59-498b-a0da-e2121447b792-calico-apiserver-certs\") pod \"calico-apiserver-f6c548746-24zd8\" (UID: \"6945a7e6-ff59-498b-a0da-e2121447b792\") " pod="calico-apiserver/calico-apiserver-f6c548746-24zd8" Sep 12 22:08:35.835181 
kubelet[3235]: I0912 22:08:35.832159 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/30c5d4d2-8264-4587-95b9-fdd641501e94-calico-apiserver-certs\") pod \"calico-apiserver-f6c548746-mwv4c\" (UID: \"30c5d4d2-8264-4587-95b9-fdd641501e94\") " pod="calico-apiserver/calico-apiserver-f6c548746-mwv4c" Sep 12 22:08:35.835181 kubelet[3235]: I0912 22:08:35.832253 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/ee3e10d8-3f92-4d84-802c-035178599eb9-goldmane-key-pair\") pod \"goldmane-7988f88666-htbqv\" (UID: \"ee3e10d8-3f92-4d84-802c-035178599eb9\") " pod="calico-system/goldmane-7988f88666-htbqv" Sep 12 22:08:35.835181 kubelet[3235]: I0912 22:08:35.832296 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxh4f\" (UniqueName: \"kubernetes.io/projected/6945a7e6-ff59-498b-a0da-e2121447b792-kube-api-access-dxh4f\") pod \"calico-apiserver-f6c548746-24zd8\" (UID: \"6945a7e6-ff59-498b-a0da-e2121447b792\") " pod="calico-apiserver/calico-apiserver-f6c548746-24zd8" Sep 12 22:08:35.835181 kubelet[3235]: I0912 22:08:35.832375 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjnjf\" (UniqueName: \"kubernetes.io/projected/39f9ed23-d9e9-4acb-bfad-64b584e9c682-kube-api-access-mjnjf\") pod \"calico-kube-controllers-68c6947c97-2lbtl\" (UID: \"39f9ed23-d9e9-4acb-bfad-64b584e9c682\") " pod="calico-system/calico-kube-controllers-68c6947c97-2lbtl" Sep 12 22:08:35.835534 kubelet[3235]: I0912 22:08:35.832460 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w92hk\" (UniqueName: \"kubernetes.io/projected/30c5d4d2-8264-4587-95b9-fdd641501e94-kube-api-access-w92hk\") pod \"calico-apiserver-f6c548746-mwv4c\" (UID: \"30c5d4d2-8264-4587-95b9-fdd641501e94\") " pod="calico-apiserver/calico-apiserver-f6c548746-mwv4c" Sep 12 22:08:35.996891 containerd[1931]: time="2025-09-12T22:08:35.996736686Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-vl2mt,Uid:73e2ed8d-89f7-42c2-9717-a3369e2dbf0b,Namespace:kube-system,Attempt:0,}" Sep 12 22:08:36.017186 containerd[1931]: time="2025-09-12T22:08:36.016874702Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-lbgt4,Uid:79d77081-eb0e-4e25-a57d-0f78364fbfa6,Namespace:kube-system,Attempt:0,}" Sep 12 22:08:36.076489 containerd[1931]: time="2025-09-12T22:08:36.076437986Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68c6947c97-2lbtl,Uid:39f9ed23-d9e9-4acb-bfad-64b584e9c682,Namespace:calico-system,Attempt:0,}" Sep 12 22:08:36.102160 containerd[1931]: time="2025-09-12T22:08:36.102084230Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-76858f948b-g6mc2,Uid:2c50c08b-629a-439f-b53f-ab344aac6ceb,Namespace:calico-system,Attempt:0,}" Sep 12 22:08:36.120854 containerd[1931]: time="2025-09-12T22:08:36.120730946Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-htbqv,Uid:ee3e10d8-3f92-4d84-802c-035178599eb9,Namespace:calico-system,Attempt:0,}" Sep 12 22:08:36.414913 containerd[1931]: time="2025-09-12T22:08:36.414631108Z" level=error msg="Failed to destroy network for sandbox \"760736329b613f36cd46bb8b1785bae2e10e266561a700ffbcfa8b290cdbb477\"" 
error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:36.734607 containerd[1931]: time="2025-09-12T22:08:36.734509229Z" level=error msg="Failed to destroy network for sandbox \"cf6537e2f7cf026cf7e76b134c8378a1ebb602a984a6b8744d36641b39725e66\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:36.738701 systemd[1]: run-netns-cni\x2dacba9ec2\x2d7f13\x2d8592\x2d6f02\x2ddfce4522a97c.mount: Deactivated successfully. Sep 12 22:08:36.787008 containerd[1931]: time="2025-09-12T22:08:36.786697374Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-vl2mt,Uid:73e2ed8d-89f7-42c2-9717-a3369e2dbf0b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"760736329b613f36cd46bb8b1785bae2e10e266561a700ffbcfa8b290cdbb477\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:36.790088 kubelet[3235]: E0912 22:08:36.790010 3235 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"760736329b613f36cd46bb8b1785bae2e10e266561a700ffbcfa8b290cdbb477\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:36.791506 kubelet[3235]: E0912 22:08:36.791437 3235 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"760736329b613f36cd46bb8b1785bae2e10e266561a700ffbcfa8b290cdbb477\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-vl2mt" Sep 12 22:08:36.791598 kubelet[3235]: E0912 22:08:36.791509 3235 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"760736329b613f36cd46bb8b1785bae2e10e266561a700ffbcfa8b290cdbb477\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-vl2mt" Sep 12 22:08:36.791658 kubelet[3235]: E0912 22:08:36.791584 3235 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-vl2mt_kube-system(73e2ed8d-89f7-42c2-9717-a3369e2dbf0b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-vl2mt_kube-system(73e2ed8d-89f7-42c2-9717-a3369e2dbf0b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"760736329b613f36cd46bb8b1785bae2e10e266561a700ffbcfa8b290cdbb477\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-vl2mt" podUID="73e2ed8d-89f7-42c2-9717-a3369e2dbf0b" Sep 12 22:08:36.814140 containerd[1931]: 
time="2025-09-12T22:08:36.811291158Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-lbgt4,Uid:79d77081-eb0e-4e25-a57d-0f78364fbfa6,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf6537e2f7cf026cf7e76b134c8378a1ebb602a984a6b8744d36641b39725e66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:36.818136 kubelet[3235]: E0912 22:08:36.816394 3235 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf6537e2f7cf026cf7e76b134c8378a1ebb602a984a6b8744d36641b39725e66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:36.818136 kubelet[3235]: E0912 22:08:36.816472 3235 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf6537e2f7cf026cf7e76b134c8378a1ebb602a984a6b8744d36641b39725e66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-lbgt4" Sep 12 22:08:36.818136 kubelet[3235]: E0912 22:08:36.816504 3235 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf6537e2f7cf026cf7e76b134c8378a1ebb602a984a6b8744d36641b39725e66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-lbgt4" Sep 12 22:08:36.818396 kubelet[3235]: E0912 22:08:36.816562 3235 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-lbgt4_kube-system(79d77081-eb0e-4e25-a57d-0f78364fbfa6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-lbgt4_kube-system(79d77081-eb0e-4e25-a57d-0f78364fbfa6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cf6537e2f7cf026cf7e76b134c8378a1ebb602a984a6b8744d36641b39725e66\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-lbgt4" podUID="79d77081-eb0e-4e25-a57d-0f78364fbfa6" Sep 12 22:08:36.899160 containerd[1931]: time="2025-09-12T22:08:36.898999878Z" level=error msg="Failed to destroy network for sandbox \"251132b652d086fe6eca53c45dc7f7502eaf91ef5b04f499298f53e1fc45e268\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:36.906378 containerd[1931]: time="2025-09-12T22:08:36.906282306Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68c6947c97-2lbtl,Uid:39f9ed23-d9e9-4acb-bfad-64b584e9c682,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"251132b652d086fe6eca53c45dc7f7502eaf91ef5b04f499298f53e1fc45e268\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:36.906867 kubelet[3235]: E0912 22:08:36.906818 3235 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"251132b652d086fe6eca53c45dc7f7502eaf91ef5b04f499298f53e1fc45e268\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:36.909603 kubelet[3235]: E0912 22:08:36.907095 3235 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"251132b652d086fe6eca53c45dc7f7502eaf91ef5b04f499298f53e1fc45e268\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-68c6947c97-2lbtl" Sep 12 22:08:36.909603 kubelet[3235]: E0912 22:08:36.907245 3235 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"251132b652d086fe6eca53c45dc7f7502eaf91ef5b04f499298f53e1fc45e268\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-68c6947c97-2lbtl" Sep 12 22:08:36.909603 kubelet[3235]: E0912 22:08:36.907327 3235 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-68c6947c97-2lbtl_calico-system(39f9ed23-d9e9-4acb-bfad-64b584e9c682)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-68c6947c97-2lbtl_calico-system(39f9ed23-d9e9-4acb-bfad-64b584e9c682)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"251132b652d086fe6eca53c45dc7f7502eaf91ef5b04f499298f53e1fc45e268\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68c6947c97-2lbtl" podUID="39f9ed23-d9e9-4acb-bfad-64b584e9c682" Sep 12 22:08:36.958475 containerd[1931]: time="2025-09-12T22:08:36.958406886Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f6c548746-24zd8,Uid:6945a7e6-ff59-498b-a0da-e2121447b792,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:08:36.962598 containerd[1931]: time="2025-09-12T22:08:36.962509374Z" level=error msg="Failed to destroy network for sandbox \"3beebf276bd4bc81af1cdd66c985301c2dba9ea114136a109096b420300f2af1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:36.965399 containerd[1931]: time="2025-09-12T22:08:36.965335794Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-htbqv,Uid:ee3e10d8-3f92-4d84-802c-035178599eb9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"3beebf276bd4bc81af1cdd66c985301c2dba9ea114136a109096b420300f2af1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:36.966005 kubelet[3235]: E0912 22:08:36.965896 3235 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3beebf276bd4bc81af1cdd66c985301c2dba9ea114136a109096b420300f2af1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:36.966005 kubelet[3235]: E0912 22:08:36.965981 3235 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3beebf276bd4bc81af1cdd66c985301c2dba9ea114136a109096b420300f2af1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-htbqv" Sep 12 22:08:36.966363 kubelet[3235]: E0912 22:08:36.966020 3235 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3beebf276bd4bc81af1cdd66c985301c2dba9ea114136a109096b420300f2af1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-htbqv" Sep 12 22:08:36.966873 kubelet[3235]: E0912 22:08:36.966705 3235 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-htbqv_calico-system(ee3e10d8-3f92-4d84-802c-035178599eb9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-htbqv_calico-system(ee3e10d8-3f92-4d84-802c-035178599eb9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3beebf276bd4bc81af1cdd66c985301c2dba9ea114136a109096b420300f2af1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-htbqv" podUID="ee3e10d8-3f92-4d84-802c-035178599eb9" Sep 12 22:08:36.974458 containerd[1931]: time="2025-09-12T22:08:36.973641103Z" level=error msg="Failed to destroy network for sandbox \"d8ddfc5656712adcb2af148269e9b56da9cdd3d8b7a20db485bf02486505d9ee\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:36.978404 containerd[1931]: time="2025-09-12T22:08:36.978316639Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-76858f948b-g6mc2,Uid:2c50c08b-629a-439f-b53f-ab344aac6ceb,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8ddfc5656712adcb2af148269e9b56da9cdd3d8b7a20db485bf02486505d9ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:36.979044 kubelet[3235]: E0912 22:08:36.978979 3235 log.go:32] "RunPodSandbox from 
runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8ddfc5656712adcb2af148269e9b56da9cdd3d8b7a20db485bf02486505d9ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:36.979926 kubelet[3235]: E0912 22:08:36.979432 3235 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8ddfc5656712adcb2af148269e9b56da9cdd3d8b7a20db485bf02486505d9ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-76858f948b-g6mc2" Sep 12 22:08:36.979926 kubelet[3235]: E0912 22:08:36.979478 3235 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8ddfc5656712adcb2af148269e9b56da9cdd3d8b7a20db485bf02486505d9ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-76858f948b-g6mc2" Sep 12 22:08:36.979926 kubelet[3235]: E0912 22:08:36.979561 3235 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-76858f948b-g6mc2_calico-system(2c50c08b-629a-439f-b53f-ab344aac6ceb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-76858f948b-g6mc2_calico-system(2c50c08b-629a-439f-b53f-ab344aac6ceb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d8ddfc5656712adcb2af148269e9b56da9cdd3d8b7a20db485bf02486505d9ee\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-76858f948b-g6mc2" podUID="2c50c08b-629a-439f-b53f-ab344aac6ceb" Sep 12 22:08:37.041772 containerd[1931]: time="2025-09-12T22:08:37.041358171Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f6c548746-mwv4c,Uid:30c5d4d2-8264-4587-95b9-fdd641501e94,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:08:37.066374 containerd[1931]: time="2025-09-12T22:08:37.066312795Z" level=error msg="Failed to destroy network for sandbox \"fce1d4e5129ccedc315a10b9a9bb1dc7697412075b0f9d9bacd033f0dc8226ca\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:37.069818 containerd[1931]: time="2025-09-12T22:08:37.069135327Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f6c548746-24zd8,Uid:6945a7e6-ff59-498b-a0da-e2121447b792,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fce1d4e5129ccedc315a10b9a9bb1dc7697412075b0f9d9bacd033f0dc8226ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:37.070935 kubelet[3235]: E0912 22:08:37.070860 3235 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"fce1d4e5129ccedc315a10b9a9bb1dc7697412075b0f9d9bacd033f0dc8226ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:37.071272 kubelet[3235]: E0912 22:08:37.070986 3235 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fce1d4e5129ccedc315a10b9a9bb1dc7697412075b0f9d9bacd033f0dc8226ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f6c548746-24zd8" Sep 12 22:08:37.071272 kubelet[3235]: E0912 22:08:37.071043 3235 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fce1d4e5129ccedc315a10b9a9bb1dc7697412075b0f9d9bacd033f0dc8226ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f6c548746-24zd8" Sep 12 22:08:37.071272 kubelet[3235]: E0912 22:08:37.071171 3235 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f6c548746-24zd8_calico-apiserver(6945a7e6-ff59-498b-a0da-e2121447b792)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-f6c548746-24zd8_calico-apiserver(6945a7e6-ff59-498b-a0da-e2121447b792)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fce1d4e5129ccedc315a10b9a9bb1dc7697412075b0f9d9bacd033f0dc8226ca\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-f6c548746-24zd8" podUID="6945a7e6-ff59-498b-a0da-e2121447b792" Sep 12 22:08:37.135997 containerd[1931]: time="2025-09-12T22:08:37.135819435Z" level=error msg="Failed to destroy network for sandbox \"53a14089150f04a3976d8a51be49425f09cfea97f2c3afe1156f6e2700013690\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:37.138619 containerd[1931]: time="2025-09-12T22:08:37.138495663Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f6c548746-mwv4c,Uid:30c5d4d2-8264-4587-95b9-fdd641501e94,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"53a14089150f04a3976d8a51be49425f09cfea97f2c3afe1156f6e2700013690\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:37.139147 kubelet[3235]: E0912 22:08:37.139040 3235 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"53a14089150f04a3976d8a51be49425f09cfea97f2c3afe1156f6e2700013690\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:37.139255 
kubelet[3235]: E0912 22:08:37.139190 3235 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"53a14089150f04a3976d8a51be49425f09cfea97f2c3afe1156f6e2700013690\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f6c548746-mwv4c" Sep 12 22:08:37.139317 kubelet[3235]: E0912 22:08:37.139247 3235 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"53a14089150f04a3976d8a51be49425f09cfea97f2c3afe1156f6e2700013690\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f6c548746-mwv4c" Sep 12 22:08:37.139405 kubelet[3235]: E0912 22:08:37.139349 3235 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f6c548746-mwv4c_calico-apiserver(30c5d4d2-8264-4587-95b9-fdd641501e94)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-f6c548746-mwv4c_calico-apiserver(30c5d4d2-8264-4587-95b9-fdd641501e94)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"53a14089150f04a3976d8a51be49425f09cfea97f2c3afe1156f6e2700013690\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-f6c548746-mwv4c" podUID="30c5d4d2-8264-4587-95b9-fdd641501e94" Sep 12 22:08:37.163913 systemd[1]: Created slice kubepods-besteffort-pod746510a2_33f4_43d1_a7e4_66416d6cce62.slice - libcontainer container kubepods-besteffort-pod746510a2_33f4_43d1_a7e4_66416d6cce62.slice. 
Sep 12 22:08:37.168931 containerd[1931]: time="2025-09-12T22:08:37.168579748Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-npwnh,Uid:746510a2-33f4-43d1-a7e4-66416d6cce62,Namespace:calico-system,Attempt:0,}" Sep 12 22:08:37.254588 containerd[1931]: time="2025-09-12T22:08:37.254485264Z" level=error msg="Failed to destroy network for sandbox \"7c5ce07debe5cafb13e4b74ea3e829256ad8d001af1c721e610b79f57af03933\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:37.257422 containerd[1931]: time="2025-09-12T22:08:37.257335456Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-npwnh,Uid:746510a2-33f4-43d1-a7e4-66416d6cce62,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7c5ce07debe5cafb13e4b74ea3e829256ad8d001af1c721e610b79f57af03933\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:37.257641 kubelet[3235]: E0912 22:08:37.257593 3235 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7c5ce07debe5cafb13e4b74ea3e829256ad8d001af1c721e610b79f57af03933\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:08:37.257712 kubelet[3235]: E0912 22:08:37.257666 3235 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7c5ce07debe5cafb13e4b74ea3e829256ad8d001af1c721e610b79f57af03933\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-npwnh" Sep 12 22:08:37.257712 kubelet[3235]: E0912 22:08:37.257697 3235 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7c5ce07debe5cafb13e4b74ea3e829256ad8d001af1c721e610b79f57af03933\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-npwnh" Sep 12 22:08:37.257817 kubelet[3235]: E0912 22:08:37.257757 3235 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-npwnh_calico-system(746510a2-33f4-43d1-a7e4-66416d6cce62)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-npwnh_calico-system(746510a2-33f4-43d1-a7e4-66416d6cce62)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7c5ce07debe5cafb13e4b74ea3e829256ad8d001af1c721e610b79f57af03933\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-npwnh" podUID="746510a2-33f4-43d1-a7e4-66416d6cce62" Sep 12 22:08:37.382626 containerd[1931]: time="2025-09-12T22:08:37.381290921Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 12 22:08:37.604245 systemd[1]: run-netns-cni\x2d2efea0ed\x2d7f7d\x2dd60c\x2d4a3e\x2da00da98a4807.mount: Deactivated successfully. Sep 12 22:08:37.604412 systemd[1]: run-netns-cni\x2d20e9ed4f\x2def28\x2da562\x2d64ce\x2d1c875168a37c.mount: Deactivated successfully. Sep 12 22:08:37.604531 systemd[1]: run-netns-cni\x2dafb4aac0\x2d8e50\x2de837\x2ddba9\x2d1a436f751660.mount: Deactivated successfully. Sep 12 22:08:44.966694 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2403558046.mount: Deactivated successfully. Sep 12 22:08:45.020171 containerd[1931]: time="2025-09-12T22:08:45.019961843Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:45.022235 containerd[1931]: time="2025-09-12T22:08:45.021511523Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 12 22:08:45.023656 containerd[1931]: time="2025-09-12T22:08:45.023592491Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:45.028595 containerd[1931]: time="2025-09-12T22:08:45.028532675Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:45.030847 containerd[1931]: time="2025-09-12T22:08:45.030785663Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 7.649426438s" Sep 12 22:08:45.030982 containerd[1931]: time="2025-09-12T22:08:45.030852179Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 12 22:08:45.056761 containerd[1931]: time="2025-09-12T22:08:45.056704451Z" level=info msg="CreateContainer within sandbox \"9af60fc08358bb67bdd0ea67254e158c8faf2282c76167129b87984fb5baba41\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 22:08:45.073649 containerd[1931]: time="2025-09-12T22:08:45.073564991Z" level=info msg="Container 047c3012e98a295a31b96022eeaf3fb2de10c5df5e52d84d1b1534d926fb4b07: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:08:45.080685 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1310344978.mount: Deactivated successfully. 
Sep 12 22:08:45.095708 containerd[1931]: time="2025-09-12T22:08:45.095653127Z" level=info msg="CreateContainer within sandbox \"9af60fc08358bb67bdd0ea67254e158c8faf2282c76167129b87984fb5baba41\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"047c3012e98a295a31b96022eeaf3fb2de10c5df5e52d84d1b1534d926fb4b07\"" Sep 12 22:08:45.097060 containerd[1931]: time="2025-09-12T22:08:45.096998567Z" level=info msg="StartContainer for \"047c3012e98a295a31b96022eeaf3fb2de10c5df5e52d84d1b1534d926fb4b07\"" Sep 12 22:08:45.100702 containerd[1931]: time="2025-09-12T22:08:45.100554899Z" level=info msg="connecting to shim 047c3012e98a295a31b96022eeaf3fb2de10c5df5e52d84d1b1534d926fb4b07" address="unix:///run/containerd/s/9c2cdc7feefd01211528a8c678cf3242f1231636352a568be93eedd957648abe" protocol=ttrpc version=3 Sep 12 22:08:45.169394 systemd[1]: Started cri-containerd-047c3012e98a295a31b96022eeaf3fb2de10c5df5e52d84d1b1534d926fb4b07.scope - libcontainer container 047c3012e98a295a31b96022eeaf3fb2de10c5df5e52d84d1b1534d926fb4b07. Sep 12 22:08:45.275559 containerd[1931]: time="2025-09-12T22:08:45.275349336Z" level=info msg="StartContainer for \"047c3012e98a295a31b96022eeaf3fb2de10c5df5e52d84d1b1534d926fb4b07\" returns successfully" Sep 12 22:08:45.638201 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 12 22:08:45.638331 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 12 22:08:45.652004 containerd[1931]: time="2025-09-12T22:08:45.651482882Z" level=info msg="TaskExit event in podsandbox handler container_id:\"047c3012e98a295a31b96022eeaf3fb2de10c5df5e52d84d1b1534d926fb4b07\" id:\"37c67023cf1e38778178601a3c9f489381eb8d58a0bf95439ae96cbeacbfd25d\" pid:4559 exit_status:1 exited_at:{seconds:1757714925 nanos:650722082}" Sep 12 22:08:45.695134 kubelet[3235]: I0912 22:08:45.694536 3235 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:08:45.727769 kubelet[3235]: I0912 22:08:45.727674 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-qpcp8" podStartSLOduration=2.367200582 podStartE2EDuration="20.72764429s" podCreationTimestamp="2025-09-12 22:08:25 +0000 UTC" firstStartedPulling="2025-09-12 22:08:26.671504035 +0000 UTC m=+28.817167488" lastFinishedPulling="2025-09-12 22:08:45.031947743 +0000 UTC m=+47.177611196" observedRunningTime="2025-09-12 22:08:45.466955953 +0000 UTC m=+47.612619430" watchObservedRunningTime="2025-09-12 22:08:45.72764429 +0000 UTC m=+47.873307743" Sep 12 22:08:46.012098 kubelet[3235]: I0912 22:08:46.012027 3235 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c50c08b-629a-439f-b53f-ab344aac6ceb-whisker-ca-bundle\") pod \"2c50c08b-629a-439f-b53f-ab344aac6ceb\" (UID: \"2c50c08b-629a-439f-b53f-ab344aac6ceb\") " Sep 12 22:08:46.012289 kubelet[3235]: I0912 22:08:46.012104 3235 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bq6f\" (UniqueName: \"kubernetes.io/projected/2c50c08b-629a-439f-b53f-ab344aac6ceb-kube-api-access-2bq6f\") pod \"2c50c08b-629a-439f-b53f-ab344aac6ceb\" (UID: \"2c50c08b-629a-439f-b53f-ab344aac6ceb\") " Sep 12 22:08:46.012289 kubelet[3235]: I0912 22:08:46.012177 3235 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: 
\"kubernetes.io/secret/2c50c08b-629a-439f-b53f-ab344aac6ceb-whisker-backend-key-pair\") pod \"2c50c08b-629a-439f-b53f-ab344aac6ceb\" (UID: \"2c50c08b-629a-439f-b53f-ab344aac6ceb\") " Sep 12 22:08:46.014985 kubelet[3235]: I0912 22:08:46.014865 3235 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c50c08b-629a-439f-b53f-ab344aac6ceb-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "2c50c08b-629a-439f-b53f-ab344aac6ceb" (UID: "2c50c08b-629a-439f-b53f-ab344aac6ceb"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 12 22:08:46.028871 kubelet[3235]: I0912 22:08:46.028415 3235 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c50c08b-629a-439f-b53f-ab344aac6ceb-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "2c50c08b-629a-439f-b53f-ab344aac6ceb" (UID: "2c50c08b-629a-439f-b53f-ab344aac6ceb"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 12 22:08:46.029745 systemd[1]: var-lib-kubelet-pods-2c50c08b\x2d629a\x2d439f\x2db53f\x2dab344aac6ceb-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 12 22:08:46.033936 kubelet[3235]: I0912 22:08:46.031400 3235 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c50c08b-629a-439f-b53f-ab344aac6ceb-kube-api-access-2bq6f" (OuterVolumeSpecName: "kube-api-access-2bq6f") pod "2c50c08b-629a-439f-b53f-ab344aac6ceb" (UID: "2c50c08b-629a-439f-b53f-ab344aac6ceb"). InnerVolumeSpecName "kube-api-access-2bq6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 12 22:08:46.037930 systemd[1]: var-lib-kubelet-pods-2c50c08b\x2d629a\x2d439f\x2db53f\x2dab344aac6ceb-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d2bq6f.mount: Deactivated successfully. Sep 12 22:08:46.113317 kubelet[3235]: I0912 22:08:46.113208 3235 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c50c08b-629a-439f-b53f-ab344aac6ceb-whisker-ca-bundle\") on node \"ip-172-31-25-121\" DevicePath \"\"" Sep 12 22:08:46.113317 kubelet[3235]: I0912 22:08:46.113256 3235 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bq6f\" (UniqueName: \"kubernetes.io/projected/2c50c08b-629a-439f-b53f-ab344aac6ceb-kube-api-access-2bq6f\") on node \"ip-172-31-25-121\" DevicePath \"\"" Sep 12 22:08:46.113317 kubelet[3235]: I0912 22:08:46.113280 3235 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2c50c08b-629a-439f-b53f-ab344aac6ceb-whisker-backend-key-pair\") on node \"ip-172-31-25-121\" DevicePath \"\"" Sep 12 22:08:46.176681 systemd[1]: Removed slice kubepods-besteffort-pod2c50c08b_629a_439f_b53f_ab344aac6ceb.slice - libcontainer container kubepods-besteffort-pod2c50c08b_629a_439f_b53f_ab344aac6ceb.slice. Sep 12 22:08:46.558711 systemd[1]: Created slice kubepods-besteffort-poda74c9e07_5bfc_42f4_9aec_0c22d7bd3828.slice - libcontainer container kubepods-besteffort-poda74c9e07_5bfc_42f4_9aec_0c22d7bd3828.slice. 
Sep 12 22:08:46.559267 kubelet[3235]: W0912 22:08:46.558983 3235 reflector.go:561] object-"calico-system"/"whisker-backend-key-pair": failed to list *v1.Secret: secrets "whisker-backend-key-pair" is forbidden: User "system:node:ip-172-31-25-121" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ip-172-31-25-121' and this object Sep 12 22:08:46.559267 kubelet[3235]: E0912 22:08:46.559040 3235 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"whisker-backend-key-pair\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"whisker-backend-key-pair\" is forbidden: User \"system:node:ip-172-31-25-121\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ip-172-31-25-121' and this object" logger="UnhandledError" Sep 12 22:08:46.562496 kubelet[3235]: W0912 22:08:46.562338 3235 reflector.go:561] object-"calico-system"/"whisker-ca-bundle": failed to list *v1.ConfigMap: configmaps "whisker-ca-bundle" is forbidden: User "system:node:ip-172-31-25-121" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ip-172-31-25-121' and this object Sep 12 22:08:46.562496 kubelet[3235]: E0912 22:08:46.562409 3235 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"whisker-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"whisker-ca-bundle\" is forbidden: User \"system:node:ip-172-31-25-121\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ip-172-31-25-121' and this object" logger="UnhandledError" Sep 12 22:08:46.710552 containerd[1931]: time="2025-09-12T22:08:46.710490555Z" level=info msg="TaskExit event in podsandbox handler container_id:\"047c3012e98a295a31b96022eeaf3fb2de10c5df5e52d84d1b1534d926fb4b07\" id:\"963afc3fb1086586d7ca7c5cc7415ccb669d687de86c59151adfbc2af2f78a9d\" pid:4617 exit_status:1 exited_at:{seconds:1757714926 nanos:710003979}" Sep 12 22:08:46.717405 kubelet[3235]: I0912 22:08:46.717181 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqvbh\" (UniqueName: \"kubernetes.io/projected/a74c9e07-5bfc-42f4-9aec-0c22d7bd3828-kube-api-access-cqvbh\") pod \"whisker-84bc8f6f49-mc9wd\" (UID: \"a74c9e07-5bfc-42f4-9aec-0c22d7bd3828\") " pod="calico-system/whisker-84bc8f6f49-mc9wd" Sep 12 22:08:46.717405 kubelet[3235]: I0912 22:08:46.717263 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a74c9e07-5bfc-42f4-9aec-0c22d7bd3828-whisker-ca-bundle\") pod \"whisker-84bc8f6f49-mc9wd\" (UID: \"a74c9e07-5bfc-42f4-9aec-0c22d7bd3828\") " pod="calico-system/whisker-84bc8f6f49-mc9wd" Sep 12 22:08:46.717405 kubelet[3235]: I0912 22:08:46.717309 3235 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a74c9e07-5bfc-42f4-9aec-0c22d7bd3828-whisker-backend-key-pair\") pod \"whisker-84bc8f6f49-mc9wd\" (UID: \"a74c9e07-5bfc-42f4-9aec-0c22d7bd3828\") " pod="calico-system/whisker-84bc8f6f49-mc9wd" Sep 12 22:08:47.153685 containerd[1931]: time="2025-09-12T22:08:47.153310549Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-7988f88666-htbqv,Uid:ee3e10d8-3f92-4d84-802c-035178599eb9,Namespace:calico-system,Attempt:0,}" Sep 12 22:08:47.560473 (udev-worker)[4572]: Network interface NamePolicy= disabled on kernel command line. Sep 12 22:08:47.570349 systemd-networkd[1811]: califa67c3b8e59: Link UP Sep 12 22:08:47.574104 systemd-networkd[1811]: califa67c3b8e59: Gained carrier Sep 12 22:08:47.625124 containerd[1931]: 2025-09-12 22:08:47.200 [INFO][4630] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:08:47.625124 containerd[1931]: 2025-09-12 22:08:47.277 [INFO][4630] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--25--121-k8s-goldmane--7988f88666--htbqv-eth0 goldmane-7988f88666- calico-system ee3e10d8-3f92-4d84-802c-035178599eb9 814 0 2025-09-12 22:08:24 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ip-172-31-25-121 goldmane-7988f88666-htbqv eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] califa67c3b8e59 [] [] }} ContainerID="763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5" Namespace="calico-system" Pod="goldmane-7988f88666-htbqv" WorkloadEndpoint="ip--172--31--25--121-k8s-goldmane--7988f88666--htbqv-" Sep 12 22:08:47.625124 containerd[1931]: 2025-09-12 22:08:47.277 [INFO][4630] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5" Namespace="calico-system" Pod="goldmane-7988f88666-htbqv" WorkloadEndpoint="ip--172--31--25--121-k8s-goldmane--7988f88666--htbqv-eth0" Sep 12 22:08:47.625124 containerd[1931]: 2025-09-12 22:08:47.435 [INFO][4641] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5" HandleID="k8s-pod-network.763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5" Workload="ip--172--31--25--121-k8s-goldmane--7988f88666--htbqv-eth0" Sep 12 22:08:47.625712 containerd[1931]: 2025-09-12 22:08:47.438 [INFO][4641] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5" HandleID="k8s-pod-network.763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5" Workload="ip--172--31--25--121-k8s-goldmane--7988f88666--htbqv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d9f0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-25-121", "pod":"goldmane-7988f88666-htbqv", "timestamp":"2025-09-12 22:08:47.435060111 +0000 UTC"}, Hostname:"ip-172-31-25-121", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:08:47.625712 containerd[1931]: 2025-09-12 22:08:47.438 [INFO][4641] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:08:47.625712 containerd[1931]: 2025-09-12 22:08:47.438 [INFO][4641] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:08:47.625712 containerd[1931]: 2025-09-12 22:08:47.438 [INFO][4641] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-25-121' Sep 12 22:08:47.625712 containerd[1931]: 2025-09-12 22:08:47.481 [INFO][4641] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5" host="ip-172-31-25-121" Sep 12 22:08:47.625712 containerd[1931]: 2025-09-12 22:08:47.495 [INFO][4641] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-25-121" Sep 12 22:08:47.625712 containerd[1931]: 2025-09-12 22:08:47.503 [INFO][4641] ipam/ipam.go 511: Trying affinity for 192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:47.625712 containerd[1931]: 2025-09-12 22:08:47.507 [INFO][4641] ipam/ipam.go 158: Attempting to load block cidr=192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:47.625712 containerd[1931]: 2025-09-12 22:08:47.511 [INFO][4641] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:47.627532 containerd[1931]: 2025-09-12 22:08:47.511 [INFO][4641] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.65.64/26 handle="k8s-pod-network.763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5" host="ip-172-31-25-121" Sep 12 22:08:47.627532 containerd[1931]: 2025-09-12 22:08:47.514 [INFO][4641] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5 Sep 12 22:08:47.627532 containerd[1931]: 2025-09-12 22:08:47.521 [INFO][4641] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.65.64/26 handle="k8s-pod-network.763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5" host="ip-172-31-25-121" Sep 12 22:08:47.627532 containerd[1931]: 2025-09-12 22:08:47.533 [INFO][4641] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.65.65/26] block=192.168.65.64/26 handle="k8s-pod-network.763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5" host="ip-172-31-25-121" Sep 12 22:08:47.627532 containerd[1931]: 2025-09-12 22:08:47.533 [INFO][4641] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.65.65/26] handle="k8s-pod-network.763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5" host="ip-172-31-25-121" Sep 12 22:08:47.627532 containerd[1931]: 2025-09-12 22:08:47.533 [INFO][4641] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 22:08:47.627532 containerd[1931]: 2025-09-12 22:08:47.533 [INFO][4641] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.65.65/26] IPv6=[] ContainerID="763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5" HandleID="k8s-pod-network.763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5" Workload="ip--172--31--25--121-k8s-goldmane--7988f88666--htbqv-eth0" Sep 12 22:08:47.628533 containerd[1931]: 2025-09-12 22:08:47.543 [INFO][4630] cni-plugin/k8s.go 418: Populated endpoint ContainerID="763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5" Namespace="calico-system" Pod="goldmane-7988f88666-htbqv" WorkloadEndpoint="ip--172--31--25--121-k8s-goldmane--7988f88666--htbqv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--25--121-k8s-goldmane--7988f88666--htbqv-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"ee3e10d8-3f92-4d84-802c-035178599eb9", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 8, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-25-121", ContainerID:"", Pod:"goldmane-7988f88666-htbqv", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.65.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"califa67c3b8e59", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:08:47.628533 containerd[1931]: 2025-09-12 22:08:47.543 [INFO][4630] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.65.65/32] ContainerID="763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5" Namespace="calico-system" Pod="goldmane-7988f88666-htbqv" WorkloadEndpoint="ip--172--31--25--121-k8s-goldmane--7988f88666--htbqv-eth0" Sep 12 22:08:47.629370 containerd[1931]: 2025-09-12 22:08:47.543 [INFO][4630] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califa67c3b8e59 ContainerID="763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5" Namespace="calico-system" Pod="goldmane-7988f88666-htbqv" WorkloadEndpoint="ip--172--31--25--121-k8s-goldmane--7988f88666--htbqv-eth0" Sep 12 22:08:47.629370 containerd[1931]: 2025-09-12 22:08:47.575 [INFO][4630] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5" Namespace="calico-system" Pod="goldmane-7988f88666-htbqv" WorkloadEndpoint="ip--172--31--25--121-k8s-goldmane--7988f88666--htbqv-eth0" Sep 12 22:08:47.629480 containerd[1931]: 2025-09-12 22:08:47.576 [INFO][4630] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5" Namespace="calico-system" Pod="goldmane-7988f88666-htbqv" 
WorkloadEndpoint="ip--172--31--25--121-k8s-goldmane--7988f88666--htbqv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--25--121-k8s-goldmane--7988f88666--htbqv-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"ee3e10d8-3f92-4d84-802c-035178599eb9", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 8, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-25-121", ContainerID:"763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5", Pod:"goldmane-7988f88666-htbqv", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.65.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"califa67c3b8e59", MAC:"aa:55:9c:15:b5:6e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:08:47.629928 containerd[1931]: 2025-09-12 22:08:47.619 [INFO][4630] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5" Namespace="calico-system" Pod="goldmane-7988f88666-htbqv" WorkloadEndpoint="ip--172--31--25--121-k8s-goldmane--7988f88666--htbqv-eth0" Sep 12 22:08:47.725643 containerd[1931]: time="2025-09-12T22:08:47.725024440Z" level=info msg="connecting to shim 763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5" address="unix:///run/containerd/s/0293489615e03a104e0c527129ba982946c719c305238a3bdc7500bb6cffb457" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:08:47.861891 systemd[1]: Started cri-containerd-763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5.scope - libcontainer container 763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5. 
Sep 12 22:08:48.056937 containerd[1931]: time="2025-09-12T22:08:48.056864234Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-htbqv,Uid:ee3e10d8-3f92-4d84-802c-035178599eb9,Namespace:calico-system,Attempt:0,} returns sandbox id \"763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5\"" Sep 12 22:08:48.063689 containerd[1931]: time="2025-09-12T22:08:48.063554894Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 22:08:48.070459 containerd[1931]: time="2025-09-12T22:08:48.069633818Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-84bc8f6f49-mc9wd,Uid:a74c9e07-5bfc-42f4-9aec-0c22d7bd3828,Namespace:calico-system,Attempt:0,}" Sep 12 22:08:48.157453 containerd[1931]: time="2025-09-12T22:08:48.156859706Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-npwnh,Uid:746510a2-33f4-43d1-a7e4-66416d6cce62,Namespace:calico-system,Attempt:0,}" Sep 12 22:08:48.190411 containerd[1931]: time="2025-09-12T22:08:48.190345682Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-lbgt4,Uid:79d77081-eb0e-4e25-a57d-0f78364fbfa6,Namespace:kube-system,Attempt:0,}" Sep 12 22:08:48.191391 containerd[1931]: time="2025-09-12T22:08:48.191165450Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f6c548746-mwv4c,Uid:30c5d4d2-8264-4587-95b9-fdd641501e94,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:08:48.198915 kubelet[3235]: I0912 22:08:48.197827 3235 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c50c08b-629a-439f-b53f-ab344aac6ceb" path="/var/lib/kubelet/pods/2c50c08b-629a-439f-b53f-ab344aac6ceb/volumes" Sep 12 22:08:48.837075 systemd-networkd[1811]: cali459b3829441: Link UP Sep 12 22:08:48.840240 systemd-networkd[1811]: cali459b3829441: Gained carrier Sep 12 22:08:48.904775 containerd[1931]: 2025-09-12 22:08:48.249 [INFO][4790] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:08:48.904775 containerd[1931]: 2025-09-12 22:08:48.335 [INFO][4790] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--25--121-k8s-whisker--84bc8f6f49--mc9wd-eth0 whisker-84bc8f6f49- calico-system a74c9e07-5bfc-42f4-9aec-0c22d7bd3828 896 0 2025-09-12 22:08:46 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:84bc8f6f49 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ip-172-31-25-121 whisker-84bc8f6f49-mc9wd eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali459b3829441 [] [] }} ContainerID="3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9" Namespace="calico-system" Pod="whisker-84bc8f6f49-mc9wd" WorkloadEndpoint="ip--172--31--25--121-k8s-whisker--84bc8f6f49--mc9wd-" Sep 12 22:08:48.904775 containerd[1931]: 2025-09-12 22:08:48.336 [INFO][4790] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9" Namespace="calico-system" Pod="whisker-84bc8f6f49-mc9wd" WorkloadEndpoint="ip--172--31--25--121-k8s-whisker--84bc8f6f49--mc9wd-eth0" Sep 12 22:08:48.904775 containerd[1931]: 2025-09-12 22:08:48.566 [INFO][4845] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9" 
HandleID="k8s-pod-network.3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9" Workload="ip--172--31--25--121-k8s-whisker--84bc8f6f49--mc9wd-eth0" Sep 12 22:08:48.905588 containerd[1931]: 2025-09-12 22:08:48.571 [INFO][4845] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9" HandleID="k8s-pod-network.3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9" Workload="ip--172--31--25--121-k8s-whisker--84bc8f6f49--mc9wd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40004eefb0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-25-121", "pod":"whisker-84bc8f6f49-mc9wd", "timestamp":"2025-09-12 22:08:48.566815828 +0000 UTC"}, Hostname:"ip-172-31-25-121", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:08:48.905588 containerd[1931]: 2025-09-12 22:08:48.572 [INFO][4845] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:08:48.905588 containerd[1931]: 2025-09-12 22:08:48.572 [INFO][4845] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:08:48.905588 containerd[1931]: 2025-09-12 22:08:48.573 [INFO][4845] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-25-121' Sep 12 22:08:48.905588 containerd[1931]: 2025-09-12 22:08:48.615 [INFO][4845] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9" host="ip-172-31-25-121" Sep 12 22:08:48.905588 containerd[1931]: 2025-09-12 22:08:48.662 [INFO][4845] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-25-121" Sep 12 22:08:48.905588 containerd[1931]: 2025-09-12 22:08:48.694 [INFO][4845] ipam/ipam.go 511: Trying affinity for 192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:48.905588 containerd[1931]: 2025-09-12 22:08:48.716 [INFO][4845] ipam/ipam.go 158: Attempting to load block cidr=192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:48.905588 containerd[1931]: 2025-09-12 22:08:48.734 [INFO][4845] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:48.906030 containerd[1931]: 2025-09-12 22:08:48.736 [INFO][4845] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.65.64/26 handle="k8s-pod-network.3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9" host="ip-172-31-25-121" Sep 12 22:08:48.906030 containerd[1931]: 2025-09-12 22:08:48.746 [INFO][4845] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9 Sep 12 22:08:48.906030 containerd[1931]: 2025-09-12 22:08:48.764 [INFO][4845] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.65.64/26 handle="k8s-pod-network.3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9" host="ip-172-31-25-121" Sep 12 22:08:48.906030 containerd[1931]: 2025-09-12 22:08:48.794 [INFO][4845] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.65.66/26] block=192.168.65.64/26 handle="k8s-pod-network.3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9" host="ip-172-31-25-121" Sep 12 22:08:48.906030 containerd[1931]: 2025-09-12 22:08:48.795 [INFO][4845] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.65.66/26] 
handle="k8s-pod-network.3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9" host="ip-172-31-25-121" Sep 12 22:08:48.906030 containerd[1931]: 2025-09-12 22:08:48.795 [INFO][4845] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:08:48.906030 containerd[1931]: 2025-09-12 22:08:48.796 [INFO][4845] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.65.66/26] IPv6=[] ContainerID="3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9" HandleID="k8s-pod-network.3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9" Workload="ip--172--31--25--121-k8s-whisker--84bc8f6f49--mc9wd-eth0" Sep 12 22:08:48.909623 containerd[1931]: 2025-09-12 22:08:48.813 [INFO][4790] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9" Namespace="calico-system" Pod="whisker-84bc8f6f49-mc9wd" WorkloadEndpoint="ip--172--31--25--121-k8s-whisker--84bc8f6f49--mc9wd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--25--121-k8s-whisker--84bc8f6f49--mc9wd-eth0", GenerateName:"whisker-84bc8f6f49-", Namespace:"calico-system", SelfLink:"", UID:"a74c9e07-5bfc-42f4-9aec-0c22d7bd3828", ResourceVersion:"896", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 8, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"84bc8f6f49", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-25-121", ContainerID:"", Pod:"whisker-84bc8f6f49-mc9wd", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.65.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali459b3829441", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:08:48.909623 containerd[1931]: 2025-09-12 22:08:48.816 [INFO][4790] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.65.66/32] ContainerID="3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9" Namespace="calico-system" Pod="whisker-84bc8f6f49-mc9wd" WorkloadEndpoint="ip--172--31--25--121-k8s-whisker--84bc8f6f49--mc9wd-eth0" Sep 12 22:08:48.909815 containerd[1931]: 2025-09-12 22:08:48.818 [INFO][4790] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali459b3829441 ContainerID="3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9" Namespace="calico-system" Pod="whisker-84bc8f6f49-mc9wd" WorkloadEndpoint="ip--172--31--25--121-k8s-whisker--84bc8f6f49--mc9wd-eth0" Sep 12 22:08:48.909815 containerd[1931]: 2025-09-12 22:08:48.838 [INFO][4790] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9" Namespace="calico-system" Pod="whisker-84bc8f6f49-mc9wd" WorkloadEndpoint="ip--172--31--25--121-k8s-whisker--84bc8f6f49--mc9wd-eth0" Sep 12 22:08:48.909991 containerd[1931]: 2025-09-12 22:08:48.846 [INFO][4790] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9" Namespace="calico-system" Pod="whisker-84bc8f6f49-mc9wd" WorkloadEndpoint="ip--172--31--25--121-k8s-whisker--84bc8f6f49--mc9wd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--25--121-k8s-whisker--84bc8f6f49--mc9wd-eth0", GenerateName:"whisker-84bc8f6f49-", Namespace:"calico-system", SelfLink:"", UID:"a74c9e07-5bfc-42f4-9aec-0c22d7bd3828", ResourceVersion:"896", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 8, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"84bc8f6f49", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-25-121", ContainerID:"3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9", Pod:"whisker-84bc8f6f49-mc9wd", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.65.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali459b3829441", MAC:"2e:3e:e7:96:15:5b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:08:48.912178 containerd[1931]: 2025-09-12 22:08:48.885 [INFO][4790] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9" Namespace="calico-system" Pod="whisker-84bc8f6f49-mc9wd" WorkloadEndpoint="ip--172--31--25--121-k8s-whisker--84bc8f6f49--mc9wd-eth0" Sep 12 22:08:48.993629 containerd[1931]: time="2025-09-12T22:08:48.993342966Z" level=info msg="connecting to shim 3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9" address="unix:///run/containerd/s/cd7fb35210df0660321f04366d646c42bcd3067f565c2afc775ab66896eda644" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:08:49.032173 systemd-networkd[1811]: cali87bcb234edf: Link UP Sep 12 22:08:49.036712 systemd-networkd[1811]: cali87bcb234edf: Gained carrier Sep 12 22:08:49.113329 containerd[1931]: 2025-09-12 22:08:48.361 [INFO][4805] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:08:49.113329 containerd[1931]: 2025-09-12 22:08:48.467 [INFO][4805] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--25--121-k8s-csi--node--driver--npwnh-eth0 csi-node-driver- calico-system 746510a2-33f4-43d1-a7e4-66416d6cce62 704 0 2025-09-12 22:08:25 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ip-172-31-25-121 csi-node-driver-npwnh eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali87bcb234edf [] [] }} 
ContainerID="8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc" Namespace="calico-system" Pod="csi-node-driver-npwnh" WorkloadEndpoint="ip--172--31--25--121-k8s-csi--node--driver--npwnh-" Sep 12 22:08:49.113329 containerd[1931]: 2025-09-12 22:08:48.467 [INFO][4805] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc" Namespace="calico-system" Pod="csi-node-driver-npwnh" WorkloadEndpoint="ip--172--31--25--121-k8s-csi--node--driver--npwnh-eth0" Sep 12 22:08:49.113329 containerd[1931]: 2025-09-12 22:08:48.724 [INFO][4855] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc" HandleID="k8s-pod-network.8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc" Workload="ip--172--31--25--121-k8s-csi--node--driver--npwnh-eth0" Sep 12 22:08:49.113674 containerd[1931]: 2025-09-12 22:08:48.724 [INFO][4855] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc" HandleID="k8s-pod-network.8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc" Workload="ip--172--31--25--121-k8s-csi--node--driver--npwnh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400033f6d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-25-121", "pod":"csi-node-driver-npwnh", "timestamp":"2025-09-12 22:08:48.724655945 +0000 UTC"}, Hostname:"ip-172-31-25-121", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:08:49.113674 containerd[1931]: 2025-09-12 22:08:48.725 [INFO][4855] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:08:49.113674 containerd[1931]: 2025-09-12 22:08:48.797 [INFO][4855] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:08:49.113674 containerd[1931]: 2025-09-12 22:08:48.797 [INFO][4855] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-25-121' Sep 12 22:08:49.113674 containerd[1931]: 2025-09-12 22:08:48.829 [INFO][4855] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc" host="ip-172-31-25-121" Sep 12 22:08:49.113674 containerd[1931]: 2025-09-12 22:08:48.876 [INFO][4855] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-25-121" Sep 12 22:08:49.113674 containerd[1931]: 2025-09-12 22:08:48.912 [INFO][4855] ipam/ipam.go 511: Trying affinity for 192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:49.113674 containerd[1931]: 2025-09-12 22:08:48.928 [INFO][4855] ipam/ipam.go 158: Attempting to load block cidr=192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:49.113674 containerd[1931]: 2025-09-12 22:08:48.935 [INFO][4855] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:49.115163 containerd[1931]: 2025-09-12 22:08:48.937 [INFO][4855] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.65.64/26 handle="k8s-pod-network.8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc" host="ip-172-31-25-121" Sep 12 22:08:49.115163 containerd[1931]: 2025-09-12 22:08:48.941 [INFO][4855] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc Sep 12 22:08:49.115163 containerd[1931]: 2025-09-12 22:08:48.968 [INFO][4855] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.65.64/26 handle="k8s-pod-network.8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc" host="ip-172-31-25-121" Sep 12 22:08:49.115163 containerd[1931]: 2025-09-12 22:08:48.991 [INFO][4855] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.65.67/26] block=192.168.65.64/26 handle="k8s-pod-network.8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc" host="ip-172-31-25-121" Sep 12 22:08:49.115163 containerd[1931]: 2025-09-12 22:08:48.991 [INFO][4855] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.65.67/26] handle="k8s-pod-network.8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc" host="ip-172-31-25-121" Sep 12 22:08:49.115163 containerd[1931]: 2025-09-12 22:08:48.991 [INFO][4855] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 22:08:49.115163 containerd[1931]: 2025-09-12 22:08:48.991 [INFO][4855] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.65.67/26] IPv6=[] ContainerID="8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc" HandleID="k8s-pod-network.8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc" Workload="ip--172--31--25--121-k8s-csi--node--driver--npwnh-eth0" Sep 12 22:08:49.115585 containerd[1931]: 2025-09-12 22:08:49.003 [INFO][4805] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc" Namespace="calico-system" Pod="csi-node-driver-npwnh" WorkloadEndpoint="ip--172--31--25--121-k8s-csi--node--driver--npwnh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--25--121-k8s-csi--node--driver--npwnh-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"746510a2-33f4-43d1-a7e4-66416d6cce62", ResourceVersion:"704", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 8, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-25-121", ContainerID:"", Pod:"csi-node-driver-npwnh", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.65.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali87bcb234edf", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:08:49.115719 containerd[1931]: 2025-09-12 22:08:49.005 [INFO][4805] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.65.67/32] ContainerID="8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc" Namespace="calico-system" Pod="csi-node-driver-npwnh" WorkloadEndpoint="ip--172--31--25--121-k8s-csi--node--driver--npwnh-eth0" Sep 12 22:08:49.115719 containerd[1931]: 2025-09-12 22:08:49.007 [INFO][4805] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali87bcb234edf ContainerID="8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc" Namespace="calico-system" Pod="csi-node-driver-npwnh" WorkloadEndpoint="ip--172--31--25--121-k8s-csi--node--driver--npwnh-eth0" Sep 12 22:08:49.115719 containerd[1931]: 2025-09-12 22:08:49.042 [INFO][4805] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc" Namespace="calico-system" Pod="csi-node-driver-npwnh" WorkloadEndpoint="ip--172--31--25--121-k8s-csi--node--driver--npwnh-eth0" Sep 12 22:08:49.115861 containerd[1931]: 2025-09-12 22:08:49.049 [INFO][4805] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc" 
Namespace="calico-system" Pod="csi-node-driver-npwnh" WorkloadEndpoint="ip--172--31--25--121-k8s-csi--node--driver--npwnh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--25--121-k8s-csi--node--driver--npwnh-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"746510a2-33f4-43d1-a7e4-66416d6cce62", ResourceVersion:"704", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 8, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-25-121", ContainerID:"8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc", Pod:"csi-node-driver-npwnh", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.65.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali87bcb234edf", MAC:"de:50:d9:91:df:45", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:08:49.116873 containerd[1931]: 2025-09-12 22:08:49.103 [INFO][4805] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc" Namespace="calico-system" Pod="csi-node-driver-npwnh" WorkloadEndpoint="ip--172--31--25--121-k8s-csi--node--driver--npwnh-eth0" Sep 12 22:08:49.145666 systemd[1]: Started cri-containerd-3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9.scope - libcontainer container 3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9. 
Sep 12 22:08:49.159541 containerd[1931]: time="2025-09-12T22:08:49.159075507Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f6c548746-24zd8,Uid:6945a7e6-ff59-498b-a0da-e2121447b792,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:08:49.161755 containerd[1931]: time="2025-09-12T22:08:49.161521347Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-vl2mt,Uid:73e2ed8d-89f7-42c2-9717-a3369e2dbf0b,Namespace:kube-system,Attempt:0,}" Sep 12 22:08:49.225085 systemd-networkd[1811]: cali7446dc41199: Link UP Sep 12 22:08:49.229504 systemd-networkd[1811]: cali7446dc41199: Gained carrier Sep 12 22:08:49.310744 containerd[1931]: 2025-09-12 22:08:48.579 [INFO][4829] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:08:49.310744 containerd[1931]: 2025-09-12 22:08:48.682 [INFO][4829] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--25--121-k8s-calico--apiserver--f6c548746--mwv4c-eth0 calico-apiserver-f6c548746- calico-apiserver 30c5d4d2-8264-4587-95b9-fdd641501e94 812 0 2025-09-12 22:08:18 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:f6c548746 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-25-121 calico-apiserver-f6c548746-mwv4c eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali7446dc41199 [] [] }} ContainerID="c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571" Namespace="calico-apiserver" Pod="calico-apiserver-f6c548746-mwv4c" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--mwv4c-" Sep 12 22:08:49.310744 containerd[1931]: 2025-09-12 22:08:48.682 [INFO][4829] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571" Namespace="calico-apiserver" Pod="calico-apiserver-f6c548746-mwv4c" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--mwv4c-eth0" Sep 12 22:08:49.310744 containerd[1931]: 2025-09-12 22:08:48.810 [INFO][4873] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571" HandleID="k8s-pod-network.c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571" Workload="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--mwv4c-eth0" Sep 12 22:08:49.311217 containerd[1931]: 2025-09-12 22:08:48.811 [INFO][4873] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571" HandleID="k8s-pod-network.c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571" Workload="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--mwv4c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40000f8520), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-25-121", "pod":"calico-apiserver-f6c548746-mwv4c", "timestamp":"2025-09-12 22:08:48.810029441 +0000 UTC"}, Hostname:"ip-172-31-25-121", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:08:49.311217 containerd[1931]: 2025-09-12 22:08:48.811 [INFO][4873] ipam/ipam_plugin.go 353: About to acquire 
host-wide IPAM lock. Sep 12 22:08:49.311217 containerd[1931]: 2025-09-12 22:08:48.992 [INFO][4873] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:08:49.311217 containerd[1931]: 2025-09-12 22:08:48.992 [INFO][4873] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-25-121' Sep 12 22:08:49.311217 containerd[1931]: 2025-09-12 22:08:49.049 [INFO][4873] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571" host="ip-172-31-25-121" Sep 12 22:08:49.311217 containerd[1931]: 2025-09-12 22:08:49.083 [INFO][4873] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-25-121" Sep 12 22:08:49.311217 containerd[1931]: 2025-09-12 22:08:49.101 [INFO][4873] ipam/ipam.go 511: Trying affinity for 192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:49.311217 containerd[1931]: 2025-09-12 22:08:49.110 [INFO][4873] ipam/ipam.go 158: Attempting to load block cidr=192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:49.311217 containerd[1931]: 2025-09-12 22:08:49.131 [INFO][4873] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:49.311669 containerd[1931]: 2025-09-12 22:08:49.131 [INFO][4873] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.65.64/26 handle="k8s-pod-network.c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571" host="ip-172-31-25-121" Sep 12 22:08:49.311669 containerd[1931]: 2025-09-12 22:08:49.140 [INFO][4873] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571 Sep 12 22:08:49.311669 containerd[1931]: 2025-09-12 22:08:49.162 [INFO][4873] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.65.64/26 handle="k8s-pod-network.c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571" host="ip-172-31-25-121" Sep 12 22:08:49.311669 containerd[1931]: 2025-09-12 22:08:49.196 [INFO][4873] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.65.68/26] block=192.168.65.64/26 handle="k8s-pod-network.c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571" host="ip-172-31-25-121" Sep 12 22:08:49.311669 containerd[1931]: 2025-09-12 22:08:49.197 [INFO][4873] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.65.68/26] handle="k8s-pod-network.c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571" host="ip-172-31-25-121" Sep 12 22:08:49.311669 containerd[1931]: 2025-09-12 22:08:49.198 [INFO][4873] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 22:08:49.311669 containerd[1931]: 2025-09-12 22:08:49.199 [INFO][4873] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.65.68/26] IPv6=[] ContainerID="c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571" HandleID="k8s-pod-network.c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571" Workload="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--mwv4c-eth0" Sep 12 22:08:49.311994 containerd[1931]: 2025-09-12 22:08:49.211 [INFO][4829] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571" Namespace="calico-apiserver" Pod="calico-apiserver-f6c548746-mwv4c" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--mwv4c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--25--121-k8s-calico--apiserver--f6c548746--mwv4c-eth0", GenerateName:"calico-apiserver-f6c548746-", Namespace:"calico-apiserver", SelfLink:"", UID:"30c5d4d2-8264-4587-95b9-fdd641501e94", ResourceVersion:"812", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 8, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f6c548746", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-25-121", ContainerID:"", Pod:"calico-apiserver-f6c548746-mwv4c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.65.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7446dc41199", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:08:49.312147 containerd[1931]: 2025-09-12 22:08:49.211 [INFO][4829] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.65.68/32] ContainerID="c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571" Namespace="calico-apiserver" Pod="calico-apiserver-f6c548746-mwv4c" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--mwv4c-eth0" Sep 12 22:08:49.312147 containerd[1931]: 2025-09-12 22:08:49.211 [INFO][4829] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7446dc41199 ContainerID="c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571" Namespace="calico-apiserver" Pod="calico-apiserver-f6c548746-mwv4c" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--mwv4c-eth0" Sep 12 22:08:49.312147 containerd[1931]: 2025-09-12 22:08:49.241 [INFO][4829] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571" Namespace="calico-apiserver" Pod="calico-apiserver-f6c548746-mwv4c" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--mwv4c-eth0" Sep 12 22:08:49.312317 containerd[1931]: 2025-09-12 22:08:49.246 [INFO][4829] cni-plugin/k8s.go 446: Added Mac, interface name, and 
active container ID to endpoint ContainerID="c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571" Namespace="calico-apiserver" Pod="calico-apiserver-f6c548746-mwv4c" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--mwv4c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--25--121-k8s-calico--apiserver--f6c548746--mwv4c-eth0", GenerateName:"calico-apiserver-f6c548746-", Namespace:"calico-apiserver", SelfLink:"", UID:"30c5d4d2-8264-4587-95b9-fdd641501e94", ResourceVersion:"812", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 8, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f6c548746", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-25-121", ContainerID:"c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571", Pod:"calico-apiserver-f6c548746-mwv4c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.65.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7446dc41199", MAC:"aa:e3:5a:6a:53:3d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:08:49.312440 containerd[1931]: 2025-09-12 22:08:49.278 [INFO][4829] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571" Namespace="calico-apiserver" Pod="calico-apiserver-f6c548746-mwv4c" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--mwv4c-eth0" Sep 12 22:08:49.348501 containerd[1931]: time="2025-09-12T22:08:49.348421372Z" level=info msg="connecting to shim 8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc" address="unix:///run/containerd/s/57a25b3c20000973378a0b7b7212b5dc79e3fe36b354fc5449ddfbd7da379e51" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:08:49.358273 systemd-networkd[1811]: califa67c3b8e59: Gained IPv6LL Sep 12 22:08:49.431587 systemd-networkd[1811]: cali2168ad9bd85: Link UP Sep 12 22:08:49.445671 systemd-networkd[1811]: cali2168ad9bd85: Gained carrier Sep 12 22:08:49.530465 systemd[1]: Started cri-containerd-8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc.scope - libcontainer container 8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc. 
Sep 12 22:08:49.544276 containerd[1931]: 2025-09-12 22:08:48.519 [INFO][4818] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:08:49.544276 containerd[1931]: 2025-09-12 22:08:48.586 [INFO][4818] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--25--121-k8s-coredns--7c65d6cfc9--lbgt4-eth0 coredns-7c65d6cfc9- kube-system 79d77081-eb0e-4e25-a57d-0f78364fbfa6 815 0 2025-09-12 22:08:02 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-25-121 coredns-7c65d6cfc9-lbgt4 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali2168ad9bd85 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lbgt4" WorkloadEndpoint="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--lbgt4-" Sep 12 22:08:49.544276 containerd[1931]: 2025-09-12 22:08:48.587 [INFO][4818] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lbgt4" WorkloadEndpoint="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--lbgt4-eth0" Sep 12 22:08:49.544276 containerd[1931]: 2025-09-12 22:08:48.920 [INFO][4865] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641" HandleID="k8s-pod-network.c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641" Workload="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--lbgt4-eth0" Sep 12 22:08:49.545436 containerd[1931]: 2025-09-12 22:08:48.921 [INFO][4865] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641" HandleID="k8s-pod-network.c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641" Workload="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--lbgt4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000103d30), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-25-121", "pod":"coredns-7c65d6cfc9-lbgt4", "timestamp":"2025-09-12 22:08:48.919957962 +0000 UTC"}, Hostname:"ip-172-31-25-121", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:08:49.545436 containerd[1931]: 2025-09-12 22:08:48.922 [INFO][4865] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:08:49.545436 containerd[1931]: 2025-09-12 22:08:49.198 [INFO][4865] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:08:49.545436 containerd[1931]: 2025-09-12 22:08:49.199 [INFO][4865] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-25-121' Sep 12 22:08:49.545436 containerd[1931]: 2025-09-12 22:08:49.252 [INFO][4865] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641" host="ip-172-31-25-121" Sep 12 22:08:49.545436 containerd[1931]: 2025-09-12 22:08:49.266 [INFO][4865] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-25-121" Sep 12 22:08:49.545436 containerd[1931]: 2025-09-12 22:08:49.288 [INFO][4865] ipam/ipam.go 511: Trying affinity for 192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:49.545436 containerd[1931]: 2025-09-12 22:08:49.305 [INFO][4865] ipam/ipam.go 158: Attempting to load block cidr=192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:49.545436 containerd[1931]: 2025-09-12 22:08:49.316 [INFO][4865] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:49.545873 containerd[1931]: 2025-09-12 22:08:49.320 [INFO][4865] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.65.64/26 handle="k8s-pod-network.c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641" host="ip-172-31-25-121" Sep 12 22:08:49.545873 containerd[1931]: 2025-09-12 22:08:49.334 [INFO][4865] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641 Sep 12 22:08:49.545873 containerd[1931]: 2025-09-12 22:08:49.351 [INFO][4865] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.65.64/26 handle="k8s-pod-network.c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641" host="ip-172-31-25-121" Sep 12 22:08:49.545873 containerd[1931]: 2025-09-12 22:08:49.384 [INFO][4865] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.65.69/26] block=192.168.65.64/26 handle="k8s-pod-network.c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641" host="ip-172-31-25-121" Sep 12 22:08:49.545873 containerd[1931]: 2025-09-12 22:08:49.388 [INFO][4865] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.65.69/26] handle="k8s-pod-network.c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641" host="ip-172-31-25-121" Sep 12 22:08:49.545873 containerd[1931]: 2025-09-12 22:08:49.388 [INFO][4865] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 22:08:49.545873 containerd[1931]: 2025-09-12 22:08:49.388 [INFO][4865] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.65.69/26] IPv6=[] ContainerID="c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641" HandleID="k8s-pod-network.c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641" Workload="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--lbgt4-eth0" Sep 12 22:08:49.546237 containerd[1931]: 2025-09-12 22:08:49.416 [INFO][4818] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lbgt4" WorkloadEndpoint="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--lbgt4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--25--121-k8s-coredns--7c65d6cfc9--lbgt4-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"79d77081-eb0e-4e25-a57d-0f78364fbfa6", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 8, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-25-121", ContainerID:"", Pod:"coredns-7c65d6cfc9-lbgt4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.65.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2168ad9bd85", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:08:49.546237 containerd[1931]: 2025-09-12 22:08:49.416 [INFO][4818] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.65.69/32] ContainerID="c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lbgt4" WorkloadEndpoint="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--lbgt4-eth0" Sep 12 22:08:49.546237 containerd[1931]: 2025-09-12 22:08:49.416 [INFO][4818] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2168ad9bd85 ContainerID="c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lbgt4" WorkloadEndpoint="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--lbgt4-eth0" Sep 12 22:08:49.546237 containerd[1931]: 2025-09-12 22:08:49.445 [INFO][4818] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lbgt4" 
WorkloadEndpoint="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--lbgt4-eth0" Sep 12 22:08:49.546237 containerd[1931]: 2025-09-12 22:08:49.449 [INFO][4818] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lbgt4" WorkloadEndpoint="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--lbgt4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--25--121-k8s-coredns--7c65d6cfc9--lbgt4-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"79d77081-eb0e-4e25-a57d-0f78364fbfa6", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 8, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-25-121", ContainerID:"c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641", Pod:"coredns-7c65d6cfc9-lbgt4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.65.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2168ad9bd85", MAC:"5e:03:fd:cc:91:cb", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:08:49.546237 containerd[1931]: 2025-09-12 22:08:49.494 [INFO][4818] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lbgt4" WorkloadEndpoint="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--lbgt4-eth0" Sep 12 22:08:49.619062 containerd[1931]: time="2025-09-12T22:08:49.618167885Z" level=info msg="connecting to shim c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571" address="unix:///run/containerd/s/1ac170d00990e94c86f6636a9a29d51cb59090e0d5f5acc07ff49a690fe4b4d6" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:08:49.642649 containerd[1931]: time="2025-09-12T22:08:49.642575309Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-84bc8f6f49-mc9wd,Uid:a74c9e07-5bfc-42f4-9aec-0c22d7bd3828,Namespace:calico-system,Attempt:0,} returns sandbox id \"3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9\"" Sep 12 22:08:49.758075 containerd[1931]: time="2025-09-12T22:08:49.757965246Z" level=info msg="connecting to shim c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641" 
address="unix:///run/containerd/s/1d2fa4af6bcaefee9b0f96af2383c3922fd78e58c12ae79cd0867a92fa72db3d" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:08:49.844715 systemd[1]: Started cri-containerd-c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571.scope - libcontainer container c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571. Sep 12 22:08:49.900139 containerd[1931]: time="2025-09-12T22:08:49.899683183Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-npwnh,Uid:746510a2-33f4-43d1-a7e4-66416d6cce62,Namespace:calico-system,Attempt:0,} returns sandbox id \"8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc\"" Sep 12 22:08:50.057650 systemd[1]: Started cri-containerd-c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641.scope - libcontainer container c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641. Sep 12 22:08:50.160146 containerd[1931]: time="2025-09-12T22:08:50.158022580Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68c6947c97-2lbtl,Uid:39f9ed23-d9e9-4acb-bfad-64b584e9c682,Namespace:calico-system,Attempt:0,}" Sep 12 22:08:50.260440 containerd[1931]: time="2025-09-12T22:08:50.260371349Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f6c548746-mwv4c,Uid:30c5d4d2-8264-4587-95b9-fdd641501e94,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571\"" Sep 12 22:08:50.314746 systemd-networkd[1811]: cali5be48fe4a19: Link UP Sep 12 22:08:50.319444 systemd-networkd[1811]: cali5be48fe4a19: Gained carrier Sep 12 22:08:50.411740 containerd[1931]: 2025-09-12 22:08:49.847 [INFO][5038] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--25--121-k8s-calico--apiserver--f6c548746--24zd8-eth0 calico-apiserver-f6c548746- calico-apiserver 6945a7e6-ff59-498b-a0da-e2121447b792 818 0 2025-09-12 22:08:18 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:f6c548746 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-25-121 calico-apiserver-f6c548746-24zd8 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali5be48fe4a19 [] [] }} ContainerID="a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42" Namespace="calico-apiserver" Pod="calico-apiserver-f6c548746-24zd8" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--24zd8-" Sep 12 22:08:50.411740 containerd[1931]: 2025-09-12 22:08:49.851 [INFO][5038] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42" Namespace="calico-apiserver" Pod="calico-apiserver-f6c548746-24zd8" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--24zd8-eth0" Sep 12 22:08:50.411740 containerd[1931]: 2025-09-12 22:08:50.066 [INFO][5111] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42" HandleID="k8s-pod-network.a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42" Workload="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--24zd8-eth0" Sep 12 22:08:50.411740 containerd[1931]: 2025-09-12 22:08:50.067 [INFO][5111] ipam/ipam_plugin.go 265: Auto 
assigning IP ContainerID="a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42" HandleID="k8s-pod-network.a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42" Workload="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--24zd8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400038eb00), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-25-121", "pod":"calico-apiserver-f6c548746-24zd8", "timestamp":"2025-09-12 22:08:50.06677944 +0000 UTC"}, Hostname:"ip-172-31-25-121", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:08:50.411740 containerd[1931]: 2025-09-12 22:08:50.068 [INFO][5111] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:08:50.411740 containerd[1931]: 2025-09-12 22:08:50.069 [INFO][5111] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:08:50.411740 containerd[1931]: 2025-09-12 22:08:50.069 [INFO][5111] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-25-121' Sep 12 22:08:50.411740 containerd[1931]: 2025-09-12 22:08:50.116 [INFO][5111] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42" host="ip-172-31-25-121" Sep 12 22:08:50.411740 containerd[1931]: 2025-09-12 22:08:50.139 [INFO][5111] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-25-121" Sep 12 22:08:50.411740 containerd[1931]: 2025-09-12 22:08:50.157 [INFO][5111] ipam/ipam.go 511: Trying affinity for 192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:50.411740 containerd[1931]: 2025-09-12 22:08:50.171 [INFO][5111] ipam/ipam.go 158: Attempting to load block cidr=192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:50.411740 containerd[1931]: 2025-09-12 22:08:50.207 [INFO][5111] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:50.411740 containerd[1931]: 2025-09-12 22:08:50.208 [INFO][5111] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.65.64/26 handle="k8s-pod-network.a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42" host="ip-172-31-25-121" Sep 12 22:08:50.411740 containerd[1931]: 2025-09-12 22:08:50.218 [INFO][5111] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42 Sep 12 22:08:50.411740 containerd[1931]: 2025-09-12 22:08:50.239 [INFO][5111] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.65.64/26 handle="k8s-pod-network.a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42" host="ip-172-31-25-121" Sep 12 22:08:50.411740 containerd[1931]: 2025-09-12 22:08:50.263 [INFO][5111] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.65.70/26] block=192.168.65.64/26 handle="k8s-pod-network.a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42" host="ip-172-31-25-121" Sep 12 22:08:50.411740 containerd[1931]: 2025-09-12 22:08:50.263 [INFO][5111] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.65.70/26] handle="k8s-pod-network.a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42" host="ip-172-31-25-121" Sep 12 22:08:50.411740 containerd[1931]: 2025-09-12 22:08:50.263 [INFO][5111] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 22:08:50.411740 containerd[1931]: 2025-09-12 22:08:50.263 [INFO][5111] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.65.70/26] IPv6=[] ContainerID="a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42" HandleID="k8s-pod-network.a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42" Workload="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--24zd8-eth0" Sep 12 22:08:50.414327 containerd[1931]: 2025-09-12 22:08:50.290 [INFO][5038] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42" Namespace="calico-apiserver" Pod="calico-apiserver-f6c548746-24zd8" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--24zd8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--25--121-k8s-calico--apiserver--f6c548746--24zd8-eth0", GenerateName:"calico-apiserver-f6c548746-", Namespace:"calico-apiserver", SelfLink:"", UID:"6945a7e6-ff59-498b-a0da-e2121447b792", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 8, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f6c548746", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-25-121", ContainerID:"", Pod:"calico-apiserver-f6c548746-24zd8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.65.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5be48fe4a19", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:08:50.414327 containerd[1931]: 2025-09-12 22:08:50.294 [INFO][5038] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.65.70/32] ContainerID="a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42" Namespace="calico-apiserver" Pod="calico-apiserver-f6c548746-24zd8" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--24zd8-eth0" Sep 12 22:08:50.414327 containerd[1931]: 2025-09-12 22:08:50.296 [INFO][5038] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5be48fe4a19 ContainerID="a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42" Namespace="calico-apiserver" Pod="calico-apiserver-f6c548746-24zd8" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--24zd8-eth0" Sep 12 22:08:50.414327 containerd[1931]: 2025-09-12 22:08:50.318 [INFO][5038] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42" Namespace="calico-apiserver" Pod="calico-apiserver-f6c548746-24zd8" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--24zd8-eth0" Sep 12 22:08:50.414327 containerd[1931]: 2025-09-12 22:08:50.320 [INFO][5038] cni-plugin/k8s.go 446: Added Mac, interface name, and 
active container ID to endpoint ContainerID="a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42" Namespace="calico-apiserver" Pod="calico-apiserver-f6c548746-24zd8" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--24zd8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--25--121-k8s-calico--apiserver--f6c548746--24zd8-eth0", GenerateName:"calico-apiserver-f6c548746-", Namespace:"calico-apiserver", SelfLink:"", UID:"6945a7e6-ff59-498b-a0da-e2121447b792", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 8, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f6c548746", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-25-121", ContainerID:"a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42", Pod:"calico-apiserver-f6c548746-24zd8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.65.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5be48fe4a19", MAC:"32:c4:8f:06:b8:b2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:08:50.414327 containerd[1931]: 2025-09-12 22:08:50.360 [INFO][5038] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42" Namespace="calico-apiserver" Pod="calico-apiserver-f6c548746-24zd8" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--apiserver--f6c548746--24zd8-eth0" Sep 12 22:08:50.505836 containerd[1931]: time="2025-09-12T22:08:50.505781130Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-lbgt4,Uid:79d77081-eb0e-4e25-a57d-0f78364fbfa6,Namespace:kube-system,Attempt:0,} returns sandbox id \"c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641\"" Sep 12 22:08:50.528097 containerd[1931]: time="2025-09-12T22:08:50.526760826Z" level=info msg="CreateContainer within sandbox \"c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 22:08:50.579284 containerd[1931]: time="2025-09-12T22:08:50.576769206Z" level=info msg="connecting to shim a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42" address="unix:///run/containerd/s/bcf13632bd065c19b429dc1e71d68e6c08a10ba2af88c97ea559404c47f123ba" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:08:50.598153 containerd[1931]: time="2025-09-12T22:08:50.592464222Z" level=info msg="Container f0b914143d4698cf21c11821b55de7cf21751f3a765d920a34e307da2aaa371a: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:08:50.636338 systemd-networkd[1811]: cali2168ad9bd85: Gained IPv6LL Sep 12 22:08:50.643808 containerd[1931]: time="2025-09-12T22:08:50.643215450Z" level=info msg="CreateContainer within sandbox 
\"c9a96440eb294de681f878258d0b6edf892b0efd3a8c8049c0f1f7317ccd6641\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"f0b914143d4698cf21c11821b55de7cf21751f3a765d920a34e307da2aaa371a\"" Sep 12 22:08:50.652336 containerd[1931]: time="2025-09-12T22:08:50.652259622Z" level=info msg="StartContainer for \"f0b914143d4698cf21c11821b55de7cf21751f3a765d920a34e307da2aaa371a\"" Sep 12 22:08:50.688014 containerd[1931]: time="2025-09-12T22:08:50.687941503Z" level=info msg="connecting to shim f0b914143d4698cf21c11821b55de7cf21751f3a765d920a34e307da2aaa371a" address="unix:///run/containerd/s/1d2fa4af6bcaefee9b0f96af2383c3922fd78e58c12ae79cd0867a92fa72db3d" protocol=ttrpc version=3 Sep 12 22:08:50.758671 systemd[1]: Started cri-containerd-a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42.scope - libcontainer container a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42. Sep 12 22:08:50.786800 systemd-networkd[1811]: cali4dc57649482: Link UP Sep 12 22:08:50.791967 systemd-networkd[1811]: cali4dc57649482: Gained carrier Sep 12 22:08:50.864081 containerd[1931]: 2025-09-12 22:08:50.050 [INFO][5034] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--25--121-k8s-coredns--7c65d6cfc9--vl2mt-eth0 coredns-7c65d6cfc9- kube-system 73e2ed8d-89f7-42c2-9717-a3369e2dbf0b 808 0 2025-09-12 22:08:02 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-25-121 coredns-7c65d6cfc9-vl2mt eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4dc57649482 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vl2mt" WorkloadEndpoint="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--vl2mt-" Sep 12 22:08:50.864081 containerd[1931]: 2025-09-12 22:08:50.053 [INFO][5034] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vl2mt" WorkloadEndpoint="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--vl2mt-eth0" Sep 12 22:08:50.864081 containerd[1931]: 2025-09-12 22:08:50.473 [INFO][5148] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688" HandleID="k8s-pod-network.aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688" Workload="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--vl2mt-eth0" Sep 12 22:08:50.864081 containerd[1931]: 2025-09-12 22:08:50.478 [INFO][5148] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688" HandleID="k8s-pod-network.aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688" Workload="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--vl2mt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001024a0), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-25-121", "pod":"coredns-7c65d6cfc9-vl2mt", "timestamp":"2025-09-12 22:08:50.47274693 +0000 UTC"}, Hostname:"ip-172-31-25-121", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload"} Sep 12 22:08:50.864081 containerd[1931]: 2025-09-12 22:08:50.479 [INFO][5148] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:08:50.864081 containerd[1931]: 2025-09-12 22:08:50.479 [INFO][5148] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:08:50.864081 containerd[1931]: 2025-09-12 22:08:50.481 [INFO][5148] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-25-121' Sep 12 22:08:50.864081 containerd[1931]: 2025-09-12 22:08:50.539 [INFO][5148] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688" host="ip-172-31-25-121" Sep 12 22:08:50.864081 containerd[1931]: 2025-09-12 22:08:50.582 [INFO][5148] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-25-121" Sep 12 22:08:50.864081 containerd[1931]: 2025-09-12 22:08:50.621 [INFO][5148] ipam/ipam.go 511: Trying affinity for 192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:50.864081 containerd[1931]: 2025-09-12 22:08:50.630 [INFO][5148] ipam/ipam.go 158: Attempting to load block cidr=192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:50.864081 containerd[1931]: 2025-09-12 22:08:50.648 [INFO][5148] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:50.864081 containerd[1931]: 2025-09-12 22:08:50.651 [INFO][5148] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.65.64/26 handle="k8s-pod-network.aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688" host="ip-172-31-25-121" Sep 12 22:08:50.864081 containerd[1931]: 2025-09-12 22:08:50.658 [INFO][5148] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688 Sep 12 22:08:50.864081 containerd[1931]: 2025-09-12 22:08:50.679 [INFO][5148] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.65.64/26 handle="k8s-pod-network.aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688" host="ip-172-31-25-121" Sep 12 22:08:50.864081 containerd[1931]: 2025-09-12 22:08:50.722 [INFO][5148] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.65.71/26] block=192.168.65.64/26 handle="k8s-pod-network.aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688" host="ip-172-31-25-121" Sep 12 22:08:50.864081 containerd[1931]: 2025-09-12 22:08:50.722 [INFO][5148] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.65.71/26] handle="k8s-pod-network.aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688" host="ip-172-31-25-121" Sep 12 22:08:50.864081 containerd[1931]: 2025-09-12 22:08:50.722 [INFO][5148] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 22:08:50.864081 containerd[1931]: 2025-09-12 22:08:50.722 [INFO][5148] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.65.71/26] IPv6=[] ContainerID="aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688" HandleID="k8s-pod-network.aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688" Workload="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--vl2mt-eth0" Sep 12 22:08:50.868056 containerd[1931]: 2025-09-12 22:08:50.759 [INFO][5034] cni-plugin/k8s.go 418: Populated endpoint ContainerID="aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vl2mt" WorkloadEndpoint="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--vl2mt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--25--121-k8s-coredns--7c65d6cfc9--vl2mt-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"73e2ed8d-89f7-42c2-9717-a3369e2dbf0b", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 8, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-25-121", ContainerID:"", Pod:"coredns-7c65d6cfc9-vl2mt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.65.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4dc57649482", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:08:50.868056 containerd[1931]: 2025-09-12 22:08:50.764 [INFO][5034] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.65.71/32] ContainerID="aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vl2mt" WorkloadEndpoint="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--vl2mt-eth0" Sep 12 22:08:50.868056 containerd[1931]: 2025-09-12 22:08:50.764 [INFO][5034] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4dc57649482 ContainerID="aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vl2mt" WorkloadEndpoint="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--vl2mt-eth0" Sep 12 22:08:50.868056 containerd[1931]: 2025-09-12 22:08:50.799 [INFO][5034] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vl2mt" 
WorkloadEndpoint="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--vl2mt-eth0" Sep 12 22:08:50.868056 containerd[1931]: 2025-09-12 22:08:50.804 [INFO][5034] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vl2mt" WorkloadEndpoint="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--vl2mt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--25--121-k8s-coredns--7c65d6cfc9--vl2mt-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"73e2ed8d-89f7-42c2-9717-a3369e2dbf0b", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 8, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-25-121", ContainerID:"aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688", Pod:"coredns-7c65d6cfc9-vl2mt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.65.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4dc57649482", MAC:"2a:e3:ba:2c:60:ce", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:08:50.868056 containerd[1931]: 2025-09-12 22:08:50.851 [INFO][5034] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vl2mt" WorkloadEndpoint="ip--172--31--25--121-k8s-coredns--7c65d6cfc9--vl2mt-eth0" Sep 12 22:08:50.892582 systemd-networkd[1811]: cali459b3829441: Gained IPv6LL Sep 12 22:08:50.897499 systemd[1]: Started cri-containerd-f0b914143d4698cf21c11821b55de7cf21751f3a765d920a34e307da2aaa371a.scope - libcontainer container f0b914143d4698cf21c11821b55de7cf21751f3a765d920a34e307da2aaa371a. 
Sep 12 22:08:50.956297 systemd-networkd[1811]: cali87bcb234edf: Gained IPv6LL Sep 12 22:08:51.068777 containerd[1931]: time="2025-09-12T22:08:51.068687189Z" level=info msg="connecting to shim aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688" address="unix:///run/containerd/s/1c7d2b869bb0b46a6270e5bfb792768c17dc72c03619493ad65be8ca9053b3a9" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:08:51.120271 containerd[1931]: time="2025-09-12T22:08:51.119547137Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f6c548746-24zd8,Uid:6945a7e6-ff59-498b-a0da-e2121447b792,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42\"" Sep 12 22:08:51.147925 containerd[1931]: time="2025-09-12T22:08:51.147842801Z" level=info msg="StartContainer for \"f0b914143d4698cf21c11821b55de7cf21751f3a765d920a34e307da2aaa371a\" returns successfully" Sep 12 22:08:51.212624 systemd-networkd[1811]: cali7446dc41199: Gained IPv6LL Sep 12 22:08:51.237603 systemd[1]: Started cri-containerd-aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688.scope - libcontainer container aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688. Sep 12 22:08:51.264557 systemd-networkd[1811]: cali5094d884387: Link UP Sep 12 22:08:51.278735 systemd-networkd[1811]: cali5094d884387: Gained carrier Sep 12 22:08:51.347022 containerd[1931]: 2025-09-12 22:08:50.664 [INFO][5160] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--25--121-k8s-calico--kube--controllers--68c6947c97--2lbtl-eth0 calico-kube-controllers-68c6947c97- calico-system 39f9ed23-d9e9-4acb-bfad-64b584e9c682 816 0 2025-09-12 22:08:25 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:68c6947c97 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ip-172-31-25-121 calico-kube-controllers-68c6947c97-2lbtl eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali5094d884387 [] [] }} ContainerID="65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23" Namespace="calico-system" Pod="calico-kube-controllers-68c6947c97-2lbtl" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--kube--controllers--68c6947c97--2lbtl-" Sep 12 22:08:51.347022 containerd[1931]: 2025-09-12 22:08:50.664 [INFO][5160] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23" Namespace="calico-system" Pod="calico-kube-controllers-68c6947c97-2lbtl" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--kube--controllers--68c6947c97--2lbtl-eth0" Sep 12 22:08:51.347022 containerd[1931]: 2025-09-12 22:08:51.020 [INFO][5223] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23" HandleID="k8s-pod-network.65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23" Workload="ip--172--31--25--121-k8s-calico--kube--controllers--68c6947c97--2lbtl-eth0" Sep 12 22:08:51.347022 containerd[1931]: 2025-09-12 22:08:51.027 [INFO][5223] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23" HandleID="k8s-pod-network.65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23" 
Workload="ip--172--31--25--121-k8s-calico--kube--controllers--68c6947c97--2lbtl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400028f2d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-25-121", "pod":"calico-kube-controllers-68c6947c97-2lbtl", "timestamp":"2025-09-12 22:08:51.020550424 +0000 UTC"}, Hostname:"ip-172-31-25-121", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:08:51.347022 containerd[1931]: 2025-09-12 22:08:51.029 [INFO][5223] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:08:51.347022 containerd[1931]: 2025-09-12 22:08:51.029 [INFO][5223] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:08:51.347022 containerd[1931]: 2025-09-12 22:08:51.030 [INFO][5223] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-25-121' Sep 12 22:08:51.347022 containerd[1931]: 2025-09-12 22:08:51.084 [INFO][5223] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23" host="ip-172-31-25-121" Sep 12 22:08:51.347022 containerd[1931]: 2025-09-12 22:08:51.119 [INFO][5223] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-25-121" Sep 12 22:08:51.347022 containerd[1931]: 2025-09-12 22:08:51.143 [INFO][5223] ipam/ipam.go 511: Trying affinity for 192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:51.347022 containerd[1931]: 2025-09-12 22:08:51.153 [INFO][5223] ipam/ipam.go 158: Attempting to load block cidr=192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:51.347022 containerd[1931]: 2025-09-12 22:08:51.165 [INFO][5223] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.65.64/26 host="ip-172-31-25-121" Sep 12 22:08:51.347022 containerd[1931]: 2025-09-12 22:08:51.165 [INFO][5223] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.65.64/26 handle="k8s-pod-network.65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23" host="ip-172-31-25-121" Sep 12 22:08:51.347022 containerd[1931]: 2025-09-12 22:08:51.175 [INFO][5223] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23 Sep 12 22:08:51.347022 containerd[1931]: 2025-09-12 22:08:51.212 [INFO][5223] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.65.64/26 handle="k8s-pod-network.65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23" host="ip-172-31-25-121" Sep 12 22:08:51.347022 containerd[1931]: 2025-09-12 22:08:51.246 [INFO][5223] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.65.72/26] block=192.168.65.64/26 handle="k8s-pod-network.65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23" host="ip-172-31-25-121" Sep 12 22:08:51.347022 containerd[1931]: 2025-09-12 22:08:51.246 [INFO][5223] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.65.72/26] handle="k8s-pod-network.65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23" host="ip-172-31-25-121" Sep 12 22:08:51.347022 containerd[1931]: 2025-09-12 22:08:51.248 [INFO][5223] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 22:08:51.347022 containerd[1931]: 2025-09-12 22:08:51.248 [INFO][5223] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.65.72/26] IPv6=[] ContainerID="65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23" HandleID="k8s-pod-network.65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23" Workload="ip--172--31--25--121-k8s-calico--kube--controllers--68c6947c97--2lbtl-eth0" Sep 12 22:08:51.350852 containerd[1931]: 2025-09-12 22:08:51.255 [INFO][5160] cni-plugin/k8s.go 418: Populated endpoint ContainerID="65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23" Namespace="calico-system" Pod="calico-kube-controllers-68c6947c97-2lbtl" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--kube--controllers--68c6947c97--2lbtl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--25--121-k8s-calico--kube--controllers--68c6947c97--2lbtl-eth0", GenerateName:"calico-kube-controllers-68c6947c97-", Namespace:"calico-system", SelfLink:"", UID:"39f9ed23-d9e9-4acb-bfad-64b584e9c682", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 8, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"68c6947c97", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-25-121", ContainerID:"", Pod:"calico-kube-controllers-68c6947c97-2lbtl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.65.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali5094d884387", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:08:51.350852 containerd[1931]: 2025-09-12 22:08:51.255 [INFO][5160] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.65.72/32] ContainerID="65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23" Namespace="calico-system" Pod="calico-kube-controllers-68c6947c97-2lbtl" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--kube--controllers--68c6947c97--2lbtl-eth0" Sep 12 22:08:51.350852 containerd[1931]: 2025-09-12 22:08:51.255 [INFO][5160] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5094d884387 ContainerID="65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23" Namespace="calico-system" Pod="calico-kube-controllers-68c6947c97-2lbtl" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--kube--controllers--68c6947c97--2lbtl-eth0" Sep 12 22:08:51.350852 containerd[1931]: 2025-09-12 22:08:51.284 [INFO][5160] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23" Namespace="calico-system" Pod="calico-kube-controllers-68c6947c97-2lbtl" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--kube--controllers--68c6947c97--2lbtl-eth0" Sep 12 22:08:51.350852 containerd[1931]: 
2025-09-12 22:08:51.301 [INFO][5160] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23" Namespace="calico-system" Pod="calico-kube-controllers-68c6947c97-2lbtl" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--kube--controllers--68c6947c97--2lbtl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--25--121-k8s-calico--kube--controllers--68c6947c97--2lbtl-eth0", GenerateName:"calico-kube-controllers-68c6947c97-", Namespace:"calico-system", SelfLink:"", UID:"39f9ed23-d9e9-4acb-bfad-64b584e9c682", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 8, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"68c6947c97", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-25-121", ContainerID:"65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23", Pod:"calico-kube-controllers-68c6947c97-2lbtl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.65.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali5094d884387", MAC:"ba:ec:49:72:ce:05", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:08:51.350852 containerd[1931]: 2025-09-12 22:08:51.334 [INFO][5160] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23" Namespace="calico-system" Pod="calico-kube-controllers-68c6947c97-2lbtl" WorkloadEndpoint="ip--172--31--25--121-k8s-calico--kube--controllers--68c6947c97--2lbtl-eth0" Sep 12 22:08:51.425752 containerd[1931]: time="2025-09-12T22:08:51.425482302Z" level=info msg="connecting to shim 65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23" address="unix:///run/containerd/s/20e8816ddb5750854835c71bc0a77a78e70a54d26ff12ec7bf8700250556bebd" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:08:51.513191 containerd[1931]: time="2025-09-12T22:08:51.512965687Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-vl2mt,Uid:73e2ed8d-89f7-42c2-9717-a3369e2dbf0b,Namespace:kube-system,Attempt:0,} returns sandbox id \"aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688\"" Sep 12 22:08:51.525189 containerd[1931]: time="2025-09-12T22:08:51.525069535Z" level=info msg="CreateContainer within sandbox \"aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 22:08:51.532298 systemd-networkd[1811]: cali5be48fe4a19: Gained IPv6LL Sep 12 22:08:51.587524 systemd[1]: Started cri-containerd-65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23.scope - libcontainer container 
65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23. Sep 12 22:08:51.617996 containerd[1931]: time="2025-09-12T22:08:51.617910523Z" level=info msg="Container 174bfe3507a2ea1b259f156cf140ed6359f20effcd4a23c80fc30c2ee4bb636d: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:08:51.628654 kubelet[3235]: I0912 22:08:51.628562 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-lbgt4" podStartSLOduration=49.628542427 podStartE2EDuration="49.628542427s" podCreationTimestamp="2025-09-12 22:08:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:08:51.625453687 +0000 UTC m=+53.771117152" watchObservedRunningTime="2025-09-12 22:08:51.628542427 +0000 UTC m=+53.774205880" Sep 12 22:08:51.648387 containerd[1931]: time="2025-09-12T22:08:51.648284251Z" level=info msg="CreateContainer within sandbox \"aed7d60f44131f9d22df0a70d5fb35c6b6b40d65ef443027f89861cd92edd688\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"174bfe3507a2ea1b259f156cf140ed6359f20effcd4a23c80fc30c2ee4bb636d\"" Sep 12 22:08:51.649236 containerd[1931]: time="2025-09-12T22:08:51.649162759Z" level=info msg="StartContainer for \"174bfe3507a2ea1b259f156cf140ed6359f20effcd4a23c80fc30c2ee4bb636d\"" Sep 12 22:08:51.655896 containerd[1931]: time="2025-09-12T22:08:51.655793623Z" level=info msg="connecting to shim 174bfe3507a2ea1b259f156cf140ed6359f20effcd4a23c80fc30c2ee4bb636d" address="unix:///run/containerd/s/1c7d2b869bb0b46a6270e5bfb792768c17dc72c03619493ad65be8ca9053b3a9" protocol=ttrpc version=3 Sep 12 22:08:52.014579 systemd[1]: Started cri-containerd-174bfe3507a2ea1b259f156cf140ed6359f20effcd4a23c80fc30c2ee4bb636d.scope - libcontainer container 174bfe3507a2ea1b259f156cf140ed6359f20effcd4a23c80fc30c2ee4bb636d. Sep 12 22:08:52.093593 systemd-networkd[1811]: vxlan.calico: Link UP Sep 12 22:08:52.093606 systemd-networkd[1811]: vxlan.calico: Gained carrier Sep 12 22:08:52.144134 containerd[1931]: time="2025-09-12T22:08:52.144053670Z" level=info msg="StartContainer for \"174bfe3507a2ea1b259f156cf140ed6359f20effcd4a23c80fc30c2ee4bb636d\" returns successfully" Sep 12 22:08:52.300421 systemd-networkd[1811]: cali5094d884387: Gained IPv6LL Sep 12 22:08:52.301074 systemd-networkd[1811]: cali4dc57649482: Gained IPv6LL Sep 12 22:08:52.521569 systemd[1]: Started sshd@7-172.31.25.121:22-139.178.89.65:45722.service - OpenSSH per-connection server daemon (139.178.89.65:45722). Sep 12 22:08:52.536998 (udev-worker)[4575]: Network interface NamePolicy= disabled on kernel command line. 
Sep 12 22:08:52.684286 kubelet[3235]: I0912 22:08:52.683701 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-vl2mt" podStartSLOduration=50.683675661 podStartE2EDuration="50.683675661s" podCreationTimestamp="2025-09-12 22:08:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:08:52.680284437 +0000 UTC m=+54.825947914" watchObservedRunningTime="2025-09-12 22:08:52.683675661 +0000 UTC m=+54.829339126" Sep 12 22:08:52.794126 containerd[1931]: time="2025-09-12T22:08:52.792557757Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68c6947c97-2lbtl,Uid:39f9ed23-d9e9-4acb-bfad-64b584e9c682,Namespace:calico-system,Attempt:0,} returns sandbox id \"65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23\"" Sep 12 22:08:52.836205 sshd[5445]: Accepted publickey for core from 139.178.89.65 port 45722 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:08:52.839797 sshd-session[5445]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:08:52.857197 systemd-logind[1862]: New session 8 of user core. Sep 12 22:08:52.860767 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 12 22:08:53.277100 sshd[5462]: Connection closed by 139.178.89.65 port 45722 Sep 12 22:08:53.277897 sshd-session[5445]: pam_unix(sshd:session): session closed for user core Sep 12 22:08:53.292171 systemd[1]: sshd@7-172.31.25.121:22-139.178.89.65:45722.service: Deactivated successfully. Sep 12 22:08:53.299812 systemd[1]: session-8.scope: Deactivated successfully. Sep 12 22:08:53.312073 systemd-logind[1862]: Session 8 logged out. Waiting for processes to exit. Sep 12 22:08:53.314269 systemd-logind[1862]: Removed session 8. Sep 12 22:08:53.452289 systemd-networkd[1811]: vxlan.calico: Gained IPv6LL Sep 12 22:08:54.605007 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1565110490.mount: Deactivated successfully. 
Sep 12 22:08:55.519791 containerd[1931]: time="2025-09-12T22:08:55.518410571Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:55.520578 containerd[1931]: time="2025-09-12T22:08:55.520533731Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332" Sep 12 22:08:55.523246 containerd[1931]: time="2025-09-12T22:08:55.523183475Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:55.543737 containerd[1931]: time="2025-09-12T22:08:55.543653939Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:55.545860 containerd[1931]: time="2025-09-12T22:08:55.545814551Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 7.481551189s" Sep 12 22:08:55.546167 containerd[1931]: time="2025-09-12T22:08:55.546001367Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\"" Sep 12 22:08:55.551196 containerd[1931]: time="2025-09-12T22:08:55.550730219Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 22:08:55.553566 containerd[1931]: time="2025-09-12T22:08:55.553491707Z" level=info msg="CreateContainer within sandbox \"763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 12 22:08:55.572170 containerd[1931]: time="2025-09-12T22:08:55.571472903Z" level=info msg="Container bb80e8f825e9c5ed3de0315bb9396e97b4158d02397191475b1dea42a87b726c: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:08:55.594403 containerd[1931]: time="2025-09-12T22:08:55.594329939Z" level=info msg="CreateContainer within sandbox \"763b640fe1ccfef35493d199aa7bbf8231ab1888eab5fb92967a82bd189373c5\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"bb80e8f825e9c5ed3de0315bb9396e97b4158d02397191475b1dea42a87b726c\"" Sep 12 22:08:55.596387 containerd[1931]: time="2025-09-12T22:08:55.596286491Z" level=info msg="StartContainer for \"bb80e8f825e9c5ed3de0315bb9396e97b4158d02397191475b1dea42a87b726c\"" Sep 12 22:08:55.600452 containerd[1931]: time="2025-09-12T22:08:55.600373463Z" level=info msg="connecting to shim bb80e8f825e9c5ed3de0315bb9396e97b4158d02397191475b1dea42a87b726c" address="unix:///run/containerd/s/0293489615e03a104e0c527129ba982946c719c305238a3bdc7500bb6cffb457" protocol=ttrpc version=3 Sep 12 22:08:55.648451 systemd[1]: Started cri-containerd-bb80e8f825e9c5ed3de0315bb9396e97b4158d02397191475b1dea42a87b726c.scope - libcontainer container bb80e8f825e9c5ed3de0315bb9396e97b4158d02397191475b1dea42a87b726c. 
Sep 12 22:08:55.731328 containerd[1931]: time="2025-09-12T22:08:55.731147760Z" level=info msg="StartContainer for \"bb80e8f825e9c5ed3de0315bb9396e97b4158d02397191475b1dea42a87b726c\" returns successfully" Sep 12 22:08:56.495247 ntpd[2130]: Listen normally on 6 vxlan.calico 192.168.65.64:123 Sep 12 22:08:56.495372 ntpd[2130]: Listen normally on 7 califa67c3b8e59 [fe80::ecee:eeff:feee:eeee%4]:123 Sep 12 22:08:56.495824 ntpd[2130]: 12 Sep 22:08:56 ntpd[2130]: Listen normally on 6 vxlan.calico 192.168.65.64:123 Sep 12 22:08:56.495824 ntpd[2130]: 12 Sep 22:08:56 ntpd[2130]: Listen normally on 7 califa67c3b8e59 [fe80::ecee:eeff:feee:eeee%4]:123 Sep 12 22:08:56.495824 ntpd[2130]: 12 Sep 22:08:56 ntpd[2130]: Listen normally on 8 cali459b3829441 [fe80::ecee:eeff:feee:eeee%5]:123 Sep 12 22:08:56.495824 ntpd[2130]: 12 Sep 22:08:56 ntpd[2130]: Listen normally on 9 cali87bcb234edf [fe80::ecee:eeff:feee:eeee%6]:123 Sep 12 22:08:56.495824 ntpd[2130]: 12 Sep 22:08:56 ntpd[2130]: Listen normally on 10 cali7446dc41199 [fe80::ecee:eeff:feee:eeee%7]:123 Sep 12 22:08:56.495824 ntpd[2130]: 12 Sep 22:08:56 ntpd[2130]: Listen normally on 11 cali2168ad9bd85 [fe80::ecee:eeff:feee:eeee%8]:123 Sep 12 22:08:56.495824 ntpd[2130]: 12 Sep 22:08:56 ntpd[2130]: Listen normally on 12 cali5be48fe4a19 [fe80::ecee:eeff:feee:eeee%9]:123 Sep 12 22:08:56.495824 ntpd[2130]: 12 Sep 22:08:56 ntpd[2130]: Listen normally on 13 cali4dc57649482 [fe80::ecee:eeff:feee:eeee%10]:123 Sep 12 22:08:56.495824 ntpd[2130]: 12 Sep 22:08:56 ntpd[2130]: Listen normally on 14 cali5094d884387 [fe80::ecee:eeff:feee:eeee%11]:123 Sep 12 22:08:56.495824 ntpd[2130]: 12 Sep 22:08:56 ntpd[2130]: Listen normally on 15 vxlan.calico [fe80::6479:ddff:fea1:9dbd%12]:123 Sep 12 22:08:56.495421 ntpd[2130]: Listen normally on 8 cali459b3829441 [fe80::ecee:eeff:feee:eeee%5]:123 Sep 12 22:08:56.495466 ntpd[2130]: Listen normally on 9 cali87bcb234edf [fe80::ecee:eeff:feee:eeee%6]:123 Sep 12 22:08:56.495509 ntpd[2130]: Listen normally on 10 cali7446dc41199 [fe80::ecee:eeff:feee:eeee%7]:123 Sep 12 22:08:56.495552 ntpd[2130]: Listen normally on 11 cali2168ad9bd85 [fe80::ecee:eeff:feee:eeee%8]:123 Sep 12 22:08:56.495596 ntpd[2130]: Listen normally on 12 cali5be48fe4a19 [fe80::ecee:eeff:feee:eeee%9]:123 Sep 12 22:08:56.495645 ntpd[2130]: Listen normally on 13 cali4dc57649482 [fe80::ecee:eeff:feee:eeee%10]:123 Sep 12 22:08:56.495688 ntpd[2130]: Listen normally on 14 cali5094d884387 [fe80::ecee:eeff:feee:eeee%11]:123 Sep 12 22:08:56.495736 ntpd[2130]: Listen normally on 15 vxlan.calico [fe80::6479:ddff:fea1:9dbd%12]:123 Sep 12 22:08:56.993995 containerd[1931]: time="2025-09-12T22:08:56.993814190Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb80e8f825e9c5ed3de0315bb9396e97b4158d02397191475b1dea42a87b726c\" id:\"b95bba8b4c260d89fbc1b66ca7c9854098f2437ff2f80316c0b9557bf4e6a7e7\" pid:5577 exit_status:1 exited_at:{seconds:1757714936 nanos:992683226}" Sep 12 22:08:57.057167 containerd[1931]: time="2025-09-12T22:08:57.056805046Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:57.060479 containerd[1931]: time="2025-09-12T22:08:57.060403534Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 12 22:08:57.063399 containerd[1931]: time="2025-09-12T22:08:57.063323590Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:57.069138 containerd[1931]: time="2025-09-12T22:08:57.068813290Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:57.071250 containerd[1931]: time="2025-09-12T22:08:57.071177530Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 1.520386879s" Sep 12 22:08:57.071466 containerd[1931]: time="2025-09-12T22:08:57.071396710Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 12 22:08:57.073516 containerd[1931]: time="2025-09-12T22:08:57.073385134Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 12 22:08:57.084143 containerd[1931]: time="2025-09-12T22:08:57.084043762Z" level=info msg="CreateContainer within sandbox \"3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 22:08:57.108144 containerd[1931]: time="2025-09-12T22:08:57.104762099Z" level=info msg="Container 55854c2f1d424ffee97551c0c4253a9f7fb868c109d115a8f1fb28eb3e73ac57: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:08:57.123584 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount332512087.mount: Deactivated successfully. Sep 12 22:08:57.133069 containerd[1931]: time="2025-09-12T22:08:57.132971231Z" level=info msg="CreateContainer within sandbox \"3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"55854c2f1d424ffee97551c0c4253a9f7fb868c109d115a8f1fb28eb3e73ac57\"" Sep 12 22:08:57.135200 containerd[1931]: time="2025-09-12T22:08:57.135145715Z" level=info msg="StartContainer for \"55854c2f1d424ffee97551c0c4253a9f7fb868c109d115a8f1fb28eb3e73ac57\"" Sep 12 22:08:57.139833 containerd[1931]: time="2025-09-12T22:08:57.139760963Z" level=info msg="connecting to shim 55854c2f1d424ffee97551c0c4253a9f7fb868c109d115a8f1fb28eb3e73ac57" address="unix:///run/containerd/s/cd7fb35210df0660321f04366d646c42bcd3067f565c2afc775ab66896eda644" protocol=ttrpc version=3 Sep 12 22:08:57.193410 systemd[1]: Started cri-containerd-55854c2f1d424ffee97551c0c4253a9f7fb868c109d115a8f1fb28eb3e73ac57.scope - libcontainer container 55854c2f1d424ffee97551c0c4253a9f7fb868c109d115a8f1fb28eb3e73ac57. Sep 12 22:08:57.272910 containerd[1931]: time="2025-09-12T22:08:57.272656595Z" level=info msg="StartContainer for \"55854c2f1d424ffee97551c0c4253a9f7fb868c109d115a8f1fb28eb3e73ac57\" returns successfully" Sep 12 22:08:57.800680 containerd[1931]: time="2025-09-12T22:08:57.800546738Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb80e8f825e9c5ed3de0315bb9396e97b4158d02397191475b1dea42a87b726c\" id:\"cba88f801c78a1ad54db1e7fda4c8bb785d64b1bc516fcb5785ccaa49d622593\" pid:5638 exit_status:1 exited_at:{seconds:1757714937 nanos:800196146}" Sep 12 22:08:58.327090 systemd[1]: Started sshd@8-172.31.25.121:22-139.178.89.65:45736.service - OpenSSH per-connection server daemon (139.178.89.65:45736). 
Sep 12 22:08:58.555053 sshd[5652]: Accepted publickey for core from 139.178.89.65 port 45736 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:08:58.560713 sshd-session[5652]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:08:58.573605 systemd-logind[1862]: New session 9 of user core. Sep 12 22:08:58.579515 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 12 22:08:58.661505 containerd[1931]: time="2025-09-12T22:08:58.661433786Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:58.664926 containerd[1931]: time="2025-09-12T22:08:58.663735782Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 12 22:08:58.666399 containerd[1931]: time="2025-09-12T22:08:58.666330434Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:58.676149 containerd[1931]: time="2025-09-12T22:08:58.676038710Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.602314864s" Sep 12 22:08:58.676623 containerd[1931]: time="2025-09-12T22:08:58.676564286Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:08:58.677235 containerd[1931]: time="2025-09-12T22:08:58.677175266Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 12 22:08:58.680634 containerd[1931]: time="2025-09-12T22:08:58.680548970Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 22:08:58.683671 containerd[1931]: time="2025-09-12T22:08:58.683510858Z" level=info msg="CreateContainer within sandbox \"8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 12 22:08:58.718224 containerd[1931]: time="2025-09-12T22:08:58.714353283Z" level=info msg="Container 990e53c5de913e4cdd5d14eb0b8774f6acd6b070963ff08d613b5a0e362fdec8: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:08:58.748183 containerd[1931]: time="2025-09-12T22:08:58.748020987Z" level=info msg="CreateContainer within sandbox \"8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"990e53c5de913e4cdd5d14eb0b8774f6acd6b070963ff08d613b5a0e362fdec8\"" Sep 12 22:08:58.749132 containerd[1931]: time="2025-09-12T22:08:58.749058579Z" level=info msg="StartContainer for \"990e53c5de913e4cdd5d14eb0b8774f6acd6b070963ff08d613b5a0e362fdec8\"" Sep 12 22:08:58.772352 containerd[1931]: time="2025-09-12T22:08:58.772285203Z" level=info msg="connecting to shim 990e53c5de913e4cdd5d14eb0b8774f6acd6b070963ff08d613b5a0e362fdec8" address="unix:///run/containerd/s/57a25b3c20000973378a0b7b7212b5dc79e3fe36b354fc5449ddfbd7da379e51" protocol=ttrpc version=3 Sep 12 22:08:58.852417 systemd[1]: Started 
cri-containerd-990e53c5de913e4cdd5d14eb0b8774f6acd6b070963ff08d613b5a0e362fdec8.scope - libcontainer container 990e53c5de913e4cdd5d14eb0b8774f6acd6b070963ff08d613b5a0e362fdec8. Sep 12 22:08:58.982360 containerd[1931]: time="2025-09-12T22:08:58.982279372Z" level=info msg="StartContainer for \"990e53c5de913e4cdd5d14eb0b8774f6acd6b070963ff08d613b5a0e362fdec8\" returns successfully" Sep 12 22:08:59.025070 sshd[5659]: Connection closed by 139.178.89.65 port 45736 Sep 12 22:08:59.024930 sshd-session[5652]: pam_unix(sshd:session): session closed for user core Sep 12 22:08:59.033262 systemd[1]: sshd@8-172.31.25.121:22-139.178.89.65:45736.service: Deactivated successfully. Sep 12 22:08:59.041885 systemd[1]: session-9.scope: Deactivated successfully. Sep 12 22:08:59.046367 systemd-logind[1862]: Session 9 logged out. Waiting for processes to exit. Sep 12 22:08:59.050886 systemd-logind[1862]: Removed session 9. Sep 12 22:09:01.876195 containerd[1931]: time="2025-09-12T22:09:01.875440530Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:09:01.878056 containerd[1931]: time="2025-09-12T22:09:01.878014158Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 12 22:09:01.879352 containerd[1931]: time="2025-09-12T22:09:01.879314850Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:09:01.884827 containerd[1931]: time="2025-09-12T22:09:01.884727978Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:09:01.887178 containerd[1931]: time="2025-09-12T22:09:01.887056638Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 3.206443672s" Sep 12 22:09:01.888043 containerd[1931]: time="2025-09-12T22:09:01.887977458Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 12 22:09:01.889422 containerd[1931]: time="2025-09-12T22:09:01.889314954Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 22:09:01.894137 containerd[1931]: time="2025-09-12T22:09:01.892481514Z" level=info msg="CreateContainer within sandbox \"c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 22:09:01.904510 containerd[1931]: time="2025-09-12T22:09:01.904453410Z" level=info msg="Container cd376f2c36b507273020e83e823edee24325edc41448f824663e33b2eaf83f89: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:09:01.918257 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1394273028.mount: Deactivated successfully. 
Sep 12 22:09:01.924814 containerd[1931]: time="2025-09-12T22:09:01.924719262Z" level=info msg="CreateContainer within sandbox \"c8a783caceb0d3816255640d00fac48ac491bfff77acd6bfae40b14758e8f571\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"cd376f2c36b507273020e83e823edee24325edc41448f824663e33b2eaf83f89\"" Sep 12 22:09:01.926894 containerd[1931]: time="2025-09-12T22:09:01.926822274Z" level=info msg="StartContainer for \"cd376f2c36b507273020e83e823edee24325edc41448f824663e33b2eaf83f89\"" Sep 12 22:09:01.930515 containerd[1931]: time="2025-09-12T22:09:01.930430135Z" level=info msg="connecting to shim cd376f2c36b507273020e83e823edee24325edc41448f824663e33b2eaf83f89" address="unix:///run/containerd/s/1ac170d00990e94c86f6636a9a29d51cb59090e0d5f5acc07ff49a690fe4b4d6" protocol=ttrpc version=3 Sep 12 22:09:01.974489 systemd[1]: Started cri-containerd-cd376f2c36b507273020e83e823edee24325edc41448f824663e33b2eaf83f89.scope - libcontainer container cd376f2c36b507273020e83e823edee24325edc41448f824663e33b2eaf83f89. Sep 12 22:09:02.066200 containerd[1931]: time="2025-09-12T22:09:02.066105039Z" level=info msg="StartContainer for \"cd376f2c36b507273020e83e823edee24325edc41448f824663e33b2eaf83f89\" returns successfully" Sep 12 22:09:02.193990 containerd[1931]: time="2025-09-12T22:09:02.193044544Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:09:02.193990 containerd[1931]: time="2025-09-12T22:09:02.193770220Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 12 22:09:02.202244 containerd[1931]: time="2025-09-12T22:09:02.202165984Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 312.069386ms" Sep 12 22:09:02.202244 containerd[1931]: time="2025-09-12T22:09:02.202240972Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 12 22:09:02.206421 containerd[1931]: time="2025-09-12T22:09:02.205780888Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 12 22:09:02.211478 containerd[1931]: time="2025-09-12T22:09:02.210955492Z" level=info msg="CreateContainer within sandbox \"a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 22:09:02.224413 containerd[1931]: time="2025-09-12T22:09:02.224357884Z" level=info msg="Container e51abe77402cda04ff843fa9732479456a069fc3a7187e449371b46747270f2d: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:09:02.248467 containerd[1931]: time="2025-09-12T22:09:02.248391724Z" level=info msg="CreateContainer within sandbox \"a6650fbb70f00d8dcbd1f32072f4bd0331720bf6f906db75822c5cf081605e42\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e51abe77402cda04ff843fa9732479456a069fc3a7187e449371b46747270f2d\"" Sep 12 22:09:02.250266 containerd[1931]: time="2025-09-12T22:09:02.250204552Z" level=info msg="StartContainer for \"e51abe77402cda04ff843fa9732479456a069fc3a7187e449371b46747270f2d\"" Sep 12 
22:09:02.260207 containerd[1931]: time="2025-09-12T22:09:02.260135260Z" level=info msg="connecting to shim e51abe77402cda04ff843fa9732479456a069fc3a7187e449371b46747270f2d" address="unix:///run/containerd/s/bcf13632bd065c19b429dc1e71d68e6c08a10ba2af88c97ea559404c47f123ba" protocol=ttrpc version=3 Sep 12 22:09:02.306452 systemd[1]: Started cri-containerd-e51abe77402cda04ff843fa9732479456a069fc3a7187e449371b46747270f2d.scope - libcontainer container e51abe77402cda04ff843fa9732479456a069fc3a7187e449371b46747270f2d. Sep 12 22:09:02.410014 containerd[1931]: time="2025-09-12T22:09:02.409950521Z" level=info msg="StartContainer for \"e51abe77402cda04ff843fa9732479456a069fc3a7187e449371b46747270f2d\" returns successfully" Sep 12 22:09:02.726289 kubelet[3235]: I0912 22:09:02.724906 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-htbqv" podStartSLOduration=31.237199341 podStartE2EDuration="38.724882998s" podCreationTimestamp="2025-09-12 22:08:24 +0000 UTC" firstStartedPulling="2025-09-12 22:08:48.062494034 +0000 UTC m=+50.208157475" lastFinishedPulling="2025-09-12 22:08:55.550177679 +0000 UTC m=+57.695841132" observedRunningTime="2025-09-12 22:08:56.693898404 +0000 UTC m=+58.839561953" watchObservedRunningTime="2025-09-12 22:09:02.724882998 +0000 UTC m=+64.870546451" Sep 12 22:09:02.760876 kubelet[3235]: I0912 22:09:02.760732 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-f6c548746-24zd8" podStartSLOduration=33.685683908 podStartE2EDuration="44.760708003s" podCreationTimestamp="2025-09-12 22:08:18 +0000 UTC" firstStartedPulling="2025-09-12 22:08:51.130564061 +0000 UTC m=+53.276227514" lastFinishedPulling="2025-09-12 22:09:02.205588156 +0000 UTC m=+64.351251609" observedRunningTime="2025-09-12 22:09:02.759390739 +0000 UTC m=+64.905054204" watchObservedRunningTime="2025-09-12 22:09:02.760708003 +0000 UTC m=+64.906371456" Sep 12 22:09:02.763084 kubelet[3235]: I0912 22:09:02.762931 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-f6c548746-mwv4c" podStartSLOduration=33.148954818 podStartE2EDuration="44.762908155s" podCreationTimestamp="2025-09-12 22:08:18 +0000 UTC" firstStartedPulling="2025-09-12 22:08:50.275053901 +0000 UTC m=+52.420717354" lastFinishedPulling="2025-09-12 22:09:01.889007238 +0000 UTC m=+64.034670691" observedRunningTime="2025-09-12 22:09:02.72778737 +0000 UTC m=+64.873450847" watchObservedRunningTime="2025-09-12 22:09:02.762908155 +0000 UTC m=+64.908571608" Sep 12 22:09:03.717148 kubelet[3235]: I0912 22:09:03.715332 3235 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:09:04.067672 systemd[1]: Started sshd@9-172.31.25.121:22-139.178.89.65:57064.service - OpenSSH per-connection server daemon (139.178.89.65:57064). Sep 12 22:09:04.345556 sshd[5802]: Accepted publickey for core from 139.178.89.65 port 57064 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:09:04.352427 sshd-session[5802]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:04.370382 systemd-logind[1862]: New session 10 of user core. Sep 12 22:09:04.380507 systemd[1]: Started session-10.scope - Session 10 of User core. 
Sep 12 22:09:04.720603 kubelet[3235]: I0912 22:09:04.720391 3235 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:09:04.799149 sshd[5809]: Connection closed by 139.178.89.65 port 57064 Sep 12 22:09:04.800384 sshd-session[5802]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:04.813025 systemd[1]: sshd@9-172.31.25.121:22-139.178.89.65:57064.service: Deactivated successfully. Sep 12 22:09:04.824591 systemd[1]: session-10.scope: Deactivated successfully. Sep 12 22:09:04.829024 systemd-logind[1862]: Session 10 logged out. Waiting for processes to exit. Sep 12 22:09:04.859094 systemd[1]: Started sshd@10-172.31.25.121:22-139.178.89.65:57074.service - OpenSSH per-connection server daemon (139.178.89.65:57074). Sep 12 22:09:04.866194 systemd-logind[1862]: Removed session 10. Sep 12 22:09:05.128173 sshd[5822]: Accepted publickey for core from 139.178.89.65 port 57074 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:09:05.135779 sshd-session[5822]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:05.155189 systemd-logind[1862]: New session 11 of user core. Sep 12 22:09:05.161535 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 12 22:09:05.814163 sshd[5825]: Connection closed by 139.178.89.65 port 57074 Sep 12 22:09:05.815035 sshd-session[5822]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:05.828482 systemd[1]: session-11.scope: Deactivated successfully. Sep 12 22:09:05.830522 systemd[1]: sshd@10-172.31.25.121:22-139.178.89.65:57074.service: Deactivated successfully. Sep 12 22:09:05.878400 systemd-logind[1862]: Session 11 logged out. Waiting for processes to exit. Sep 12 22:09:05.881286 systemd[1]: Started sshd@11-172.31.25.121:22-139.178.89.65:57090.service - OpenSSH per-connection server daemon (139.178.89.65:57090). Sep 12 22:09:05.894507 systemd-logind[1862]: Removed session 11. Sep 12 22:09:06.159235 sshd[5835]: Accepted publickey for core from 139.178.89.65 port 57090 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:09:06.169908 sshd-session[5835]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:06.199505 systemd-logind[1862]: New session 12 of user core. Sep 12 22:09:06.206516 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 12 22:09:06.710420 sshd[5850]: Connection closed by 139.178.89.65 port 57090 Sep 12 22:09:06.714457 sshd-session[5835]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:06.733819 systemd[1]: sshd@11-172.31.25.121:22-139.178.89.65:57090.service: Deactivated successfully. Sep 12 22:09:06.743513 systemd[1]: session-12.scope: Deactivated successfully. Sep 12 22:09:06.750964 systemd-logind[1862]: Session 12 logged out. Waiting for processes to exit. Sep 12 22:09:06.757921 systemd-logind[1862]: Removed session 12. 
Sep 12 22:09:06.971322 containerd[1931]: time="2025-09-12T22:09:06.970942212Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:09:06.973346 containerd[1931]: time="2025-09-12T22:09:06.973270200Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 12 22:09:06.978093 containerd[1931]: time="2025-09-12T22:09:06.978038712Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:09:06.985863 containerd[1931]: time="2025-09-12T22:09:06.985803672Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:09:06.991180 containerd[1931]: time="2025-09-12T22:09:06.991098600Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 4.784326572s" Sep 12 22:09:06.991430 containerd[1931]: time="2025-09-12T22:09:06.991396404Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 12 22:09:06.999207 containerd[1931]: time="2025-09-12T22:09:06.999089352Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 22:09:07.045213 containerd[1931]: time="2025-09-12T22:09:07.043609016Z" level=info msg="CreateContainer within sandbox \"65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 12 22:09:07.057886 containerd[1931]: time="2025-09-12T22:09:07.057825812Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb80e8f825e9c5ed3de0315bb9396e97b4158d02397191475b1dea42a87b726c\" id:\"b5d48bd6fc93af44f6dddd0a73ceb86dedc63aab8ced4f08d398531104f9a8f5\" pid:5852 exited_at:{seconds:1757714947 nanos:56970608}" Sep 12 22:09:07.076161 containerd[1931]: time="2025-09-12T22:09:07.068204108Z" level=info msg="Container 226e2e7740301078e7ffc163513ef08389f6478e27588bf6c04c9ecc8bc2590f: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:09:07.078140 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2443997047.mount: Deactivated successfully. 
Sep 12 22:09:07.093052 containerd[1931]: time="2025-09-12T22:09:07.092977148Z" level=info msg="CreateContainer within sandbox \"65572b9fa1e891f50cf0ab35c397dd6df1c91a1e714d511b0aaa7d79fc4b9e23\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"226e2e7740301078e7ffc163513ef08389f6478e27588bf6c04c9ecc8bc2590f\"" Sep 12 22:09:07.095235 containerd[1931]: time="2025-09-12T22:09:07.095157092Z" level=info msg="StartContainer for \"226e2e7740301078e7ffc163513ef08389f6478e27588bf6c04c9ecc8bc2590f\"" Sep 12 22:09:07.102615 containerd[1931]: time="2025-09-12T22:09:07.102475760Z" level=info msg="connecting to shim 226e2e7740301078e7ffc163513ef08389f6478e27588bf6c04c9ecc8bc2590f" address="unix:///run/containerd/s/20e8816ddb5750854835c71bc0a77a78e70a54d26ff12ec7bf8700250556bebd" protocol=ttrpc version=3 Sep 12 22:09:07.173902 systemd[1]: Started cri-containerd-226e2e7740301078e7ffc163513ef08389f6478e27588bf6c04c9ecc8bc2590f.scope - libcontainer container 226e2e7740301078e7ffc163513ef08389f6478e27588bf6c04c9ecc8bc2590f. Sep 12 22:09:07.517609 containerd[1931]: time="2025-09-12T22:09:07.517545106Z" level=info msg="StartContainer for \"226e2e7740301078e7ffc163513ef08389f6478e27588bf6c04c9ecc8bc2590f\" returns successfully" Sep 12 22:09:07.808610 kubelet[3235]: I0912 22:09:07.807969 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-68c6947c97-2lbtl" podStartSLOduration=28.611147073 podStartE2EDuration="42.807943716s" podCreationTimestamp="2025-09-12 22:08:25 +0000 UTC" firstStartedPulling="2025-09-12 22:08:52.797712621 +0000 UTC m=+54.943376062" lastFinishedPulling="2025-09-12 22:09:06.994509252 +0000 UTC m=+69.140172705" observedRunningTime="2025-09-12 22:09:07.806090316 +0000 UTC m=+69.951753793" watchObservedRunningTime="2025-09-12 22:09:07.807943716 +0000 UTC m=+69.953607169" Sep 12 22:09:07.979365 containerd[1931]: time="2025-09-12T22:09:07.979031077Z" level=info msg="TaskExit event in podsandbox handler container_id:\"226e2e7740301078e7ffc163513ef08389f6478e27588bf6c04c9ecc8bc2590f\" id:\"be312336b268af3e8c80579003d3aa36894f81f81de22133f79283dfc1718b50\" pid:5932 exited_at:{seconds:1757714947 nanos:978528097}" Sep 12 22:09:09.386632 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1842787084.mount: Deactivated successfully. 
Sep 12 22:09:09.407469 containerd[1931]: time="2025-09-12T22:09:09.407303388Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:09:09.409869 containerd[1931]: time="2025-09-12T22:09:09.409802856Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 12 22:09:09.410808 containerd[1931]: time="2025-09-12T22:09:09.410734632Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:09:09.415706 containerd[1931]: time="2025-09-12T22:09:09.415620384Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:09:09.417146 containerd[1931]: time="2025-09-12T22:09:09.416889360Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 2.417522484s" Sep 12 22:09:09.417146 containerd[1931]: time="2025-09-12T22:09:09.416946744Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 12 22:09:09.420176 containerd[1931]: time="2025-09-12T22:09:09.420085008Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 12 22:09:09.423456 containerd[1931]: time="2025-09-12T22:09:09.423175116Z" level=info msg="CreateContainer within sandbox \"3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 22:09:09.437633 containerd[1931]: time="2025-09-12T22:09:09.437387340Z" level=info msg="Container febed6ab59b1341a483a49718a614a4ca8947e5dfc285baa87ab56af288f21f4: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:09:09.459422 containerd[1931]: time="2025-09-12T22:09:09.459339792Z" level=info msg="CreateContainer within sandbox \"3721c011eeb8a845bf1cc9560074a7a6c4c02f2d582468a2b5c0472a0640faa9\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"febed6ab59b1341a483a49718a614a4ca8947e5dfc285baa87ab56af288f21f4\"" Sep 12 22:09:09.461399 containerd[1931]: time="2025-09-12T22:09:09.461351088Z" level=info msg="StartContainer for \"febed6ab59b1341a483a49718a614a4ca8947e5dfc285baa87ab56af288f21f4\"" Sep 12 22:09:09.465544 containerd[1931]: time="2025-09-12T22:09:09.465480732Z" level=info msg="connecting to shim febed6ab59b1341a483a49718a614a4ca8947e5dfc285baa87ab56af288f21f4" address="unix:///run/containerd/s/cd7fb35210df0660321f04366d646c42bcd3067f565c2afc775ab66896eda644" protocol=ttrpc version=3 Sep 12 22:09:09.513412 systemd[1]: Started cri-containerd-febed6ab59b1341a483a49718a614a4ca8947e5dfc285baa87ab56af288f21f4.scope - libcontainer container febed6ab59b1341a483a49718a614a4ca8947e5dfc285baa87ab56af288f21f4. 
Sep 12 22:09:09.613980 containerd[1931]: time="2025-09-12T22:09:09.613664833Z" level=info msg="StartContainer for \"febed6ab59b1341a483a49718a614a4ca8947e5dfc285baa87ab56af288f21f4\" returns successfully" Sep 12 22:09:11.031757 containerd[1931]: time="2025-09-12T22:09:11.031682052Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:09:11.035283 containerd[1931]: time="2025-09-12T22:09:11.035212920Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208" Sep 12 22:09:11.037384 containerd[1931]: time="2025-09-12T22:09:11.037324476Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:09:11.042936 containerd[1931]: time="2025-09-12T22:09:11.042845616Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:09:11.044143 containerd[1931]: time="2025-09-12T22:09:11.044067876Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 1.623807656s" Sep 12 22:09:11.044422 containerd[1931]: time="2025-09-12T22:09:11.044139036Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\"" Sep 12 22:09:11.051099 containerd[1931]: time="2025-09-12T22:09:11.050542176Z" level=info msg="CreateContainer within sandbox \"8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 12 22:09:11.070244 containerd[1931]: time="2025-09-12T22:09:11.070022988Z" level=info msg="Container 421ace403db2c842bbdc38b2a4d6a12bf2254e74ca4dfede18c786fbd7f4be56: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:09:11.091712 containerd[1931]: time="2025-09-12T22:09:11.091651296Z" level=info msg="CreateContainer within sandbox \"8edcf1b33ff41c7a99f16ab879884e268b037f7949d29553d55ca30d0269f9dc\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"421ace403db2c842bbdc38b2a4d6a12bf2254e74ca4dfede18c786fbd7f4be56\"" Sep 12 22:09:11.092748 containerd[1931]: time="2025-09-12T22:09:11.092649084Z" level=info msg="StartContainer for \"421ace403db2c842bbdc38b2a4d6a12bf2254e74ca4dfede18c786fbd7f4be56\"" Sep 12 22:09:11.096143 containerd[1931]: time="2025-09-12T22:09:11.095975256Z" level=info msg="connecting to shim 421ace403db2c842bbdc38b2a4d6a12bf2254e74ca4dfede18c786fbd7f4be56" address="unix:///run/containerd/s/57a25b3c20000973378a0b7b7212b5dc79e3fe36b354fc5449ddfbd7da379e51" protocol=ttrpc version=3 Sep 12 22:09:11.154439 systemd[1]: Started cri-containerd-421ace403db2c842bbdc38b2a4d6a12bf2254e74ca4dfede18c786fbd7f4be56.scope - libcontainer container 421ace403db2c842bbdc38b2a4d6a12bf2254e74ca4dfede18c786fbd7f4be56. 
Sep 12 22:09:11.246641 containerd[1931]: time="2025-09-12T22:09:11.246554533Z" level=info msg="StartContainer for \"421ace403db2c842bbdc38b2a4d6a12bf2254e74ca4dfede18c786fbd7f4be56\" returns successfully" Sep 12 22:09:11.313510 kubelet[3235]: I0912 22:09:11.313338 3235 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 12 22:09:11.315180 kubelet[3235]: I0912 22:09:11.314864 3235 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 12 22:09:11.752595 systemd[1]: Started sshd@12-172.31.25.121:22-139.178.89.65:58496.service - OpenSSH per-connection server daemon (139.178.89.65:58496). Sep 12 22:09:11.851143 kubelet[3235]: I0912 22:09:11.850891 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-npwnh" podStartSLOduration=25.720680395 podStartE2EDuration="46.850868416s" podCreationTimestamp="2025-09-12 22:08:25 +0000 UTC" firstStartedPulling="2025-09-12 22:08:49.916256347 +0000 UTC m=+52.061919800" lastFinishedPulling="2025-09-12 22:09:11.046444368 +0000 UTC m=+73.192107821" observedRunningTime="2025-09-12 22:09:11.849384484 +0000 UTC m=+73.995048057" watchObservedRunningTime="2025-09-12 22:09:11.850868416 +0000 UTC m=+73.996531905" Sep 12 22:09:11.851778 kubelet[3235]: I0912 22:09:11.851305 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-84bc8f6f49-mc9wd" podStartSLOduration=6.081741906 podStartE2EDuration="25.851287648s" podCreationTimestamp="2025-09-12 22:08:46 +0000 UTC" firstStartedPulling="2025-09-12 22:08:49.649405758 +0000 UTC m=+51.795069211" lastFinishedPulling="2025-09-12 22:09:09.4189515 +0000 UTC m=+71.564614953" observedRunningTime="2025-09-12 22:09:09.798958814 +0000 UTC m=+71.944622267" watchObservedRunningTime="2025-09-12 22:09:11.851287648 +0000 UTC m=+73.996951101" Sep 12 22:09:11.979549 sshd[6028]: Accepted publickey for core from 139.178.89.65 port 58496 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:09:11.984256 sshd-session[6028]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:11.997462 systemd-logind[1862]: New session 13 of user core. Sep 12 22:09:12.006455 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 12 22:09:12.316104 sshd[6031]: Connection closed by 139.178.89.65 port 58496 Sep 12 22:09:12.317286 sshd-session[6028]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:12.327761 systemd-logind[1862]: Session 13 logged out. Waiting for processes to exit. Sep 12 22:09:12.328968 systemd[1]: sshd@12-172.31.25.121:22-139.178.89.65:58496.service: Deactivated successfully. Sep 12 22:09:12.333078 systemd[1]: session-13.scope: Deactivated successfully. Sep 12 22:09:12.339020 systemd-logind[1862]: Removed session 13. Sep 12 22:09:12.984466 containerd[1931]: time="2025-09-12T22:09:12.984383981Z" level=info msg="TaskExit event in podsandbox handler container_id:\"047c3012e98a295a31b96022eeaf3fb2de10c5df5e52d84d1b1534d926fb4b07\" id:\"8ab47349ba4ab9b60045aff0100defab275b1ad6ac46300b4b4726427179aac7\" pid:6056 exit_status:1 exited_at:{seconds:1757714952 nanos:983527853}" Sep 12 22:09:17.357575 systemd[1]: Started sshd@13-172.31.25.121:22-139.178.89.65:58512.service - OpenSSH per-connection server daemon (139.178.89.65:58512). 
Sep 12 22:09:17.552426 sshd[6076]: Accepted publickey for core from 139.178.89.65 port 58512 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:09:17.555532 sshd-session[6076]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:17.563841 systemd-logind[1862]: New session 14 of user core. Sep 12 22:09:17.570407 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 12 22:09:17.824881 sshd[6079]: Connection closed by 139.178.89.65 port 58512 Sep 12 22:09:17.825924 sshd-session[6076]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:17.833573 systemd[1]: sshd@13-172.31.25.121:22-139.178.89.65:58512.service: Deactivated successfully. Sep 12 22:09:17.839671 systemd[1]: session-14.scope: Deactivated successfully. Sep 12 22:09:17.843015 systemd-logind[1862]: Session 14 logged out. Waiting for processes to exit. Sep 12 22:09:17.846806 systemd-logind[1862]: Removed session 14. Sep 12 22:09:22.867138 systemd[1]: Started sshd@14-172.31.25.121:22-139.178.89.65:38184.service - OpenSSH per-connection server daemon (139.178.89.65:38184). Sep 12 22:09:23.087663 sshd[6100]: Accepted publickey for core from 139.178.89.65 port 38184 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:09:23.091925 sshd-session[6100]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:23.104691 systemd-logind[1862]: New session 15 of user core. Sep 12 22:09:23.114437 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 12 22:09:23.416940 sshd[6103]: Connection closed by 139.178.89.65 port 38184 Sep 12 22:09:23.417478 sshd-session[6100]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:23.428578 systemd[1]: sshd@14-172.31.25.121:22-139.178.89.65:38184.service: Deactivated successfully. Sep 12 22:09:23.438313 systemd[1]: session-15.scope: Deactivated successfully. Sep 12 22:09:23.442076 systemd-logind[1862]: Session 15 logged out. Waiting for processes to exit. Sep 12 22:09:23.464713 systemd[1]: Started sshd@15-172.31.25.121:22-139.178.89.65:38200.service - OpenSSH per-connection server daemon (139.178.89.65:38200). Sep 12 22:09:23.470391 systemd-logind[1862]: Removed session 15. Sep 12 22:09:23.674681 sshd[6115]: Accepted publickey for core from 139.178.89.65 port 38200 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:09:23.679017 sshd-session[6115]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:23.692036 systemd-logind[1862]: New session 16 of user core. Sep 12 22:09:23.700817 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 12 22:09:24.276396 sshd[6118]: Connection closed by 139.178.89.65 port 38200 Sep 12 22:09:24.277949 sshd-session[6115]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:24.284585 systemd[1]: sshd@15-172.31.25.121:22-139.178.89.65:38200.service: Deactivated successfully. Sep 12 22:09:24.289583 systemd[1]: session-16.scope: Deactivated successfully. Sep 12 22:09:24.291975 systemd-logind[1862]: Session 16 logged out. Waiting for processes to exit. Sep 12 22:09:24.295723 systemd-logind[1862]: Removed session 16. Sep 12 22:09:24.316916 systemd[1]: Started sshd@16-172.31.25.121:22-139.178.89.65:38202.service - OpenSSH per-connection server daemon (139.178.89.65:38202). 
Sep 12 22:09:24.515159 sshd[6128]: Accepted publickey for core from 139.178.89.65 port 38202 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:09:24.517778 sshd-session[6128]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:24.526671 systemd-logind[1862]: New session 17 of user core. Sep 12 22:09:24.538450 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 12 22:09:28.748261 sshd[6131]: Connection closed by 139.178.89.65 port 38202 Sep 12 22:09:28.749490 sshd-session[6128]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:28.760894 systemd[1]: sshd@16-172.31.25.121:22-139.178.89.65:38202.service: Deactivated successfully. Sep 12 22:09:28.772005 systemd[1]: session-17.scope: Deactivated successfully. Sep 12 22:09:28.774410 systemd[1]: session-17.scope: Consumed 1.122s CPU time, 73.6M memory peak. Sep 12 22:09:28.776567 systemd-logind[1862]: Session 17 logged out. Waiting for processes to exit. Sep 12 22:09:28.810576 systemd[1]: Started sshd@17-172.31.25.121:22-139.178.89.65:38216.service - OpenSSH per-connection server daemon (139.178.89.65:38216). Sep 12 22:09:28.817080 systemd-logind[1862]: Removed session 17. Sep 12 22:09:29.026489 sshd[6148]: Accepted publickey for core from 139.178.89.65 port 38216 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:09:29.028860 sshd-session[6148]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:29.040868 systemd-logind[1862]: New session 18 of user core. Sep 12 22:09:29.050544 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 12 22:09:29.768870 sshd[6151]: Connection closed by 139.178.89.65 port 38216 Sep 12 22:09:29.768744 sshd-session[6148]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:29.781431 systemd[1]: sshd@17-172.31.25.121:22-139.178.89.65:38216.service: Deactivated successfully. Sep 12 22:09:29.788303 systemd[1]: session-18.scope: Deactivated successfully. Sep 12 22:09:29.793506 systemd-logind[1862]: Session 18 logged out. Waiting for processes to exit. Sep 12 22:09:29.825042 systemd[1]: Started sshd@18-172.31.25.121:22-139.178.89.65:38222.service - OpenSSH per-connection server daemon (139.178.89.65:38222). Sep 12 22:09:29.829535 systemd-logind[1862]: Removed session 18. Sep 12 22:09:30.035420 kubelet[3235]: I0912 22:09:30.034734 3235 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:09:30.060216 sshd[6161]: Accepted publickey for core from 139.178.89.65 port 38222 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:09:30.064722 sshd-session[6161]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:30.081320 systemd-logind[1862]: New session 19 of user core. Sep 12 22:09:30.091828 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 12 22:09:30.347181 sshd[6164]: Connection closed by 139.178.89.65 port 38222 Sep 12 22:09:30.348283 sshd-session[6161]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:30.357228 systemd[1]: sshd@18-172.31.25.121:22-139.178.89.65:38222.service: Deactivated successfully. Sep 12 22:09:30.362952 systemd[1]: session-19.scope: Deactivated successfully. Sep 12 22:09:30.368774 systemd-logind[1862]: Session 19 logged out. Waiting for processes to exit. Sep 12 22:09:30.372284 systemd-logind[1862]: Removed session 19. 
Sep 12 22:09:30.434001 containerd[1931]: time="2025-09-12T22:09:30.433950272Z" level=info msg="TaskExit event in podsandbox handler container_id:\"226e2e7740301078e7ffc163513ef08389f6478e27588bf6c04c9ecc8bc2590f\" id:\"a99f0ff41bea550ad369b0ea54910ced11a05a39b421765db36eed663fdc34b5\" pid:6189 exited_at:{seconds:1757714970 nanos:433547120}" Sep 12 22:09:35.390786 systemd[1]: Started sshd@19-172.31.25.121:22-139.178.89.65:45356.service - OpenSSH per-connection server daemon (139.178.89.65:45356). Sep 12 22:09:35.589785 sshd[6212]: Accepted publickey for core from 139.178.89.65 port 45356 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:09:35.592753 sshd-session[6212]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:35.601272 systemd-logind[1862]: New session 20 of user core. Sep 12 22:09:35.614421 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 12 22:09:35.863197 sshd[6215]: Connection closed by 139.178.89.65 port 45356 Sep 12 22:09:35.864228 sshd-session[6212]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:35.869899 systemd[1]: sshd@19-172.31.25.121:22-139.178.89.65:45356.service: Deactivated successfully. Sep 12 22:09:35.873529 systemd[1]: session-20.scope: Deactivated successfully. Sep 12 22:09:35.878631 systemd-logind[1862]: Session 20 logged out. Waiting for processes to exit. Sep 12 22:09:35.880652 systemd-logind[1862]: Removed session 20. Sep 12 22:09:36.155060 containerd[1931]: time="2025-09-12T22:09:36.154846716Z" level=info msg="TaskExit event in podsandbox handler container_id:\"226e2e7740301078e7ffc163513ef08389f6478e27588bf6c04c9ecc8bc2590f\" id:\"e7baa482ae990c535f6485d91c91cee390b909982d6000da416ef5b8f4ac9ede\" pid:6236 exited_at:{seconds:1757714976 nanos:149550804}" Sep 12 22:09:36.272370 containerd[1931]: time="2025-09-12T22:09:36.272282497Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb80e8f825e9c5ed3de0315bb9396e97b4158d02397191475b1dea42a87b726c\" id:\"727475df19d6b916c4f1a92292b87c7d94407df6900e9e701839f6f49b3d0cf3\" pid:6257 exited_at:{seconds:1757714976 nanos:271064005}" Sep 12 22:09:39.212446 containerd[1931]: time="2025-09-12T22:09:39.212389120Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb80e8f825e9c5ed3de0315bb9396e97b4158d02397191475b1dea42a87b726c\" id:\"ded924545bb8db4e0e5384f64028fd876186cb64664c9279a23647f5f97b658c\" pid:6284 exited_at:{seconds:1757714979 nanos:211675336}" Sep 12 22:09:40.904542 systemd[1]: Started sshd@20-172.31.25.121:22-139.178.89.65:43312.service - OpenSSH per-connection server daemon (139.178.89.65:43312). Sep 12 22:09:41.109979 sshd[6295]: Accepted publickey for core from 139.178.89.65 port 43312 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:09:41.112556 sshd-session[6295]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:41.122889 systemd-logind[1862]: New session 21 of user core. Sep 12 22:09:41.130368 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 12 22:09:41.383194 sshd[6298]: Connection closed by 139.178.89.65 port 43312 Sep 12 22:09:41.383045 sshd-session[6295]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:41.389707 systemd[1]: sshd@20-172.31.25.121:22-139.178.89.65:43312.service: Deactivated successfully. Sep 12 22:09:41.390698 systemd-logind[1862]: Session 21 logged out. Waiting for processes to exit. 
Sep 12 22:09:41.395623 systemd[1]: session-21.scope: Deactivated successfully. Sep 12 22:09:41.400720 systemd-logind[1862]: Removed session 21. Sep 12 22:09:42.957840 containerd[1931]: time="2025-09-12T22:09:42.957772330Z" level=info msg="TaskExit event in podsandbox handler container_id:\"047c3012e98a295a31b96022eeaf3fb2de10c5df5e52d84d1b1534d926fb4b07\" id:\"fcb3b743550fd01168c3ecd8b4790b557f5356893b2012ffe9426633393c6842\" pid:6321 exited_at:{seconds:1757714982 nanos:956843002}" Sep 12 22:09:46.424234 systemd[1]: Started sshd@21-172.31.25.121:22-139.178.89.65:43326.service - OpenSSH per-connection server daemon (139.178.89.65:43326). Sep 12 22:09:46.624141 sshd[6333]: Accepted publickey for core from 139.178.89.65 port 43326 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:09:46.629311 sshd-session[6333]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:46.642461 systemd-logind[1862]: New session 22 of user core. Sep 12 22:09:46.648578 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 12 22:09:46.953183 sshd[6336]: Connection closed by 139.178.89.65 port 43326 Sep 12 22:09:46.953882 sshd-session[6333]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:46.967038 systemd-logind[1862]: Session 22 logged out. Waiting for processes to exit. Sep 12 22:09:46.968413 systemd[1]: sshd@21-172.31.25.121:22-139.178.89.65:43326.service: Deactivated successfully. Sep 12 22:09:46.978936 systemd[1]: session-22.scope: Deactivated successfully. Sep 12 22:09:46.988420 systemd-logind[1862]: Removed session 22. Sep 12 22:09:52.002795 systemd[1]: Started sshd@22-172.31.25.121:22-139.178.89.65:35532.service - OpenSSH per-connection server daemon (139.178.89.65:35532). Sep 12 22:09:52.214292 sshd[6349]: Accepted publickey for core from 139.178.89.65 port 35532 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:09:52.217205 sshd-session[6349]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:52.229984 systemd-logind[1862]: New session 23 of user core. Sep 12 22:09:52.236670 systemd[1]: Started session-23.scope - Session 23 of User core. Sep 12 22:09:52.560967 sshd[6352]: Connection closed by 139.178.89.65 port 35532 Sep 12 22:09:52.560624 sshd-session[6349]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:52.569170 systemd[1]: sshd@22-172.31.25.121:22-139.178.89.65:35532.service: Deactivated successfully. Sep 12 22:09:52.570352 systemd-logind[1862]: Session 23 logged out. Waiting for processes to exit. Sep 12 22:09:52.576624 systemd[1]: session-23.scope: Deactivated successfully. Sep 12 22:09:52.585533 systemd-logind[1862]: Removed session 23. Sep 12 22:09:57.605509 systemd[1]: Started sshd@23-172.31.25.121:22-139.178.89.65:35542.service - OpenSSH per-connection server daemon (139.178.89.65:35542). Sep 12 22:09:57.818163 sshd[6364]: Accepted publickey for core from 139.178.89.65 port 35542 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:09:57.820812 sshd-session[6364]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:57.833636 systemd-logind[1862]: New session 24 of user core. Sep 12 22:09:57.843188 systemd[1]: Started session-24.scope - Session 24 of User core. 
Sep 12 22:09:58.178000 sshd[6367]: Connection closed by 139.178.89.65 port 35542 Sep 12 22:09:58.177098 sshd-session[6364]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:58.184954 systemd[1]: session-24.scope: Deactivated successfully. Sep 12 22:09:58.191704 systemd[1]: sshd@23-172.31.25.121:22-139.178.89.65:35542.service: Deactivated successfully. Sep 12 22:09:58.203904 systemd-logind[1862]: Session 24 logged out. Waiting for processes to exit. Sep 12 22:09:58.206761 systemd-logind[1862]: Removed session 24. Sep 12 22:10:03.224833 systemd[1]: Started sshd@24-172.31.25.121:22-139.178.89.65:44154.service - OpenSSH per-connection server daemon (139.178.89.65:44154). Sep 12 22:10:03.456149 sshd[6382]: Accepted publickey for core from 139.178.89.65 port 44154 ssh2: RSA SHA256:5WHlAbubuGgA7Q2ksk9TReQsi4rANDz2jICIbuRZ1E4 Sep 12 22:10:03.459440 sshd-session[6382]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:10:03.470764 systemd-logind[1862]: New session 25 of user core. Sep 12 22:10:03.479420 systemd[1]: Started session-25.scope - Session 25 of User core. Sep 12 22:10:03.761406 sshd[6387]: Connection closed by 139.178.89.65 port 44154 Sep 12 22:10:03.762624 sshd-session[6382]: pam_unix(sshd:session): session closed for user core Sep 12 22:10:03.772253 systemd-logind[1862]: Session 25 logged out. Waiting for processes to exit. Sep 12 22:10:03.775851 systemd[1]: sshd@24-172.31.25.121:22-139.178.89.65:44154.service: Deactivated successfully. Sep 12 22:10:03.782325 systemd[1]: session-25.scope: Deactivated successfully. Sep 12 22:10:03.787811 systemd-logind[1862]: Removed session 25. Sep 12 22:10:06.175948 containerd[1931]: time="2025-09-12T22:10:06.175871370Z" level=info msg="TaskExit event in podsandbox handler container_id:\"226e2e7740301078e7ffc163513ef08389f6478e27588bf6c04c9ecc8bc2590f\" id:\"363c4e1ac2d60ece12ee99ba85a2eb83efedb60a8ee22f0cf85ec723c4228bf6\" pid:6411 exited_at:{seconds:1757715006 nanos:175160826}" Sep 12 22:10:06.377138 containerd[1931]: time="2025-09-12T22:10:06.377005915Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb80e8f825e9c5ed3de0315bb9396e97b4158d02397191475b1dea42a87b726c\" id:\"a878ac89e53789ce7a5823a22822420d90eec4fb3237bced0d860e6771292d7f\" pid:6432 exited_at:{seconds:1757715006 nanos:376527643}" Sep 12 22:10:12.941479 containerd[1931]: time="2025-09-12T22:10:12.941139219Z" level=info msg="TaskExit event in podsandbox handler container_id:\"047c3012e98a295a31b96022eeaf3fb2de10c5df5e52d84d1b1534d926fb4b07\" id:\"5e60445daa88db386bad2e90313b9aae5eddf14f92b092ee2675918244136089\" pid:6457 exited_at:{seconds:1757715012 nanos:939668667}" Sep 12 22:10:17.543466 systemd[1]: cri-containerd-cc70c8b6134b67c7c5e7057b525c59b1b625b09af874ba592bc44bcf6f9fe2d6.scope: Deactivated successfully. Sep 12 22:10:17.544032 systemd[1]: cri-containerd-cc70c8b6134b67c7c5e7057b525c59b1b625b09af874ba592bc44bcf6f9fe2d6.scope: Consumed 4.781s CPU time, 62.8M memory peak, 128K read from disk. 
Sep 12 22:10:17.556279 containerd[1931]: time="2025-09-12T22:10:17.556209306Z" level=info msg="received exit event container_id:\"cc70c8b6134b67c7c5e7057b525c59b1b625b09af874ba592bc44bcf6f9fe2d6\" id:\"cc70c8b6134b67c7c5e7057b525c59b1b625b09af874ba592bc44bcf6f9fe2d6\" pid:3080 exit_status:1 exited_at:{seconds:1757715017 nanos:555403374}" Sep 12 22:10:17.558256 containerd[1931]: time="2025-09-12T22:10:17.557438826Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cc70c8b6134b67c7c5e7057b525c59b1b625b09af874ba592bc44bcf6f9fe2d6\" id:\"cc70c8b6134b67c7c5e7057b525c59b1b625b09af874ba592bc44bcf6f9fe2d6\" pid:3080 exit_status:1 exited_at:{seconds:1757715017 nanos:555403374}" Sep 12 22:10:17.620978 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cc70c8b6134b67c7c5e7057b525c59b1b625b09af874ba592bc44bcf6f9fe2d6-rootfs.mount: Deactivated successfully. Sep 12 22:10:18.064946 kubelet[3235]: I0912 22:10:18.064882 3235 scope.go:117] "RemoveContainer" containerID="cc70c8b6134b67c7c5e7057b525c59b1b625b09af874ba592bc44bcf6f9fe2d6" Sep 12 22:10:18.069126 containerd[1931]: time="2025-09-12T22:10:18.069050585Z" level=info msg="CreateContainer within sandbox \"b0553b04fda66e8e678dcffdf01d83402724e6391275702f9388d25ef9b60a9c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Sep 12 22:10:18.093959 containerd[1931]: time="2025-09-12T22:10:18.093892325Z" level=info msg="Container 5e1bc772f66f930f1d3b3efcee8bb42070ec3f0659fd979f78e3ad89d9fe20f5: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:10:18.116702 containerd[1931]: time="2025-09-12T22:10:18.116640917Z" level=info msg="CreateContainer within sandbox \"b0553b04fda66e8e678dcffdf01d83402724e6391275702f9388d25ef9b60a9c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"5e1bc772f66f930f1d3b3efcee8bb42070ec3f0659fd979f78e3ad89d9fe20f5\"" Sep 12 22:10:18.118143 containerd[1931]: time="2025-09-12T22:10:18.117544529Z" level=info msg="StartContainer for \"5e1bc772f66f930f1d3b3efcee8bb42070ec3f0659fd979f78e3ad89d9fe20f5\"" Sep 12 22:10:18.119899 containerd[1931]: time="2025-09-12T22:10:18.119806613Z" level=info msg="connecting to shim 5e1bc772f66f930f1d3b3efcee8bb42070ec3f0659fd979f78e3ad89d9fe20f5" address="unix:///run/containerd/s/47ec793ac8680dcac9feb0343f991dff5fa7204b5912548efc7b9e40270c6999" protocol=ttrpc version=3 Sep 12 22:10:18.164704 systemd[1]: Started cri-containerd-5e1bc772f66f930f1d3b3efcee8bb42070ec3f0659fd979f78e3ad89d9fe20f5.scope - libcontainer container 5e1bc772f66f930f1d3b3efcee8bb42070ec3f0659fd979f78e3ad89d9fe20f5. Sep 12 22:10:18.249356 containerd[1931]: time="2025-09-12T22:10:18.249280206Z" level=info msg="StartContainer for \"5e1bc772f66f930f1d3b3efcee8bb42070ec3f0659fd979f78e3ad89d9fe20f5\" returns successfully"