Sep 5 23:52:50.258507 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd083]
Sep 5 23:52:50.258551 kernel: Linux version 6.6.103-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT Fri Sep 5 22:30:47 -00 2025
Sep 5 23:52:50.258576 kernel: KASLR disabled due to lack of seed
Sep 5 23:52:50.258593 kernel: efi: EFI v2.7 by EDK II
Sep 5 23:52:50.258609 kernel: efi: SMBIOS=0x7bed0000 SMBIOS 3.0=0x7beb0000 ACPI=0x786e0000 ACPI 2.0=0x786e0014 MEMATTR=0x7affea98 MEMRESERVE=0x7852ee18
Sep 5 23:52:50.258624 kernel: ACPI: Early table checksum verification disabled
Sep 5 23:52:50.258642 kernel: ACPI: RSDP 0x00000000786E0014 000024 (v02 AMAZON)
Sep 5 23:52:50.258657 kernel: ACPI: XSDT 0x00000000786D00E8 000064 (v01 AMAZON AMZNFACP 00000001 01000013)
Sep 5 23:52:50.258673 kernel: ACPI: FACP 0x00000000786B0000 000114 (v06 AMAZON AMZNFACP 00000001 AMZN 00000001)
Sep 5 23:52:50.258689 kernel: ACPI: DSDT 0x0000000078640000 00159D (v02 AMAZON AMZNDSDT 00000001 INTL 20160527)
Sep 5 23:52:50.258709 kernel: ACPI: APIC 0x00000000786C0000 000108 (v04 AMAZON AMZNAPIC 00000001 AMZN 00000001)
Sep 5 23:52:50.258725 kernel: ACPI: SPCR 0x00000000786A0000 000050 (v02 AMAZON AMZNSPCR 00000001 AMZN 00000001)
Sep 5 23:52:50.258741 kernel: ACPI: GTDT 0x0000000078690000 000060 (v02 AMAZON AMZNGTDT 00000001 AMZN 00000001)
Sep 5 23:52:50.258757 kernel: ACPI: MCFG 0x0000000078680000 00003C (v02 AMAZON AMZNMCFG 00000001 AMZN 00000001)
Sep 5 23:52:50.258775 kernel: ACPI: SLIT 0x0000000078670000 00002D (v01 AMAZON AMZNSLIT 00000001 AMZN 00000001)
Sep 5 23:52:50.258795 kernel: ACPI: IORT 0x0000000078660000 000078 (v01 AMAZON AMZNIORT 00000001 AMZN 00000001)
Sep 5 23:52:50.258813 kernel: ACPI: PPTT 0x0000000078650000 0000EC (v01 AMAZON AMZNPPTT 00000001 AMZN 00000001)
Sep 5 23:52:50.258829 kernel: ACPI: SPCR: console: uart,mmio,0x90a0000,115200
Sep 5 23:52:50.258845 kernel: earlycon: uart0 at MMIO 0x00000000090a0000 (options '115200')
Sep 5 23:52:50.258862 kernel: printk: bootconsole [uart0] enabled
Sep 5 23:52:50.258878 kernel: NUMA: Failed to initialise from firmware
Sep 5 23:52:50.258895 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000004b5ffffff]
Sep 5 23:52:50.258911 kernel: NUMA: NODE_DATA [mem 0x4b583f800-0x4b5844fff]
Sep 5 23:52:50.258927 kernel: Zone ranges:
Sep 5 23:52:50.258944 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Sep 5 23:52:50.258960 kernel: DMA32 empty
Sep 5 23:52:50.258980 kernel: Normal [mem 0x0000000100000000-0x00000004b5ffffff]
Sep 5 23:52:50.258996 kernel: Movable zone start for each node
Sep 5 23:52:50.259012 kernel: Early memory node ranges
Sep 5 23:52:50.259029 kernel: node 0: [mem 0x0000000040000000-0x000000007862ffff]
Sep 5 23:52:50.259045 kernel: node 0: [mem 0x0000000078630000-0x000000007863ffff]
Sep 5 23:52:50.259061 kernel: node 0: [mem 0x0000000078640000-0x00000000786effff]
Sep 5 23:52:50.259078 kernel: node 0: [mem 0x00000000786f0000-0x000000007872ffff]
Sep 5 23:52:50.259094 kernel: node 0: [mem 0x0000000078730000-0x000000007bbfffff]
Sep 5 23:52:50.259110 kernel: node 0: [mem 0x000000007bc00000-0x000000007bfdffff]
Sep 5 23:52:50.259126 kernel: node 0: [mem 0x000000007bfe0000-0x000000007fffffff]
Sep 5 23:52:50.259142 kernel: node 0: [mem 0x0000000400000000-0x00000004b5ffffff]
Sep 5 23:52:50.259209 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000004b5ffffff]
Sep 5 23:52:50.259238 kernel: On node 0, zone Normal: 8192 pages in unavailable ranges
Sep 5 23:52:50.259256 kernel: psci: probing for conduit method from ACPI.
Sep 5 23:52:50.259280 kernel: psci: PSCIv1.0 detected in firmware.
Sep 5 23:52:50.259298 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 5 23:52:50.259316 kernel: psci: Trusted OS migration not required
Sep 5 23:52:50.259338 kernel: psci: SMC Calling Convention v1.1
Sep 5 23:52:50.259356 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000001)
Sep 5 23:52:50.259374 kernel: percpu: Embedded 31 pages/cpu s86632 r8192 d32152 u126976
Sep 5 23:52:50.259391 kernel: pcpu-alloc: s86632 r8192 d32152 u126976 alloc=31*4096
Sep 5 23:52:50.259409 kernel: pcpu-alloc: [0] 0 [0] 1
Sep 5 23:52:50.259426 kernel: Detected PIPT I-cache on CPU0
Sep 5 23:52:50.259443 kernel: CPU features: detected: GIC system register CPU interface
Sep 5 23:52:50.259461 kernel: CPU features: detected: Spectre-v2
Sep 5 23:52:50.259478 kernel: CPU features: detected: Spectre-v3a
Sep 5 23:52:50.259495 kernel: CPU features: detected: Spectre-BHB
Sep 5 23:52:50.259513 kernel: CPU features: detected: ARM erratum 1742098
Sep 5 23:52:50.259534 kernel: CPU features: detected: ARM errata 1165522, 1319367, or 1530923
Sep 5 23:52:50.259552 kernel: alternatives: applying boot alternatives
Sep 5 23:52:50.259572 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=ac831c89fe9ee7829b7371dadfb138f8d0e2b31ae3a5a920e0eba13bbab016c3
Sep 5 23:52:50.259590 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 5 23:52:50.259608 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 5 23:52:50.259626 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 5 23:52:50.259643 kernel: Fallback order for Node 0: 0
Sep 5 23:52:50.259661 kernel: Built 1 zonelists, mobility grouping on. Total pages: 991872
Sep 5 23:52:50.259678 kernel: Policy zone: Normal
Sep 5 23:52:50.259696 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 5 23:52:50.259713 kernel: software IO TLB: area num 2.
Sep 5 23:52:50.259735 kernel: software IO TLB: mapped [mem 0x000000007c000000-0x0000000080000000] (64MB)
Sep 5 23:52:50.259753 kernel: Memory: 3820088K/4030464K available (10304K kernel code, 2186K rwdata, 8108K rodata, 39424K init, 897K bss, 210376K reserved, 0K cma-reserved)
Sep 5 23:52:50.259771 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 5 23:52:50.259789 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 5 23:52:50.259807 kernel: rcu: RCU event tracing is enabled.
Sep 5 23:52:50.259825 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 5 23:52:50.259843 kernel: Trampoline variant of Tasks RCU enabled.
Sep 5 23:52:50.259861 kernel: Tracing variant of Tasks RCU enabled.
Sep 5 23:52:50.259879 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 5 23:52:50.259896 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 5 23:52:50.259914 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 5 23:52:50.259936 kernel: GICv3: 96 SPIs implemented
Sep 5 23:52:50.259953 kernel: GICv3: 0 Extended SPIs implemented
Sep 5 23:52:50.259971 kernel: Root IRQ handler: gic_handle_irq
Sep 5 23:52:50.259988 kernel: GICv3: GICv3 features: 16 PPIs
Sep 5 23:52:50.260005 kernel: GICv3: CPU0: found redistributor 0 region 0:0x0000000010200000
Sep 5 23:52:50.260023 kernel: ITS [mem 0x10080000-0x1009ffff]
Sep 5 23:52:50.260040 kernel: ITS@0x0000000010080000: allocated 8192 Devices @4000b0000 (indirect, esz 8, psz 64K, shr 1)
Sep 5 23:52:50.260059 kernel: ITS@0x0000000010080000: allocated 8192 Interrupt Collections @4000c0000 (flat, esz 8, psz 64K, shr 1)
Sep 5 23:52:50.260076 kernel: GICv3: using LPI property table @0x00000004000d0000
Sep 5 23:52:50.260094 kernel: ITS: Using hypervisor restricted LPI range [128]
Sep 5 23:52:50.260111 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000004000e0000
Sep 5 23:52:50.260129 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 5 23:52:50.260151 kernel: arch_timer: cp15 timer(s) running at 83.33MHz (virt).
Sep 5 23:52:50.266257 kernel: clocksource: arch_sys_counter: mask: 0x1ffffffffffffff max_cycles: 0x13381ebeec, max_idle_ns: 440795203145 ns
Sep 5 23:52:50.266288 kernel: sched_clock: 57 bits at 83MHz, resolution 12ns, wraps every 4398046511100ns
Sep 5 23:52:50.266307 kernel: Console: colour dummy device 80x25
Sep 5 23:52:50.266326 kernel: printk: console [tty1] enabled
Sep 5 23:52:50.266344 kernel: ACPI: Core revision 20230628
Sep 5 23:52:50.266363 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 166.66 BogoMIPS (lpj=83333)
Sep 5 23:52:50.266381 kernel: pid_max: default: 32768 minimum: 301
Sep 5 23:52:50.266399 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Sep 5 23:52:50.266428 kernel: landlock: Up and running.
Sep 5 23:52:50.266447 kernel: SELinux: Initializing.
Sep 5 23:52:50.266465 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 5 23:52:50.266483 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 5 23:52:50.266501 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 5 23:52:50.266520 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 5 23:52:50.266537 kernel: rcu: Hierarchical SRCU implementation.
Sep 5 23:52:50.266557 kernel: rcu: Max phase no-delay instances is 400.
Sep 5 23:52:50.266575 kernel: Platform MSI: ITS@0x10080000 domain created
Sep 5 23:52:50.266597 kernel: PCI/MSI: ITS@0x10080000 domain created
Sep 5 23:52:50.266615 kernel: Remapping and enabling EFI services.
Sep 5 23:52:50.266633 kernel: smp: Bringing up secondary CPUs ...
Sep 5 23:52:50.266650 kernel: Detected PIPT I-cache on CPU1
Sep 5 23:52:50.266668 kernel: GICv3: CPU1: found redistributor 1 region 0:0x0000000010220000
Sep 5 23:52:50.266686 kernel: GICv3: CPU1: using allocated LPI pending table @0x00000004000f0000
Sep 5 23:52:50.266704 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd083]
Sep 5 23:52:50.266721 kernel: smp: Brought up 1 node, 2 CPUs
Sep 5 23:52:50.266739 kernel: SMP: Total of 2 processors activated.
Sep 5 23:52:50.266757 kernel: CPU features: detected: 32-bit EL0 Support
Sep 5 23:52:50.266778 kernel: CPU features: detected: 32-bit EL1 Support
Sep 5 23:52:50.266796 kernel: CPU features: detected: CRC32 instructions
Sep 5 23:52:50.266825 kernel: CPU: All CPU(s) started at EL1
Sep 5 23:52:50.266848 kernel: alternatives: applying system-wide alternatives
Sep 5 23:52:50.266866 kernel: devtmpfs: initialized
Sep 5 23:52:50.266885 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 5 23:52:50.266903 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 5 23:52:50.266922 kernel: pinctrl core: initialized pinctrl subsystem
Sep 5 23:52:50.266941 kernel: SMBIOS 3.0.0 present.
Sep 5 23:52:50.266963 kernel: DMI: Amazon EC2 a1.large/, BIOS 1.0 11/1/2018
Sep 5 23:52:50.266981 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 5 23:52:50.267000 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 5 23:52:50.267019 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 5 23:52:50.267038 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 5 23:52:50.267056 kernel: audit: initializing netlink subsys (disabled)
Sep 5 23:52:50.267075 kernel: audit: type=2000 audit(0.286:1): state=initialized audit_enabled=0 res=1
Sep 5 23:52:50.267097 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 5 23:52:50.267116 kernel: cpuidle: using governor menu
Sep 5 23:52:50.267134 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 5 23:52:50.267153 kernel: ASID allocator initialised with 65536 entries
Sep 5 23:52:50.267199 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 5 23:52:50.267221 kernel: Serial: AMBA PL011 UART driver
Sep 5 23:52:50.267240 kernel: Modules: 17488 pages in range for non-PLT usage
Sep 5 23:52:50.267259 kernel: Modules: 509008 pages in range for PLT usage
Sep 5 23:52:50.267278 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 5 23:52:50.267303 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 5 23:52:50.267322 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 5 23:52:50.267341 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 5 23:52:50.267359 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 5 23:52:50.267378 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 5 23:52:50.267396 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 5 23:52:50.267415 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 5 23:52:50.267434 kernel: ACPI: Added _OSI(Module Device)
Sep 5 23:52:50.267452 kernel: ACPI: Added _OSI(Processor Device)
Sep 5 23:52:50.267475 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 5 23:52:50.267493 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 5 23:52:50.267512 kernel: ACPI: Interpreter enabled
Sep 5 23:52:50.267530 kernel: ACPI: Using GIC for interrupt routing
Sep 5 23:52:50.267549 kernel: ACPI: MCFG table detected, 1 entries
Sep 5 23:52:50.267567 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-0f])
Sep 5 23:52:50.267865 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 5 23:52:50.268094 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Sep 5 23:52:50.268340 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Sep 5 23:52:50.268538 kernel: acpi PNP0A08:00: ECAM area [mem 0x20000000-0x20ffffff] reserved by PNP0C02:00
Sep 5 23:52:50.268732 kernel: acpi PNP0A08:00: ECAM at [mem 0x20000000-0x20ffffff] for [bus 00-0f]
Sep 5 23:52:50.268758 kernel: ACPI: Remapped I/O 0x000000001fff0000 to [io 0x0000-0xffff window]
Sep 5 23:52:50.268777 kernel: acpiphp: Slot [1] registered
Sep 5 23:52:50.268796 kernel: acpiphp: Slot [2] registered
Sep 5 23:52:50.268815 kernel: acpiphp: Slot [3] registered
Sep 5 23:52:50.268833 kernel: acpiphp: Slot [4] registered
Sep 5 23:52:50.268858 kernel: acpiphp: Slot [5] registered
Sep 5 23:52:50.268877 kernel: acpiphp: Slot [6] registered
Sep 5 23:52:50.268895 kernel: acpiphp: Slot [7] registered
Sep 5 23:52:50.268914 kernel: acpiphp: Slot [8] registered
Sep 5 23:52:50.268932 kernel: acpiphp: Slot [9] registered
Sep 5 23:52:50.268950 kernel: acpiphp: Slot [10] registered
Sep 5 23:52:50.268969 kernel: acpiphp: Slot [11] registered
Sep 5 23:52:50.268987 kernel: acpiphp: Slot [12] registered
Sep 5 23:52:50.269005 kernel: acpiphp: Slot [13] registered
Sep 5 23:52:50.269025 kernel: acpiphp: Slot [14] registered
Sep 5 23:52:50.269047 kernel: acpiphp: Slot [15] registered
Sep 5 23:52:50.269066 kernel: acpiphp: Slot [16] registered
Sep 5 23:52:50.269084 kernel: acpiphp: Slot [17] registered
Sep 5 23:52:50.269102 kernel: acpiphp: Slot [18] registered
Sep 5 23:52:50.269121 kernel: acpiphp: Slot [19] registered
Sep 5 23:52:50.269139 kernel: acpiphp: Slot [20] registered
Sep 5 23:52:50.269157 kernel: acpiphp: Slot [21] registered
Sep 5 23:52:50.269197 kernel: acpiphp: Slot [22] registered
Sep 5 23:52:50.269217 kernel: acpiphp: Slot [23] registered
Sep 5 23:52:50.269241 kernel: acpiphp: Slot [24] registered
Sep 5 23:52:50.269260 kernel: acpiphp: Slot [25] registered
Sep 5 23:52:50.269278 kernel: acpiphp: Slot [26] registered
Sep 5 23:52:50.269297 kernel: acpiphp: Slot [27] registered
Sep 5 23:52:50.269315 kernel: acpiphp: Slot [28] registered
Sep 5 23:52:50.269333 kernel: acpiphp: Slot [29] registered
Sep 5 23:52:50.269352 kernel: acpiphp: Slot [30] registered
Sep 5 23:52:50.269370 kernel: acpiphp: Slot [31] registered
Sep 5 23:52:50.269388 kernel: PCI host bridge to bus 0000:00
Sep 5 23:52:50.271741 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xffffffff window]
Sep 5 23:52:50.271933 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Sep 5 23:52:50.272123 kernel: pci_bus 0000:00: root bus resource [mem 0x400000000000-0x407fffffffff window]
Sep 5 23:52:50.272331 kernel: pci_bus 0000:00: root bus resource [bus 00-0f]
Sep 5 23:52:50.272564 kernel: pci 0000:00:00.0: [1d0f:0200] type 00 class 0x060000
Sep 5 23:52:50.272789 kernel: pci 0000:00:01.0: [1d0f:8250] type 00 class 0x070003
Sep 5 23:52:50.272998 kernel: pci 0000:00:01.0: reg 0x10: [mem 0x80118000-0x80118fff]
Sep 5 23:52:50.273258 kernel: pci 0000:00:04.0: [1d0f:8061] type 00 class 0x010802
Sep 5 23:52:50.273487 kernel: pci 0000:00:04.0: reg 0x10: [mem 0x80114000-0x80117fff]
Sep 5 23:52:50.273696 kernel: pci 0000:00:04.0: PME# supported from D0 D1 D2 D3hot D3cold
Sep 5 23:52:50.273913 kernel: pci 0000:00:05.0: [1d0f:ec20] type 00 class 0x020000
Sep 5 23:52:50.274119 kernel: pci 0000:00:05.0: reg 0x10: [mem 0x80110000-0x80113fff]
Sep 5 23:52:50.274364 kernel: pci 0000:00:05.0: reg 0x18: [mem 0x80000000-0x800fffff pref]
Sep 5 23:52:50.274577 kernel: pci 0000:00:05.0: reg 0x20: [mem 0x80100000-0x8010ffff]
Sep 5 23:52:50.274782 kernel: pci 0000:00:05.0: PME# supported from D0 D1 D2 D3hot D3cold
Sep 5 23:52:50.274985 kernel: pci 0000:00:05.0: BAR 2: assigned [mem 0x80000000-0x800fffff pref]
Sep 5 23:52:50.275209 kernel: pci 0000:00:05.0: BAR 4: assigned [mem 0x80100000-0x8010ffff]
Sep 5 23:52:50.275427 kernel: pci 0000:00:04.0: BAR 0: assigned [mem 0x80110000-0x80113fff]
Sep 5 23:52:50.275632 kernel: pci 0000:00:05.0: BAR 0: assigned [mem 0x80114000-0x80117fff]
Sep 5 23:52:50.275839 kernel: pci 0000:00:01.0: BAR 0: assigned [mem 0x80118000-0x80118fff]
Sep 5 23:52:50.276024 kernel: pci_bus 0000:00: resource 4 [mem 0x80000000-0xffffffff window]
Sep 5 23:52:50.276234 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Sep 5 23:52:50.276421 kernel: pci_bus 0000:00: resource 6 [mem 0x400000000000-0x407fffffffff window]
Sep 5 23:52:50.276447 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Sep 5 23:52:50.276467 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Sep 5 23:52:50.276486 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Sep 5 23:52:50.276505 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Sep 5 23:52:50.276524 kernel: iommu: Default domain type: Translated
Sep 5 23:52:50.276543 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Sep 5 23:52:50.276568 kernel: efivars: Registered efivars operations
Sep 5 23:52:50.276587 kernel: vgaarb: loaded
Sep 5 23:52:50.276605 kernel: clocksource: Switched to clocksource arch_sys_counter
Sep 5 23:52:50.276624 kernel: VFS: Disk quotas dquot_6.6.0
Sep 5 23:52:50.276642 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 5 23:52:50.276661 kernel: pnp: PnP ACPI init
Sep 5 23:52:50.276867 kernel: system 00:00: [mem 0x20000000-0x2fffffff] could not be reserved
Sep 5 23:52:50.276894 kernel: pnp: PnP ACPI: found 1 devices
Sep 5 23:52:50.276918 kernel: NET: Registered PF_INET protocol family
Sep 5 23:52:50.276938 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 5 23:52:50.276957 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 5 23:52:50.276975 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 5 23:52:50.276994 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 5 23:52:50.277013 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 5 23:52:50.277032 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 5 23:52:50.277050 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 5 23:52:50.277069 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 5 23:52:50.277092 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 5 23:52:50.277110 kernel: PCI: CLS 0 bytes, default 64
Sep 5 23:52:50.277128 kernel: kvm [1]: HYP mode not available
Sep 5 23:52:50.277147 kernel: Initialise system trusted keyrings
Sep 5 23:52:50.277185 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 5 23:52:50.277206 kernel: Key type asymmetric registered
Sep 5 23:52:50.277225 kernel: Asymmetric key parser 'x509' registered
Sep 5 23:52:50.277244 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Sep 5 23:52:50.277263 kernel: io scheduler mq-deadline registered
Sep 5 23:52:50.277287 kernel: io scheduler kyber registered
Sep 5 23:52:50.277305 kernel: io scheduler bfq registered
Sep 5 23:52:50.277602 kernel: pl061_gpio ARMH0061:00: PL061 GPIO chip registered
Sep 5 23:52:50.277634 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
Sep 5 23:52:50.277654 kernel: ACPI: button: Power Button [PWRB]
Sep 5 23:52:50.277673 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input1
Sep 5 23:52:50.277692 kernel: ACPI: button: Sleep Button [SLPB]
Sep 5 23:52:50.277711 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 5 23:52:50.277738 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37
Sep 5 23:52:50.277948 kernel: serial 0000:00:01.0: enabling device (0010 -> 0012)
Sep 5 23:52:50.277975 kernel: printk: console [ttyS0] disabled
Sep 5 23:52:50.277995 kernel: 0000:00:01.0: ttyS0 at MMIO 0x80118000 (irq = 14, base_baud = 115200) is a 16550A
Sep 5 23:52:50.278014 kernel: printk: console [ttyS0] enabled
Sep 5 23:52:50.278032 kernel: printk: bootconsole [uart0] disabled
Sep 5 23:52:50.278051 kernel: thunder_xcv, ver 1.0
Sep 5 23:52:50.278069 kernel: thunder_bgx, ver 1.0
Sep 5 23:52:50.278088 kernel: nicpf, ver 1.0
Sep 5 23:52:50.278111 kernel: nicvf, ver 1.0
Sep 5 23:52:50.278399 kernel: rtc-efi rtc-efi.0: registered as rtc0
Sep 5 23:52:50.278595 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-05T23:52:49 UTC (1757116369)
Sep 5 23:52:50.278622 kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 5 23:52:50.278642 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 3 counters available
Sep 5 23:52:50.278661 kernel: watchdog: Delayed init of the lockup detector failed: -19
Sep 5 23:52:50.278680 kernel: watchdog: Hard watchdog permanently disabled
Sep 5 23:52:50.278699 kernel: NET: Registered PF_INET6 protocol family
Sep 5 23:52:50.278726 kernel: Segment Routing with IPv6
Sep 5 23:52:50.278745 kernel: In-situ OAM (IOAM) with IPv6
Sep 5 23:52:50.278866 kernel: NET: Registered PF_PACKET protocol family
Sep 5 23:52:50.278892 kernel: Key type dns_resolver registered
Sep 5 23:52:50.278911 kernel: registered taskstats version 1
Sep 5 23:52:50.278930 kernel: Loading compiled-in X.509 certificates
Sep 5 23:52:50.278949 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.103-flatcar: 5b16e1dfa86dac534548885fd675b87757ff9e20'
Sep 5 23:52:50.278968 kernel: Key type .fscrypt registered
Sep 5 23:52:50.278986 kernel: Key type fscrypt-provisioning registered
Sep 5 23:52:50.279010 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 5 23:52:50.279030 kernel: ima: Allocated hash algorithm: sha1
Sep 5 23:52:50.279049 kernel: ima: No architecture policies found
Sep 5 23:52:50.279067 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Sep 5 23:52:50.279086 kernel: clk: Disabling unused clocks
Sep 5 23:52:50.279104 kernel: Freeing unused kernel memory: 39424K
Sep 5 23:52:50.279123 kernel: Run /init as init process
Sep 5 23:52:50.279141 kernel: with arguments:
Sep 5 23:52:50.279285 kernel: /init
Sep 5 23:52:50.279311 kernel: with environment:
Sep 5 23:52:50.279336 kernel: HOME=/
Sep 5 23:52:50.279355 kernel: TERM=linux
Sep 5 23:52:50.279374 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 5 23:52:50.279398 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Sep 5 23:52:50.279422 systemd[1]: Detected virtualization amazon.
Sep 5 23:52:50.279442 systemd[1]: Detected architecture arm64.
Sep 5 23:52:50.279462 systemd[1]: Running in initrd.
Sep 5 23:52:50.279487 systemd[1]: No hostname configured, using default hostname.
Sep 5 23:52:50.279507 systemd[1]: Hostname set to .
Sep 5 23:52:50.279527 systemd[1]: Initializing machine ID from VM UUID.
Sep 5 23:52:50.279547 systemd[1]: Queued start job for default target initrd.target.
Sep 5 23:52:50.279567 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 5 23:52:50.279588 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 5 23:52:50.279609 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 5 23:52:50.279630 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 5 23:52:50.279655 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 5 23:52:50.279676 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 5 23:52:50.279699 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 5 23:52:50.279720 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 5 23:52:50.279741 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 5 23:52:50.279762 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 5 23:52:50.279782 systemd[1]: Reached target paths.target - Path Units.
Sep 5 23:52:50.279806 systemd[1]: Reached target slices.target - Slice Units.
Sep 5 23:52:50.279827 systemd[1]: Reached target swap.target - Swaps.
Sep 5 23:52:50.279847 systemd[1]: Reached target timers.target - Timer Units.
Sep 5 23:52:50.279868 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 5 23:52:50.279889 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 5 23:52:50.279909 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 5 23:52:50.279930 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Sep 5 23:52:50.279950 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 5 23:52:50.279970 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 5 23:52:50.279995 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 5 23:52:50.280015 systemd[1]: Reached target sockets.target - Socket Units.
Sep 5 23:52:50.280035 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 5 23:52:50.280056 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 5 23:52:50.280076 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 5 23:52:50.280097 systemd[1]: Starting systemd-fsck-usr.service...
Sep 5 23:52:50.280118 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 5 23:52:50.280138 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 5 23:52:50.280182 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 5 23:52:50.280209 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 5 23:52:50.280230 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 5 23:52:50.280295 systemd-journald[250]: Collecting audit messages is disabled.
Sep 5 23:52:50.280344 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 5 23:52:50.280365 systemd[1]: Finished systemd-fsck-usr.service.
Sep 5 23:52:50.280386 kernel: Bridge firewalling registered
Sep 5 23:52:50.280406 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 5 23:52:50.280431 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 5 23:52:50.280452 systemd-journald[250]: Journal started
Sep 5 23:52:50.280489 systemd-journald[250]: Runtime Journal (/run/log/journal/ec249fa88716fdd38ab66e9751d48a6d) is 8.0M, max 75.3M, 67.3M free.
Sep 5 23:52:50.285279 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 5 23:52:50.221749 systemd-modules-load[251]: Inserted module 'overlay'
Sep 5 23:52:50.252283 systemd-modules-load[251]: Inserted module 'br_netfilter'
Sep 5 23:52:50.305779 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 5 23:52:50.319027 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 5 23:52:50.319101 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 5 23:52:50.338286 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 5 23:52:50.345852 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 5 23:52:50.356559 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 5 23:52:50.370402 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 5 23:52:50.382437 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 5 23:52:50.396322 dracut-cmdline[280]: dracut-dracut-053
Sep 5 23:52:50.401975 dracut-cmdline[280]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=ac831c89fe9ee7829b7371dadfb138f8d0e2b31ae3a5a920e0eba13bbab016c3
Sep 5 23:52:50.443984 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 5 23:52:50.454039 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 5 23:52:50.469531 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 5 23:52:50.481479 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 5 23:52:50.553701 systemd-resolved[311]: Positive Trust Anchors:
Sep 5 23:52:50.553738 systemd-resolved[311]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 5 23:52:50.553803 systemd-resolved[311]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 5 23:52:50.598203 kernel: SCSI subsystem initialized
Sep 5 23:52:50.608185 kernel: Loading iSCSI transport class v2.0-870.
Sep 5 23:52:50.619201 kernel: iscsi: registered transport (tcp)
Sep 5 23:52:50.641691 kernel: iscsi: registered transport (qla4xxx)
Sep 5 23:52:50.641764 kernel: QLogic iSCSI HBA Driver
Sep 5 23:52:50.755204 kernel: random: crng init done
Sep 5 23:52:50.755771 systemd-resolved[311]: Defaulting to hostname 'linux'.
Sep 5 23:52:50.759838 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 5 23:52:50.765719 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 5 23:52:50.785910 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 5 23:52:50.797528 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 5 23:52:50.840213 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 5 23:52:50.840308 kernel: device-mapper: uevent: version 1.0.3
Sep 5 23:52:50.843215 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Sep 5 23:52:50.910206 kernel: raid6: neonx8 gen() 6653 MB/s
Sep 5 23:52:50.928196 kernel: raid6: neonx4 gen() 6443 MB/s
Sep 5 23:52:50.945195 kernel: raid6: neonx2 gen() 5375 MB/s
Sep 5 23:52:50.963196 kernel: raid6: neonx1 gen() 3924 MB/s
Sep 5 23:52:50.980195 kernel: raid6: int64x8 gen() 3798 MB/s
Sep 5 23:52:50.998195 kernel: raid6: int64x4 gen() 3681 MB/s
Sep 5 23:52:51.015195 kernel: raid6: int64x2 gen() 3561 MB/s
Sep 5 23:52:51.033180 kernel: raid6: int64x1 gen() 2767 MB/s
Sep 5 23:52:51.033213 kernel: raid6: using algorithm neonx8 gen() 6653 MB/s
Sep 5 23:52:51.052201 kernel: raid6: .... xor() 4924 MB/s, rmw enabled
Sep 5 23:52:51.052237 kernel: raid6: using neon recovery algorithm
Sep 5 23:52:51.060200 kernel: xor: measuring software checksum speed
Sep 5 23:52:51.060251 kernel: 8regs : 10214 MB/sec
Sep 5 23:52:51.063552 kernel: 32regs : 10995 MB/sec
Sep 5 23:52:51.063585 kernel: arm64_neon : 9564 MB/sec
Sep 5 23:52:51.063610 kernel: xor: using function: 32regs (10995 MB/sec)
Sep 5 23:52:51.148208 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 5 23:52:51.166787 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 5 23:52:51.178548 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 5 23:52:51.214058 systemd-udevd[470]: Using default interface naming scheme 'v255'.
Sep 5 23:52:51.222018 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 5 23:52:51.246505 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 5 23:52:51.271236 dracut-pre-trigger[486]: rd.md=0: removing MD RAID activation
Sep 5 23:52:51.328491 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 5 23:52:51.339481 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 5 23:52:51.458337 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 5 23:52:51.470559 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 5 23:52:51.509611 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 5 23:52:51.515132 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 5 23:52:51.520688 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 5 23:52:51.523647 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 5 23:52:51.537510 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 5 23:52:51.577309 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 5 23:52:51.659687 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
Sep 5 23:52:51.659761 kernel: ena 0000:00:05.0: enabling device (0010 -> 0012)
Sep 5 23:52:51.664092 kernel: ena 0000:00:05.0: ENA device version: 0.10
Sep 5 23:52:51.664498 kernel: ena 0000:00:05.0: ENA controller version: 0.0.1 implementation version 1
Sep 5 23:52:51.675192 kernel: ena 0000:00:05.0: Elastic Network Adapter (ENA) found at mem 80114000, mac addr 06:67:1d:b6:7c:55
Sep 5 23:52:51.675431 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 5 23:52:51.675679 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 5 23:52:51.685083 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 5 23:52:51.685231 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 5 23:52:51.687686 (udev-worker)[545]: Network interface NamePolicy= disabled on kernel command line.
Sep 5 23:52:51.690486 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 5 23:52:51.701183 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 5 23:52:51.717518 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 5 23:52:51.737124 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35
Sep 5 23:52:51.737217 kernel: nvme nvme0: pci function 0000:00:04.0
Sep 5 23:52:51.750192 kernel: nvme nvme0: 2/0/0 default/read/poll queues
Sep 5 23:52:51.753523 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 5 23:52:51.762441 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 5 23:52:51.762504 kernel: GPT:9289727 != 16777215
Sep 5 23:52:51.765675 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 5 23:52:51.765747 kernel: GPT:9289727 != 16777215
Sep 5 23:52:51.765774 kernel: GPT: Use GNU Parted to correct GPT errors.
Sep 5 23:52:51.766717 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Sep 5 23:52:51.767596 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 5 23:52:51.803391 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 5 23:52:51.874223 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/nvme0n1p6 scanned by (udev-worker) (521)
Sep 5 23:52:51.885211 kernel: BTRFS: device fsid 045c118e-b098-46f0-884a-43665575c70e devid 1 transid 37 /dev/nvme0n1p3 scanned by (udev-worker) (541)
Sep 5 23:52:51.984487 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM.
Sep 5 23:52:52.002929 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Amazon Elastic Block Store EFI-SYSTEM.
Sep 5 23:52:52.018478 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Amazon Elastic Block Store ROOT.
Sep 5 23:52:52.034810 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Amazon Elastic Block Store USR-A.
Sep 5 23:52:52.037411 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Amazon Elastic Block Store USR-A.
Sep 5 23:52:52.056990 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 5 23:52:52.068435 disk-uuid[664]: Primary Header is updated.
Sep 5 23:52:52.068435 disk-uuid[664]: Secondary Entries is updated.
Sep 5 23:52:52.068435 disk-uuid[664]: Secondary Header is updated.
Sep 5 23:52:52.078195 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Sep 5 23:52:52.089957 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Sep 5 23:52:52.096216 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Sep 5 23:52:53.099201 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Sep 5 23:52:53.102069 disk-uuid[665]: The operation has completed successfully.
Sep 5 23:52:53.276178 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 5 23:52:53.279295 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 5 23:52:53.336467 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 5 23:52:53.351151 sh[1008]: Success
Sep 5 23:52:53.370220 kernel: device-mapper: verity: sha256 using implementation "sha256-ce"
Sep 5 23:52:53.463871 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 5 23:52:53.476444 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 5 23:52:53.485934 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 5 23:52:53.532642 kernel: BTRFS info (device dm-0): first mount of filesystem 045c118e-b098-46f0-884a-43665575c70e
Sep 5 23:52:53.532704 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Sep 5 23:52:53.534626 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Sep 5 23:52:53.534672 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 5 23:52:53.537108 kernel: BTRFS info (device dm-0): using free space tree
Sep 5 23:52:53.644193 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Sep 5 23:52:53.682331 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 5 23:52:53.686700 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Sep 5 23:52:53.701560 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 5 23:52:53.709597 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
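
[Editor's note: the "GPT:9289727 != 16777215" warnings and the disk-uuid repair above occur because the backup GPT header sits where the original, smaller disk image ended rather than at the last sector of the larger EBS volume. The following Go sketch illustrates the check the kernel performs; the file layout, device path, and error handling are illustrative assumptions, not Flatcar's or the kernel's actual code.]

    // gptcheck.go - minimal sketch of the GPT backup-header check (illustrative only).
    package main

    import (
        "encoding/binary"
        "fmt"
        "io"
        "os"
    )

    func main() {
        const sectorSize = 512
        f, err := os.Open("/dev/nvme0n1") // device name taken from the log; adjust as needed
        if err != nil {
            panic(err)
        }
        defer f.Close()

        // The primary GPT header lives in LBA 1, right after the protective MBR.
        hdr := make([]byte, 92)
        if _, err := f.ReadAt(hdr, sectorSize); err != nil {
            panic(err)
        }
        if string(hdr[:8]) != "EFI PART" {
            panic("no GPT signature")
        }

        // Offset 32 of the header holds the backup (alternate) header LBA, little-endian.
        backupLBA := binary.LittleEndian.Uint64(hdr[32:40])

        // Compare against the device's last LBA; the mismatch "9289727 != 16777215"
        // in the log is exactly this comparison failing after the image was written
        // to a larger volume. disk-uuid.service then rewrites the secondary header.
        size, err := f.Seek(0, io.SeekEnd)
        if err != nil {
            panic(err)
        }
        lastLBA := uint64(size)/sectorSize - 1
        fmt.Printf("backup GPT header at LBA %d, last LBA %d\n", backupLBA, lastLBA)
        if backupLBA != lastLBA {
            fmt.Println("alternate GPT header is not at the end of the disk")
        }
    }
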
Sep 5 23:52:53.737459 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 7395d4d5-ecb1-4acb-b5a4-3e846eddb858
Sep 5 23:52:53.737542 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm
Sep 5 23:52:53.737574 kernel: BTRFS info (device nvme0n1p6): using free space tree
Sep 5 23:52:53.756205 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations
Sep 5 23:52:53.774229 systemd[1]: mnt-oem.mount: Deactivated successfully.
Sep 5 23:52:53.778219 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 7395d4d5-ecb1-4acb-b5a4-3e846eddb858
Sep 5 23:52:53.787561 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 5 23:52:53.804628 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 5 23:52:53.893970 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 5 23:52:53.905537 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 5 23:52:53.970950 systemd-networkd[1200]: lo: Link UP
Sep 5 23:52:53.970964 systemd-networkd[1200]: lo: Gained carrier
Sep 5 23:52:53.974812 systemd-networkd[1200]: Enumeration completed
Sep 5 23:52:53.974962 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 5 23:52:53.975932 systemd-networkd[1200]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 5 23:52:53.975938 systemd-networkd[1200]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 5 23:52:53.977661 systemd[1]: Reached target network.target - Network.
Sep 5 23:52:53.996035 systemd-networkd[1200]: eth0: Link UP
Sep 5 23:52:53.996048 systemd-networkd[1200]: eth0: Gained carrier
Sep 5 23:52:53.996066 systemd-networkd[1200]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 5 23:52:54.025249 systemd-networkd[1200]: eth0: DHCPv4 address 172.31.18.129/20, gateway 172.31.16.1 acquired from 172.31.16.1
Sep 5 23:52:54.261431 ignition[1123]: Ignition 2.19.0
Sep 5 23:52:54.261451 ignition[1123]: Stage: fetch-offline
Sep 5 23:52:54.263079 ignition[1123]: no configs at "/usr/lib/ignition/base.d"
Sep 5 23:52:54.268732 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 5 23:52:54.263104 ignition[1123]: no config dir at "/usr/lib/ignition/base.platform.d/aws"
Sep 5 23:52:54.264035 ignition[1123]: Ignition finished successfully
Sep 5 23:52:54.282658 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Sep 5 23:52:54.311513 ignition[1211]: Ignition 2.19.0
Sep 5 23:52:54.311540 ignition[1211]: Stage: fetch
Sep 5 23:52:54.313442 ignition[1211]: no configs at "/usr/lib/ignition/base.d"
Sep 5 23:52:54.313470 ignition[1211]: no config dir at "/usr/lib/ignition/base.platform.d/aws"
Sep 5 23:52:54.314685 ignition[1211]: PUT http://169.254.169.254/latest/api/token: attempt #1
Sep 5 23:52:54.330109 ignition[1211]: PUT result: OK
Sep 5 23:52:54.333374 ignition[1211]: parsed url from cmdline: ""
Sep 5 23:52:54.333526 ignition[1211]: no config URL provided
Sep 5 23:52:54.333546 ignition[1211]: reading system config file "/usr/lib/ignition/user.ign"
Sep 5 23:52:54.333572 ignition[1211]: no config at "/usr/lib/ignition/user.ign"
Sep 5 23:52:54.333604 ignition[1211]: PUT http://169.254.169.254/latest/api/token: attempt #1
Sep 5 23:52:54.339821 ignition[1211]: PUT result: OK
Sep 5 23:52:54.339927 ignition[1211]: GET http://169.254.169.254/2019-10-01/user-data: attempt #1
Sep 5 23:52:54.342678 ignition[1211]: GET result: OK
Sep 5 23:52:54.342852 ignition[1211]: parsing config with SHA512: 65a706c082e59ef121428f1201e551dcf9ac018823cedfc3bac9c3cd2402c6366f5fd608b53738539fe46d4524ed2cbbc89451905084d638a0cfa3a5f4a11917
Sep 5 23:52:54.358092 unknown[1211]: fetched base config from "system"
Sep 5 23:52:54.359607 unknown[1211]: fetched base config from "system"
Sep 5 23:52:54.359623 unknown[1211]: fetched user config from "aws"
Sep 5 23:52:54.360587 ignition[1211]: fetch: fetch complete
Sep 5 23:52:54.360599 ignition[1211]: fetch: fetch passed
Sep 5 23:52:54.360699 ignition[1211]: Ignition finished successfully
Sep 5 23:52:54.368980 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Sep 5 23:52:54.383476 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 5 23:52:54.413051 ignition[1217]: Ignition 2.19.0
Sep 5 23:52:54.413591 ignition[1217]: Stage: kargs
Sep 5 23:52:54.414283 ignition[1217]: no configs at "/usr/lib/ignition/base.d"
Sep 5 23:52:54.414308 ignition[1217]: no config dir at "/usr/lib/ignition/base.platform.d/aws"
Sep 5 23:52:54.414490 ignition[1217]: PUT http://169.254.169.254/latest/api/token: attempt #1
Sep 5 23:52:54.420480 ignition[1217]: PUT result: OK
Sep 5 23:52:54.427971 ignition[1217]: kargs: kargs passed
Sep 5 23:52:54.428073 ignition[1217]: Ignition finished successfully
Sep 5 23:52:54.432434 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 5 23:52:54.444464 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 5 23:52:54.470880 ignition[1223]: Ignition 2.19.0
Sep 5 23:52:54.470902 ignition[1223]: Stage: disks
Sep 5 23:52:54.475127 ignition[1223]: no configs at "/usr/lib/ignition/base.d"
Sep 5 23:52:54.475194 ignition[1223]: no config dir at "/usr/lib/ignition/base.platform.d/aws"
Sep 5 23:52:54.475356 ignition[1223]: PUT http://169.254.169.254/latest/api/token: attempt #1
Sep 5 23:52:54.480223 ignition[1223]: PUT result: OK
Sep 5 23:52:54.487817 ignition[1223]: disks: disks passed
Sep 5 23:52:54.488127 ignition[1223]: Ignition finished successfully
Sep 5 23:52:54.494552 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 5 23:52:54.495060 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 5 23:52:54.501775 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 5 23:52:54.504541 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 5 23:52:54.506753 systemd[1]: Reached target sysinit.target - System Initialization.
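
[Editor's note: the fetch stage above records Ignition's IMDSv2 exchange with the EC2 instance metadata service: a PUT to http://169.254.169.254/latest/api/token to obtain a session token, then a GET of http://169.254.169.254/2019-10-01/user-data presenting that token. A minimal Go sketch of the same exchange follows, assuming the standard IMDSv2 headers and an arbitrary 21600-second token TTL; it is an illustration, not Ignition's actual implementation.]

    // imdsfetch.go - minimal sketch of the IMDSv2 token/user-data exchange (illustrative only).
    package main

    import (
        "fmt"
        "io"
        "net/http"
        "strings"
    )

    func main() {
        const imds = "http://169.254.169.254"
        client := &http.Client{}

        // Step 1: PUT /latest/api/token, matching the "PUT ... attempt #1" entries above.
        req, err := http.NewRequest(http.MethodPut, imds+"/latest/api/token", strings.NewReader(""))
        if err != nil {
            panic(err)
        }
        req.Header.Set("X-aws-ec2-metadata-token-ttl-seconds", "21600") // TTL is an assumed value
        resp, err := client.Do(req)
        if err != nil {
            panic(err)
        }
        token, _ := io.ReadAll(resp.Body)
        resp.Body.Close()

        // Step 2: GET the user data with the session token, matching the
        // "GET http://169.254.169.254/2019-10-01/user-data" entry above.
        req, err = http.NewRequest(http.MethodGet, imds+"/2019-10-01/user-data", nil)
        if err != nil {
            panic(err)
        }
        req.Header.Set("X-aws-ec2-metadata-token", string(token))
        resp, err = client.Do(req)
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()
        userData, _ := io.ReadAll(resp.Body)
        fmt.Printf("fetched %d bytes of user data\n", len(userData))
    }
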
Sep 5 23:52:54.508978 systemd[1]: Reached target basic.target - Basic System.
Sep 5 23:52:54.525123 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 5 23:52:54.582365 systemd-fsck[1231]: ROOT: clean, 14/553520 files, 52654/553472 blocks
Sep 5 23:52:54.588004 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 5 23:52:54.600669 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 5 23:52:54.687234 kernel: EXT4-fs (nvme0n1p9): mounted filesystem 72e55cb0-8368-4871-a3a0-8637412e72e8 r/w with ordered data mode. Quota mode: none.
Sep 5 23:52:54.688787 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 5 23:52:54.692813 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 5 23:52:54.708370 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 5 23:52:54.712436 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 5 23:52:54.722713 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Sep 5 23:52:54.722799 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 5 23:52:54.722849 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 5 23:52:54.743257 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 5 23:52:54.749073 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/nvme0n1p6 scanned by mount (1250)
Sep 5 23:52:54.753873 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 7395d4d5-ecb1-4acb-b5a4-3e846eddb858
Sep 5 23:52:54.753951 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm
Sep 5 23:52:54.753979 kernel: BTRFS info (device nvme0n1p6): using free space tree
Sep 5 23:52:54.756469 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 5 23:52:54.776207 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations
Sep 5 23:52:54.778913 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 5 23:52:55.109341 initrd-setup-root[1274]: cut: /sysroot/etc/passwd: No such file or directory
Sep 5 23:52:55.130522 initrd-setup-root[1281]: cut: /sysroot/etc/group: No such file or directory
Sep 5 23:52:55.139818 initrd-setup-root[1288]: cut: /sysroot/etc/shadow: No such file or directory
Sep 5 23:52:55.148654 initrd-setup-root[1295]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 5 23:52:55.557242 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 5 23:52:55.568513 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 5 23:52:55.575420 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 5 23:52:55.592564 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 5 23:52:55.596593 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 7395d4d5-ecb1-4acb-b5a4-3e846eddb858
Sep 5 23:52:55.642312 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 5 23:52:55.650543 ignition[1363]: INFO : Ignition 2.19.0
Sep 5 23:52:55.650543 ignition[1363]: INFO : Stage: mount
Sep 5 23:52:55.658617 ignition[1363]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 5 23:52:55.658617 ignition[1363]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws"
Sep 5 23:52:55.658617 ignition[1363]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1
Sep 5 23:52:55.658617 ignition[1363]: INFO : PUT result: OK
Sep 5 23:52:55.673338 ignition[1363]: INFO : mount: mount passed
Sep 5 23:52:55.673338 ignition[1363]: INFO : Ignition finished successfully
Sep 5 23:52:55.668242 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 5 23:52:55.690514 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 5 23:52:55.709299 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 5 23:52:55.740190 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/nvme0n1p6 scanned by mount (1375)
Sep 5 23:52:55.745194 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 7395d4d5-ecb1-4acb-b5a4-3e846eddb858
Sep 5 23:52:55.745242 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm
Sep 5 23:52:55.745276 kernel: BTRFS info (device nvme0n1p6): using free space tree
Sep 5 23:52:55.751207 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations
Sep 5 23:52:55.754516 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 5 23:52:55.786415 systemd-networkd[1200]: eth0: Gained IPv6LL
Sep 5 23:52:55.794982 ignition[1392]: INFO : Ignition 2.19.0
Sep 5 23:52:55.797275 ignition[1392]: INFO : Stage: files
Sep 5 23:52:55.799401 ignition[1392]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 5 23:52:55.801706 ignition[1392]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws"
Sep 5 23:52:55.804353 ignition[1392]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1
Sep 5 23:52:55.808739 ignition[1392]: INFO : PUT result: OK
Sep 5 23:52:55.816341 ignition[1392]: DEBUG : files: compiled without relabeling support, skipping
Sep 5 23:52:55.821893 ignition[1392]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 5 23:52:55.821893 ignition[1392]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 5 23:52:55.859127 ignition[1392]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 5 23:52:55.863909 ignition[1392]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 5 23:52:55.867213 unknown[1392]: wrote ssh authorized keys file for user: core
Sep 5 23:52:55.870150 ignition[1392]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 5 23:52:55.873030 ignition[1392]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Sep 5 23:52:55.873030 ignition[1392]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1
Sep 5 23:52:56.014129 ignition[1392]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 5 23:52:56.359896 ignition[1392]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Sep 5 23:52:56.364067 ignition[1392]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 5 23:52:56.368421 ignition[1392]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 5 23:52:56.372343 ignition[1392]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 5 23:52:56.376091 ignition[1392]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 5 23:52:56.376091 ignition[1392]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 5 23:52:56.383606 ignition[1392]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 5 23:52:56.383606 ignition[1392]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 5 23:52:56.391398 ignition[1392]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 5 23:52:56.395266 ignition[1392]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 5 23:52:56.399268 ignition[1392]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 5 23:52:56.403117 ignition[1392]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Sep 5 23:52:56.408809 ignition[1392]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Sep 5 23:52:56.414314 ignition[1392]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Sep 5 23:52:56.414314 ignition[1392]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-arm64.raw: attempt #1
Sep 5 23:52:56.866781 ignition[1392]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 5 23:52:57.266352 ignition[1392]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Sep 5 23:52:57.271194 ignition[1392]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 5 23:52:57.271194 ignition[1392]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 5 23:52:57.271194 ignition[1392]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 5 23:52:57.271194 ignition[1392]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 5 23:52:57.271194 ignition[1392]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Sep 5 23:52:57.271194 ignition[1392]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Sep 5 23:52:57.271194 ignition[1392]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 5 23:52:57.271194 ignition[1392]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 5 23:52:57.271194 ignition[1392]: INFO : files: files passed
Sep 5 23:52:57.271194 ignition[1392]: INFO : Ignition finished successfully
Sep 5 23:52:57.292757 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 5 23:52:57.316509 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 5 23:52:57.326859 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 5 23:52:57.335834 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 5 23:52:57.337601 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 5 23:52:57.373323 initrd-setup-root-after-ignition[1421]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 5 23:52:57.373323 initrd-setup-root-after-ignition[1421]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 5 23:52:57.380923 initrd-setup-root-after-ignition[1425]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 5 23:52:57.388963 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 5 23:52:57.392183 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 5 23:52:57.411502 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 5 23:52:57.470497 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 5 23:52:57.470687 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 5 23:52:57.474153 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 5 23:52:57.476913 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 5 23:52:57.479677 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 5 23:52:57.481876 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 5 23:52:57.529680 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 5 23:52:57.540474 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 5 23:52:57.565519 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 5 23:52:57.565878 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 5 23:52:57.566762 systemd[1]: Stopped target timers.target - Timer Units.
Sep 5 23:52:57.567528 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 5 23:52:57.567757 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 5 23:52:57.569184 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 5 23:52:57.570010 systemd[1]: Stopped target basic.target - Basic System.
Sep 5 23:52:57.571084 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 5 23:52:57.571812 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 5 23:52:57.572201 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 5 23:52:57.572545 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 5 23:52:57.572897 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 5 23:52:57.573304 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 5 23:52:57.573776 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 5 23:52:57.574588 systemd[1]: Stopped target swap.target - Swaps. Sep 5 23:52:57.575356 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 5 23:52:57.575563 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 5 23:52:57.577010 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 5 23:52:57.577891 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 5 23:52:57.578645 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 5 23:52:57.611195 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 5 23:52:57.643932 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 5 23:52:57.644177 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 5 23:52:57.665550 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 5 23:52:57.665832 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 5 23:52:57.671027 systemd[1]: ignition-files.service: Deactivated successfully. Sep 5 23:52:57.671256 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 5 23:52:57.686563 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 5 23:52:57.693865 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 5 23:52:57.699333 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 5 23:52:57.699640 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 5 23:52:57.708504 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 5 23:52:57.710955 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 5 23:52:57.732770 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 5 23:52:57.734905 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 5 23:52:57.744790 ignition[1445]: INFO : Ignition 2.19.0 Sep 5 23:52:57.747360 ignition[1445]: INFO : Stage: umount Sep 5 23:52:57.751005 ignition[1445]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 5 23:52:57.751005 ignition[1445]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 5 23:52:57.756591 ignition[1445]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 5 23:52:57.759739 ignition[1445]: INFO : PUT result: OK Sep 5 23:52:57.766046 ignition[1445]: INFO : umount: umount passed Sep 5 23:52:57.769887 ignition[1445]: INFO : Ignition finished successfully Sep 5 23:52:57.773825 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 5 23:52:57.777259 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 5 23:52:57.782631 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 5 23:52:57.783782 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 5 23:52:57.783927 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 5 23:52:57.787356 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 5 23:52:57.787453 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 5 23:52:57.799064 systemd[1]: ignition-fetch.service: Deactivated successfully. Sep 5 23:52:57.799188 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Sep 5 23:52:57.801611 systemd[1]: Stopped target network.target - Network. Sep 5 23:52:57.803621 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. 
Sep 5 23:52:57.803713 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 5 23:52:57.806374 systemd[1]: Stopped target paths.target - Path Units. Sep 5 23:52:57.810270 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 5 23:52:57.818799 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 5 23:52:57.821488 systemd[1]: Stopped target slices.target - Slice Units. Sep 5 23:52:57.823446 systemd[1]: Stopped target sockets.target - Socket Units. Sep 5 23:52:57.825711 systemd[1]: iscsid.socket: Deactivated successfully. Sep 5 23:52:57.825792 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 5 23:52:57.835354 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 5 23:52:57.835433 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 5 23:52:57.838478 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 5 23:52:57.838567 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 5 23:52:57.840800 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 5 23:52:57.840880 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 5 23:52:57.843438 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 5 23:52:57.848824 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 5 23:52:57.853231 systemd-networkd[1200]: eth0: DHCPv6 lease lost Sep 5 23:52:57.853527 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 5 23:52:57.853722 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 5 23:52:57.880069 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 5 23:52:57.880310 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 5 23:52:57.889262 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 5 23:52:57.889438 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 5 23:52:57.901311 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 5 23:52:57.901446 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 5 23:52:57.913508 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 5 23:52:57.915528 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 5 23:52:57.915645 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 5 23:52:57.919207 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 5 23:52:57.923918 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 5 23:52:57.924132 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 5 23:52:57.944820 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 5 23:52:57.945129 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 5 23:52:57.949538 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 5 23:52:57.949642 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 5 23:52:57.952141 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 5 23:52:57.952246 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 5 23:52:57.979211 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 5 23:52:57.980087 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. 
Sep 5 23:52:57.989700 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 5 23:52:57.989832 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 5 23:52:57.994395 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 5 23:52:57.994467 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 5 23:52:57.996847 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 5 23:52:57.996938 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 5 23:52:57.999537 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 5 23:52:57.999628 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 5 23:52:58.002114 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 5 23:52:58.002220 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 5 23:52:58.018471 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 5 23:52:58.037535 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 5 23:52:58.037649 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 5 23:52:58.043401 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 5 23:52:58.043492 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 5 23:52:58.048941 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 5 23:52:58.049050 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 5 23:52:58.054034 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 5 23:52:58.057040 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 5 23:52:58.074768 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 5 23:52:58.075135 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 5 23:52:58.083375 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 5 23:52:58.083546 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 5 23:52:58.088343 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 5 23:52:58.107507 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 5 23:52:58.123432 systemd[1]: Switching root. Sep 5 23:52:58.174245 systemd-journald[250]: Journal stopped Sep 5 23:53:00.568214 systemd-journald[250]: Received SIGTERM from PID 1 (systemd). Sep 5 23:53:00.568344 kernel: SELinux: policy capability network_peer_controls=1 Sep 5 23:53:00.568388 kernel: SELinux: policy capability open_perms=1 Sep 5 23:53:00.568419 kernel: SELinux: policy capability extended_socket_class=1 Sep 5 23:53:00.568451 kernel: SELinux: policy capability always_check_network=0 Sep 5 23:53:00.568491 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 5 23:53:00.568530 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 5 23:53:00.568567 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 5 23:53:00.568599 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 5 23:53:00.568629 kernel: audit: type=1403 audit(1757116378.657:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 5 23:53:00.568668 systemd[1]: Successfully loaded SELinux policy in 62.281ms. Sep 5 23:53:00.568714 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 23.036ms. 
Sep 5 23:53:00.568750 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Sep 5 23:53:00.568783 systemd[1]: Detected virtualization amazon. Sep 5 23:53:00.568815 systemd[1]: Detected architecture arm64. Sep 5 23:53:00.568880 systemd[1]: Detected first boot. Sep 5 23:53:00.568915 systemd[1]: Initializing machine ID from VM UUID. Sep 5 23:53:00.568948 zram_generator::config[1488]: No configuration found. Sep 5 23:53:00.568984 systemd[1]: Populated /etc with preset unit settings. Sep 5 23:53:00.569017 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 5 23:53:00.569081 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 5 23:53:00.569118 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 5 23:53:00.569149 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 5 23:53:00.569253 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 5 23:53:00.569297 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 5 23:53:00.569355 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 5 23:53:00.569395 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 5 23:53:00.569431 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 5 23:53:00.569465 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 5 23:53:00.569495 systemd[1]: Created slice user.slice - User and Session Slice. Sep 5 23:53:00.569537 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 5 23:53:00.569568 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 5 23:53:00.569605 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 5 23:53:00.569635 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 5 23:53:00.569665 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 5 23:53:00.569697 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 5 23:53:00.569729 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 5 23:53:00.569760 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 5 23:53:00.569793 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 5 23:53:00.569822 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 5 23:53:00.569852 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 5 23:53:00.569887 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 5 23:53:00.569924 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 5 23:53:00.569956 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 5 23:53:00.569988 systemd[1]: Reached target slices.target - Slice Units. Sep 5 23:53:00.570019 systemd[1]: Reached target swap.target - Swaps. 
Sep 5 23:53:00.570048 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 5 23:53:00.570078 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 5 23:53:00.570109 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 5 23:53:00.570145 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 5 23:53:00.570212 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 5 23:53:00.570275 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 5 23:53:00.570308 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 5 23:53:00.570337 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 5 23:53:00.570370 systemd[1]: Mounting media.mount - External Media Directory... Sep 5 23:53:00.570399 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 5 23:53:00.570432 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 5 23:53:00.570462 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 5 23:53:00.570502 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 5 23:53:00.570533 systemd[1]: Reached target machines.target - Containers. Sep 5 23:53:00.572872 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 5 23:53:00.572908 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 5 23:53:00.572939 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 5 23:53:00.572977 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 5 23:53:00.573007 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 5 23:53:00.573036 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 5 23:53:00.573070 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 5 23:53:00.573100 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 5 23:53:00.573129 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 5 23:53:00.573186 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 5 23:53:00.573221 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 5 23:53:00.573251 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 5 23:53:00.573283 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 5 23:53:00.573314 systemd[1]: Stopped systemd-fsck-usr.service. Sep 5 23:53:00.573365 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 5 23:53:00.573403 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 5 23:53:00.573433 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 5 23:53:00.573465 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 5 23:53:00.573497 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... 
Sep 5 23:53:00.573527 kernel: fuse: init (API version 7.39) Sep 5 23:53:00.573556 systemd[1]: verity-setup.service: Deactivated successfully. Sep 5 23:53:00.573587 systemd[1]: Stopped verity-setup.service. Sep 5 23:53:00.573620 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 5 23:53:00.573649 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 5 23:53:00.573684 systemd[1]: Mounted media.mount - External Media Directory. Sep 5 23:53:00.573713 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 5 23:53:00.573743 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 5 23:53:00.573774 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 5 23:53:00.573805 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 5 23:53:00.573838 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 5 23:53:00.573867 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 5 23:53:00.573896 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 5 23:53:00.573927 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 5 23:53:00.573956 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 5 23:53:00.573988 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 5 23:53:00.574017 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 5 23:53:00.574094 systemd-journald[1566]: Collecting audit messages is disabled. Sep 5 23:53:00.574147 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 5 23:53:00.574198 kernel: loop: module loaded Sep 5 23:53:00.574248 systemd-journald[1566]: Journal started Sep 5 23:53:00.574302 systemd-journald[1566]: Runtime Journal (/run/log/journal/ec249fa88716fdd38ab66e9751d48a6d) is 8.0M, max 75.3M, 67.3M free. Sep 5 23:52:59.917927 systemd[1]: Queued start job for default target multi-user.target. Sep 5 23:53:00.022939 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6. Sep 5 23:53:00.023746 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 5 23:53:00.581708 systemd[1]: Started systemd-journald.service - Journal Service. Sep 5 23:53:00.580845 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 5 23:53:00.582323 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 5 23:53:00.587245 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 5 23:53:00.590894 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 5 23:53:00.595280 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 5 23:53:00.636484 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 5 23:53:00.642327 kernel: ACPI: bus type drm_connector registered Sep 5 23:53:00.646561 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 5 23:53:00.660477 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 5 23:53:00.663387 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 5 23:53:00.663444 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 5 23:53:00.669938 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). 
Sep 5 23:53:00.684651 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 5 23:53:00.692133 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 5 23:53:00.694720 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 5 23:53:00.709262 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 5 23:53:00.721517 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 5 23:53:00.724819 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 5 23:53:00.731237 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 5 23:53:00.732901 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 5 23:53:00.739028 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 5 23:53:00.745399 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 5 23:53:00.768724 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 5 23:53:00.777283 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 5 23:53:00.780923 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 5 23:53:00.781601 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 5 23:53:00.795956 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 5 23:53:00.798875 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 5 23:53:00.801968 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 5 23:53:00.830388 systemd-journald[1566]: Time spent on flushing to /var/log/journal/ec249fa88716fdd38ab66e9751d48a6d is 177.370ms for 908 entries. Sep 5 23:53:00.830388 systemd-journald[1566]: System Journal (/var/log/journal/ec249fa88716fdd38ab66e9751d48a6d) is 8.0M, max 195.6M, 187.6M free. Sep 5 23:53:01.026888 systemd-journald[1566]: Received client request to flush runtime journal. Sep 5 23:53:01.027865 kernel: loop0: detected capacity change from 0 to 52536 Sep 5 23:53:01.027929 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 5 23:53:00.841276 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 5 23:53:00.844576 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 5 23:53:00.854564 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... Sep 5 23:53:00.923270 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 5 23:53:00.934123 systemd-tmpfiles[1616]: ACLs are not supported, ignoring. Sep 5 23:53:00.934148 systemd-tmpfiles[1616]: ACLs are not supported, ignoring. Sep 5 23:53:00.955650 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 5 23:53:00.972597 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 5 23:53:01.030562 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 5 23:53:01.036382 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. 
Sep 5 23:53:01.045033 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 5 23:53:01.076205 kernel: loop1: detected capacity change from 0 to 114328 Sep 5 23:53:01.120139 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 5 23:53:01.126234 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 5 23:53:01.137897 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 5 23:53:01.154445 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Sep 5 23:53:01.188345 udevadm[1640]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Sep 5 23:53:01.199207 kernel: loop2: detected capacity change from 0 to 203944 Sep 5 23:53:01.200849 systemd-tmpfiles[1639]: ACLs are not supported, ignoring. Sep 5 23:53:01.201491 systemd-tmpfiles[1639]: ACLs are not supported, ignoring. Sep 5 23:53:01.211257 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 5 23:53:01.473237 kernel: loop3: detected capacity change from 0 to 114432 Sep 5 23:53:01.577221 kernel: loop4: detected capacity change from 0 to 52536 Sep 5 23:53:01.598206 kernel: loop5: detected capacity change from 0 to 114328 Sep 5 23:53:01.616487 kernel: loop6: detected capacity change from 0 to 203944 Sep 5 23:53:01.646219 kernel: loop7: detected capacity change from 0 to 114432 Sep 5 23:53:01.663891 (sd-merge)[1646]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-ami'. Sep 5 23:53:01.665138 (sd-merge)[1646]: Merged extensions into '/usr'. Sep 5 23:53:01.673586 systemd[1]: Reloading requested from client PID 1615 ('systemd-sysext') (unit systemd-sysext.service)... Sep 5 23:53:01.673618 systemd[1]: Reloading... Sep 5 23:53:01.852864 zram_generator::config[1672]: No configuration found. Sep 5 23:53:02.183974 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 5 23:53:02.294641 systemd[1]: Reloading finished in 619 ms. Sep 5 23:53:02.335797 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 5 23:53:02.341568 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 5 23:53:02.359419 systemd[1]: Starting ensure-sysext.service... Sep 5 23:53:02.372365 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 5 23:53:02.377905 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 5 23:53:02.404324 systemd[1]: Reloading requested from client PID 1724 ('systemctl') (unit ensure-sysext.service)... Sep 5 23:53:02.404357 systemd[1]: Reloading... Sep 5 23:53:02.456313 systemd-tmpfiles[1725]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 5 23:53:02.458863 systemd-tmpfiles[1725]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 5 23:53:02.460964 systemd-tmpfiles[1725]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 5 23:53:02.461853 systemd-tmpfiles[1725]: ACLs are not supported, ignoring. Sep 5 23:53:02.462106 systemd-tmpfiles[1725]: ACLs are not supported, ignoring. 
Sep 5 23:53:02.472020 systemd-tmpfiles[1725]: Detected autofs mount point /boot during canonicalization of boot. Sep 5 23:53:02.472040 systemd-tmpfiles[1725]: Skipping /boot Sep 5 23:53:02.503270 systemd-tmpfiles[1725]: Detected autofs mount point /boot during canonicalization of boot. Sep 5 23:53:02.503438 systemd-tmpfiles[1725]: Skipping /boot Sep 5 23:53:02.534044 ldconfig[1610]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 5 23:53:02.564473 systemd-udevd[1726]: Using default interface naming scheme 'v255'. Sep 5 23:53:02.612214 zram_generator::config[1754]: No configuration found. Sep 5 23:53:02.791365 (udev-worker)[1768]: Network interface NamePolicy= disabled on kernel command line. Sep 5 23:53:03.000224 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 5 23:53:03.116222 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 37 scanned by (udev-worker) (1786) Sep 5 23:53:03.135924 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 5 23:53:03.137435 systemd[1]: Reloading finished in 732 ms. Sep 5 23:53:03.171692 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 5 23:53:03.175892 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 5 23:53:03.187417 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 5 23:53:03.247296 systemd[1]: Finished ensure-sysext.service. Sep 5 23:53:03.298631 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Sep 5 23:53:03.313529 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 5 23:53:03.316611 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 5 23:53:03.320651 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 5 23:53:03.327033 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 5 23:53:03.341635 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 5 23:53:03.350528 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 5 23:53:03.353132 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 5 23:53:03.358521 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 5 23:53:03.367003 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 5 23:53:03.376641 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 5 23:53:03.379053 systemd[1]: Reached target time-set.target - System Time Set. Sep 5 23:53:03.385534 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 5 23:53:03.398551 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 5 23:53:03.403133 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 5 23:53:03.406358 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 5 23:53:03.412524 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 5 23:53:03.413291 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. 
Sep 5 23:53:03.418731 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 5 23:53:03.450702 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 5 23:53:03.482549 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 5 23:53:03.482873 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 5 23:53:03.486614 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 5 23:53:03.495672 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. Sep 5 23:53:03.517875 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 5 23:53:03.520900 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 5 23:53:03.523253 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 5 23:53:03.542800 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 5 23:53:03.565972 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Sep 5 23:53:03.578567 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Sep 5 23:53:03.600367 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 5 23:53:03.621729 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 5 23:53:03.642698 augenrules[1961]: No rules Sep 5 23:53:03.646141 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 5 23:53:03.651030 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Sep 5 23:53:03.667383 lvm[1953]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Sep 5 23:53:03.691761 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 5 23:53:03.695551 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 5 23:53:03.704937 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 5 23:53:03.730877 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Sep 5 23:53:03.734452 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 5 23:53:03.743478 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Sep 5 23:53:03.751353 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 5 23:53:03.756263 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 5 23:53:03.776217 lvm[1973]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Sep 5 23:53:03.828789 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Sep 5 23:53:03.875466 systemd-networkd[1930]: lo: Link UP Sep 5 23:53:03.875491 systemd-networkd[1930]: lo: Gained carrier Sep 5 23:53:03.878217 systemd-networkd[1930]: Enumeration completed Sep 5 23:53:03.878424 systemd[1]: Started systemd-networkd.service - Network Configuration. 
Sep 5 23:53:03.883324 systemd-networkd[1930]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 5 23:53:03.883336 systemd-networkd[1930]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 5 23:53:03.887293 systemd-networkd[1930]: eth0: Link UP Sep 5 23:53:03.887667 systemd-networkd[1930]: eth0: Gained carrier Sep 5 23:53:03.887714 systemd-networkd[1930]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 5 23:53:03.889649 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 5 23:53:03.899333 systemd-networkd[1930]: eth0: DHCPv4 address 172.31.18.129/20, gateway 172.31.16.1 acquired from 172.31.16.1 Sep 5 23:53:03.904058 systemd-resolved[1931]: Positive Trust Anchors: Sep 5 23:53:03.904418 systemd-resolved[1931]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 5 23:53:03.904482 systemd-resolved[1931]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 5 23:53:03.918219 systemd-resolved[1931]: Defaulting to hostname 'linux'. Sep 5 23:53:03.921953 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 5 23:53:03.924618 systemd[1]: Reached target network.target - Network. Sep 5 23:53:03.926647 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 5 23:53:03.929330 systemd[1]: Reached target sysinit.target - System Initialization. Sep 5 23:53:03.931811 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 5 23:53:03.934534 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 5 23:53:03.937715 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 5 23:53:03.940330 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 5 23:53:03.943080 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 5 23:53:03.945872 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 5 23:53:03.945921 systemd[1]: Reached target paths.target - Path Units. Sep 5 23:53:03.948012 systemd[1]: Reached target timers.target - Timer Units. Sep 5 23:53:03.951544 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 5 23:53:03.956664 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 5 23:53:03.966486 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 5 23:53:03.970083 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 5 23:53:03.972654 systemd[1]: Reached target sockets.target - Socket Units. Sep 5 23:53:03.974852 systemd[1]: Reached target basic.target - Basic System. 
Sep 5 23:53:03.976978 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 5 23:53:03.977036 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 5 23:53:03.989969 systemd[1]: Starting containerd.service - containerd container runtime... Sep 5 23:53:03.996714 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 5 23:53:04.001586 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 5 23:53:04.015432 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 5 23:53:04.020515 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 5 23:53:04.022874 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 5 23:53:04.028572 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 5 23:53:04.036721 systemd[1]: Started ntpd.service - Network Time Service. Sep 5 23:53:04.051363 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 5 23:53:04.060144 systemd[1]: Starting setup-oem.service - Setup OEM... Sep 5 23:53:04.069536 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 5 23:53:04.075512 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 5 23:53:04.094405 jq[1989]: false Sep 5 23:53:04.087422 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 5 23:53:04.092505 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 5 23:53:04.093427 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 5 23:53:04.098516 systemd[1]: Starting update-engine.service - Update Engine... Sep 5 23:53:04.107136 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 5 23:53:04.118896 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 5 23:53:04.119262 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 5 23:53:04.242581 jq[1999]: true Sep 5 23:53:04.243941 dbus-daemon[1988]: [system] SELinux support is enabled Sep 5 23:53:04.245154 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 5 23:53:04.256127 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 5 23:53:04.256274 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 5 23:53:04.260120 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 5 23:53:04.260185 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 5 23:53:04.263801 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 5 23:53:04.264220 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
Sep 5 23:53:04.283452 tar[2007]: linux-arm64/helm Sep 5 23:53:04.282665 dbus-daemon[1988]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.0' (uid=244 pid=1930 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Sep 5 23:53:04.284056 dbus-daemon[1988]: [system] Successfully activated service 'org.freedesktop.systemd1' Sep 5 23:53:04.292514 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Sep 5 23:53:04.318067 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 5 23:53:04.333968 ntpd[1992]: ntpd 4.2.8p17@1.4004-o Fri Sep 5 21:57:21 UTC 2025 (1): Starting Sep 5 23:53:04.334030 ntpd[1992]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Sep 5 23:53:04.334537 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: ntpd 4.2.8p17@1.4004-o Fri Sep 5 21:57:21 UTC 2025 (1): Starting Sep 5 23:53:04.334537 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Sep 5 23:53:04.334537 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: ---------------------------------------------------- Sep 5 23:53:04.334537 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: ntp-4 is maintained by Network Time Foundation, Sep 5 23:53:04.334537 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Sep 5 23:53:04.334537 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: corporation. Support and training for ntp-4 are Sep 5 23:53:04.334537 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: available at https://www.nwtime.org/support Sep 5 23:53:04.334537 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: ---------------------------------------------------- Sep 5 23:53:04.334052 ntpd[1992]: ---------------------------------------------------- Sep 5 23:53:04.334071 ntpd[1992]: ntp-4 is maintained by Network Time Foundation, Sep 5 23:53:04.334090 ntpd[1992]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Sep 5 23:53:04.334108 ntpd[1992]: corporation. 
Support and training for ntp-4 are Sep 5 23:53:04.334127 ntpd[1992]: available at https://www.nwtime.org/support Sep 5 23:53:04.334145 ntpd[1992]: ---------------------------------------------------- Sep 5 23:53:04.351054 ntpd[1992]: proto: precision = 0.108 usec (-23) Sep 5 23:53:04.355484 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: proto: precision = 0.108 usec (-23) Sep 5 23:53:04.355484 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: basedate set to 2025-08-24 Sep 5 23:53:04.355484 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: gps base set to 2025-08-24 (week 2381) Sep 5 23:53:04.352576 ntpd[1992]: basedate set to 2025-08-24 Sep 5 23:53:04.352604 ntpd[1992]: gps base set to 2025-08-24 (week 2381) Sep 5 23:53:04.357106 ntpd[1992]: Listen and drop on 0 v6wildcard [::]:123 Sep 5 23:53:04.366853 (ntainerd)[2021]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 5 23:53:04.368978 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: Listen and drop on 0 v6wildcard [::]:123 Sep 5 23:53:04.368978 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Sep 5 23:53:04.368978 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: Listen normally on 2 lo 127.0.0.1:123 Sep 5 23:53:04.368978 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: Listen normally on 3 eth0 172.31.18.129:123 Sep 5 23:53:04.368978 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: Listen normally on 4 lo [::1]:123 Sep 5 23:53:04.368978 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: bind(21) AF_INET6 fe80::467:1dff:feb6:7c55%2#123 flags 0x11 failed: Cannot assign requested address Sep 5 23:53:04.368978 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: unable to create socket on eth0 (5) for fe80::467:1dff:feb6:7c55%2#123 Sep 5 23:53:04.368978 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: failed to init interface for address fe80::467:1dff:feb6:7c55%2 Sep 5 23:53:04.368978 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: Listening on routing socket on fd #21 for interface updates Sep 5 23:53:04.360304 ntpd[1992]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Sep 5 23:53:04.360600 ntpd[1992]: Listen normally on 2 lo 127.0.0.1:123 Sep 5 23:53:04.371468 extend-filesystems[1990]: Found loop4 Sep 5 23:53:04.371468 extend-filesystems[1990]: Found loop5 Sep 5 23:53:04.371468 extend-filesystems[1990]: Found loop6 Sep 5 23:53:04.371468 extend-filesystems[1990]: Found loop7 Sep 5 23:53:04.371468 extend-filesystems[1990]: Found nvme0n1 Sep 5 23:53:04.371468 extend-filesystems[1990]: Found nvme0n1p1 Sep 5 23:53:04.371468 extend-filesystems[1990]: Found nvme0n1p2 Sep 5 23:53:04.371468 extend-filesystems[1990]: Found nvme0n1p3 Sep 5 23:53:04.371468 extend-filesystems[1990]: Found usr Sep 5 23:53:04.371468 extend-filesystems[1990]: Found nvme0n1p4 Sep 5 23:53:04.371468 extend-filesystems[1990]: Found nvme0n1p6 Sep 5 23:53:04.371468 extend-filesystems[1990]: Found nvme0n1p7 Sep 5 23:53:04.371468 extend-filesystems[1990]: Found nvme0n1p9 Sep 5 23:53:04.371468 extend-filesystems[1990]: Checking size of /dev/nvme0n1p9 Sep 5 23:53:04.360663 ntpd[1992]: Listen normally on 3 eth0 172.31.18.129:123 Sep 5 23:53:04.441633 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 5 23:53:04.441633 ntpd[1992]: 5 Sep 23:53:04 ntpd[1992]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 5 23:53:04.441722 jq[2020]: true Sep 5 23:53:04.528416 kernel: EXT4-fs (nvme0n1p9): resizing filesystem from 553472 to 1489915 blocks Sep 5 23:53:04.360729 ntpd[1992]: Listen normally on 4 lo [::1]:123 Sep 5 23:53:04.528632 
extend-filesystems[1990]: Resized partition /dev/nvme0n1p9 Sep 5 23:53:04.556718 update_engine[1998]: I20250905 23:53:04.477178 1998 main.cc:92] Flatcar Update Engine starting Sep 5 23:53:04.556718 update_engine[1998]: I20250905 23:53:04.493455 1998 update_check_scheduler.cc:74] Next update check in 2m41s Sep 5 23:53:04.505220 systemd[1]: motdgen.service: Deactivated successfully. Sep 5 23:53:04.360807 ntpd[1992]: bind(21) AF_INET6 fe80::467:1dff:feb6:7c55%2#123 flags 0x11 failed: Cannot assign requested address Sep 5 23:53:04.569829 extend-filesystems[2037]: resize2fs 1.47.1 (20-May-2024) Sep 5 23:53:04.505865 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 5 23:53:04.360845 ntpd[1992]: unable to create socket on eth0 (5) for fe80::467:1dff:feb6:7c55%2#123 Sep 5 23:53:04.524930 systemd-logind[1997]: Watching system buttons on /dev/input/event0 (Power Button) Sep 5 23:53:04.360874 ntpd[1992]: failed to init interface for address fe80::467:1dff:feb6:7c55%2 Sep 5 23:53:04.524965 systemd-logind[1997]: Watching system buttons on /dev/input/event1 (Sleep Button) Sep 5 23:53:04.360922 ntpd[1992]: Listening on routing socket on fd #21 for interface updates Sep 5 23:53:04.525396 systemd-logind[1997]: New seat seat0. Sep 5 23:53:04.383016 ntpd[1992]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 5 23:53:04.526975 systemd[1]: Started update-engine.service - Update Engine. Sep 5 23:53:04.383071 ntpd[1992]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 5 23:53:04.545402 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 5 23:53:04.556953 systemd[1]: Started systemd-logind.service - User Login Management. Sep 5 23:53:04.646009 systemd[1]: Finished setup-oem.service - Setup OEM. Sep 5 23:53:04.655485 kernel: EXT4-fs (nvme0n1p9): resized filesystem to 1489915 Sep 5 23:53:04.657350 coreos-metadata[1987]: Sep 05 23:53:04.657 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Sep 5 23:53:04.662884 coreos-metadata[1987]: Sep 05 23:53:04.662 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-id: Attempt #1 Sep 5 23:53:04.669691 extend-filesystems[2037]: Filesystem at /dev/nvme0n1p9 is mounted on /; on-line resizing required Sep 5 23:53:04.669691 extend-filesystems[2037]: old_desc_blocks = 1, new_desc_blocks = 1 Sep 5 23:53:04.669691 extend-filesystems[2037]: The filesystem on /dev/nvme0n1p9 is now 1489915 (4k) blocks long. 
Sep 5 23:53:04.690510 coreos-metadata[1987]: Sep 05 23:53:04.668 INFO Fetch successful Sep 5 23:53:04.690510 coreos-metadata[1987]: Sep 05 23:53:04.668 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-type: Attempt #1 Sep 5 23:53:04.690510 coreos-metadata[1987]: Sep 05 23:53:04.669 INFO Fetch successful Sep 5 23:53:04.690510 coreos-metadata[1987]: Sep 05 23:53:04.669 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/local-ipv4: Attempt #1 Sep 5 23:53:04.690510 coreos-metadata[1987]: Sep 05 23:53:04.674 INFO Fetch successful Sep 5 23:53:04.690510 coreos-metadata[1987]: Sep 05 23:53:04.675 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-ipv4: Attempt #1 Sep 5 23:53:04.690510 coreos-metadata[1987]: Sep 05 23:53:04.681 INFO Fetch successful Sep 5 23:53:04.690510 coreos-metadata[1987]: Sep 05 23:53:04.681 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/ipv6: Attempt #1 Sep 5 23:53:04.690510 coreos-metadata[1987]: Sep 05 23:53:04.684 INFO Fetch failed with 404: resource not found Sep 5 23:53:04.690510 coreos-metadata[1987]: Sep 05 23:53:04.684 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone: Attempt #1 Sep 5 23:53:04.690510 coreos-metadata[1987]: Sep 05 23:53:04.685 INFO Fetch successful Sep 5 23:53:04.690510 coreos-metadata[1987]: Sep 05 23:53:04.685 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone-id: Attempt #1 Sep 5 23:53:04.690510 coreos-metadata[1987]: Sep 05 23:53:04.690 INFO Fetch successful Sep 5 23:53:04.690510 coreos-metadata[1987]: Sep 05 23:53:04.690 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/hostname: Attempt #1 Sep 5 23:53:04.674149 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 5 23:53:04.691343 extend-filesystems[1990]: Resized filesystem in /dev/nvme0n1p9 Sep 5 23:53:04.675351 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 5 23:53:04.694478 coreos-metadata[1987]: Sep 05 23:53:04.694 INFO Fetch successful Sep 5 23:53:04.694478 coreos-metadata[1987]: Sep 05 23:53:04.694 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-hostname: Attempt #1 Sep 5 23:53:04.700201 coreos-metadata[1987]: Sep 05 23:53:04.696 INFO Fetch successful Sep 5 23:53:04.700201 coreos-metadata[1987]: Sep 05 23:53:04.697 INFO Fetching http://169.254.169.254/2021-01-03/dynamic/instance-identity/document: Attempt #1 Sep 5 23:53:04.700201 coreos-metadata[1987]: Sep 05 23:53:04.697 INFO Fetch successful Sep 5 23:53:04.720402 dbus-daemon[1988]: [system] Successfully activated service 'org.freedesktop.hostname1' Sep 5 23:53:04.720777 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Sep 5 23:53:04.725725 dbus-daemon[1988]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.5' (uid=0 pid=2022 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Sep 5 23:53:04.739273 systemd[1]: Starting polkit.service - Authorization Manager... Sep 5 23:53:04.785496 bash[2069]: Updated "/home/core/.ssh/authorized_keys" Sep 5 23:53:04.797418 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 5 23:53:04.814652 systemd[1]: Starting sshkeys.service... 
Sep 5 23:53:04.833290 polkitd[2068]: Started polkitd version 121 Sep 5 23:53:04.934468 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 37 scanned by (udev-worker) (1786) Sep 5 23:53:04.849718 polkitd[2068]: Loading rules from directory /etc/polkit-1/rules.d Sep 5 23:53:04.865433 systemd[1]: Started polkit.service - Authorization Manager. Sep 5 23:53:04.849822 polkitd[2068]: Loading rules from directory /usr/share/polkit-1/rules.d Sep 5 23:53:04.852307 polkitd[2068]: Finished loading, compiling and executing 2 rules Sep 5 23:53:04.865083 dbus-daemon[1988]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Sep 5 23:53:04.873263 polkitd[2068]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Sep 5 23:53:04.937377 systemd-networkd[1930]: eth0: Gained IPv6LL Sep 5 23:53:04.951620 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 5 23:53:04.955142 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 5 23:53:04.968034 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 5 23:53:04.971734 systemd[1]: Reached target network-online.target - Network is Online. Sep 5 23:53:04.985423 systemd-hostnamed[2022]: Hostname set to (transient) Sep 5 23:53:04.985914 systemd[1]: Started amazon-ssm-agent.service - amazon-ssm-agent. Sep 5 23:53:04.991848 systemd-resolved[1931]: System hostname changed to 'ip-172-31-18-129'. Sep 5 23:53:05.005685 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 5 23:53:05.014869 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 5 23:53:05.048333 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 5 23:53:05.063363 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 5 23:53:05.117880 amazon-ssm-agent[2112]: Initializing new seelog logger Sep 5 23:53:05.195363 amazon-ssm-agent[2112]: New Seelog Logger Creation Complete Sep 5 23:53:05.195363 amazon-ssm-agent[2112]: 2025/09/05 23:53:05 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 5 23:53:05.195363 amazon-ssm-agent[2112]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 5 23:53:05.195363 amazon-ssm-agent[2112]: 2025/09/05 23:53:05 processing appconfig overrides Sep 5 23:53:05.195363 amazon-ssm-agent[2112]: 2025-09-05 23:53:05 INFO Proxy environment variables: Sep 5 23:53:05.195363 amazon-ssm-agent[2112]: 2025/09/05 23:53:05 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 5 23:53:05.195363 amazon-ssm-agent[2112]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 5 23:53:05.195363 amazon-ssm-agent[2112]: 2025/09/05 23:53:05 processing appconfig overrides Sep 5 23:53:05.195363 amazon-ssm-agent[2112]: 2025/09/05 23:53:05 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 5 23:53:05.195363 amazon-ssm-agent[2112]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 5 23:53:05.195363 amazon-ssm-agent[2112]: 2025/09/05 23:53:05 processing appconfig overrides Sep 5 23:53:05.195363 amazon-ssm-agent[2112]: 2025/09/05 23:53:05 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 5 23:53:05.195363 amazon-ssm-agent[2112]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. 
Sep 5 23:53:05.195363 amazon-ssm-agent[2112]: 2025/09/05 23:53:05 processing appconfig overrides
Sep 5 23:53:05.222318 amazon-ssm-agent[2112]: 2025-09-05 23:53:05 INFO https_proxy:
Sep 5 23:53:05.331261 amazon-ssm-agent[2112]: 2025-09-05 23:53:05 INFO http_proxy:
Sep 5 23:53:05.338493 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Sep 5 23:53:05.430885 amazon-ssm-agent[2112]: 2025-09-05 23:53:05 INFO no_proxy:
Sep 5 23:53:05.437359 coreos-metadata[2127]: Sep 05 23:53:05.436 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1
Sep 5 23:53:05.439187 coreos-metadata[2127]: Sep 05 23:53:05.438 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys: Attempt #1
Sep 5 23:53:05.441587 coreos-metadata[2127]: Sep 05 23:53:05.441 INFO Fetch successful
Sep 5 23:53:05.441587 coreos-metadata[2127]: Sep 05 23:53:05.441 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys/0/openssh-key: Attempt #1
Sep 5 23:53:05.442084 coreos-metadata[2127]: Sep 05 23:53:05.441 INFO Fetch successful
Sep 5 23:53:05.446432 unknown[2127]: wrote ssh authorized keys file for user: core
Sep 5 23:53:05.538193 update-ssh-keys[2193]: Updated "/home/core/.ssh/authorized_keys"
Sep 5 23:53:05.538423 amazon-ssm-agent[2112]: 2025-09-05 23:53:05 INFO Checking if agent identity type OnPrem can be assumed
Sep 5 23:53:05.530055 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Sep 5 23:53:05.553421 systemd[1]: Finished sshkeys.service.
Sep 5 23:53:05.578387 locksmithd[2041]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 5 23:53:05.580019 containerd[2021]: time="2025-09-05T23:53:05.579876575Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21
Sep 5 23:53:05.632554 amazon-ssm-agent[2112]: 2025-09-05 23:53:05 INFO Checking if agent identity type EC2 can be assumed
Sep 5 23:53:05.733659 amazon-ssm-agent[2112]: 2025-09-05 23:53:05 INFO Agent will take identity from EC2
Sep 5 23:53:05.733778 containerd[2021]: time="2025-09-05T23:53:05.733556832Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Sep 5 23:53:05.743754 containerd[2021]: time="2025-09-05T23:53:05.743480472Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.103-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Sep 5 23:53:05.743754 containerd[2021]: time="2025-09-05T23:53:05.743549904Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Sep 5 23:53:05.743754 containerd[2021]: time="2025-09-05T23:53:05.743596680Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Sep 5 23:53:05.743971 containerd[2021]: time="2025-09-05T23:53:05.743895552Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Sep 5 23:53:05.743971 containerd[2021]: time="2025-09-05T23:53:05.743929320Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Sep 5 23:53:05.744261 containerd[2021]: time="2025-09-05T23:53:05.744044436Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Sep 5 23:53:05.744261 containerd[2021]: time="2025-09-05T23:53:05.744086424Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Sep 5 23:53:05.744878 containerd[2021]: time="2025-09-05T23:53:05.744783420Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Sep 5 23:53:05.747221 containerd[2021]: time="2025-09-05T23:53:05.744841572Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Sep 5 23:53:05.747307 containerd[2021]: time="2025-09-05T23:53:05.747235884Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Sep 5 23:53:05.747307 containerd[2021]: time="2025-09-05T23:53:05.747273060Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Sep 5 23:53:05.747529 containerd[2021]: time="2025-09-05T23:53:05.747485292Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Sep 5 23:53:05.747957 containerd[2021]: time="2025-09-05T23:53:05.747908328Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Sep 5 23:53:05.749212 containerd[2021]: time="2025-09-05T23:53:05.748137924Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Sep 5 23:53:05.749212 containerd[2021]: time="2025-09-05T23:53:05.748775004Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Sep 5 23:53:05.749212 containerd[2021]: time="2025-09-05T23:53:05.749003904Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Sep 5 23:53:05.749212 containerd[2021]: time="2025-09-05T23:53:05.749106888Z" level=info msg="metadata content store policy set" policy=shared
Sep 5 23:53:05.765839 containerd[2021]: time="2025-09-05T23:53:05.763290276Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Sep 5 23:53:05.765839 containerd[2021]: time="2025-09-05T23:53:05.763404348Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Sep 5 23:53:05.765839 containerd[2021]: time="2025-09-05T23:53:05.763441680Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Sep 5 23:53:05.765839 containerd[2021]: time="2025-09-05T23:53:05.763476444Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Sep 5 23:53:05.765839 containerd[2021]: time="2025-09-05T23:53:05.763511088Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Sep 5 23:53:05.765839 containerd[2021]: time="2025-09-05T23:53:05.763786872Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Sep 5 23:53:05.765839 containerd[2021]: time="2025-09-05T23:53:05.764522952Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Sep 5 23:53:05.765839 containerd[2021]: time="2025-09-05T23:53:05.764729280Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Sep 5 23:53:05.765839 containerd[2021]: time="2025-09-05T23:53:05.764763684Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Sep 5 23:53:05.765839 containerd[2021]: time="2025-09-05T23:53:05.764806992Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Sep 5 23:53:05.765839 containerd[2021]: time="2025-09-05T23:53:05.764841312Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Sep 5 23:53:05.765839 containerd[2021]: time="2025-09-05T23:53:05.764872380Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Sep 5 23:53:05.765839 containerd[2021]: time="2025-09-05T23:53:05.764901696Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Sep 5 23:53:05.765839 containerd[2021]: time="2025-09-05T23:53:05.764932764Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Sep 5 23:53:05.766507 containerd[2021]: time="2025-09-05T23:53:05.764964372Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Sep 5 23:53:05.766507 containerd[2021]: time="2025-09-05T23:53:05.765001512Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Sep 5 23:53:05.766507 containerd[2021]: time="2025-09-05T23:53:05.765041112Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Sep 5 23:53:05.766507 containerd[2021]: time="2025-09-05T23:53:05.765072216Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Sep 5 23:53:05.766507 containerd[2021]: time="2025-09-05T23:53:05.765110952Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Sep 5 23:53:05.766507 containerd[2021]: time="2025-09-05T23:53:05.765141792Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Sep 5 23:53:05.777348 containerd[2021]: time="2025-09-05T23:53:05.775300020Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Sep 5 23:53:05.777348 containerd[2021]: time="2025-09-05T23:53:05.775365204Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Sep 5 23:53:05.777348 containerd[2021]: time="2025-09-05T23:53:05.775398036Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Sep 5 23:53:05.777348 containerd[2021]: time="2025-09-05T23:53:05.775430916Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Sep 5 23:53:05.777348 containerd[2021]: time="2025-09-05T23:53:05.775467852Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Sep 5 23:53:05.777348 containerd[2021]: time="2025-09-05T23:53:05.775501272Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Sep 5 23:53:05.777348 containerd[2021]: time="2025-09-05T23:53:05.775534632Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Sep 5 23:53:05.777348 containerd[2021]: time="2025-09-05T23:53:05.775579032Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Sep 5 23:53:05.777348 containerd[2021]: time="2025-09-05T23:53:05.775614996Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Sep 5 23:53:05.777348 containerd[2021]: time="2025-09-05T23:53:05.775645236Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Sep 5 23:53:05.777348 containerd[2021]: time="2025-09-05T23:53:05.775676544Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Sep 5 23:53:05.777348 containerd[2021]: time="2025-09-05T23:53:05.775721136Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Sep 5 23:53:05.777348 containerd[2021]: time="2025-09-05T23:53:05.775792236Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Sep 5 23:53:05.777348 containerd[2021]: time="2025-09-05T23:53:05.775821924Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Sep 5 23:53:05.777348 containerd[2021]: time="2025-09-05T23:53:05.775849560Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Sep 5 23:53:05.778065 containerd[2021]: time="2025-09-05T23:53:05.776255172Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Sep 5 23:53:05.781662 containerd[2021]: time="2025-09-05T23:53:05.779374548Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Sep 5 23:53:05.781662 containerd[2021]: time="2025-09-05T23:53:05.779434092Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Sep 5 23:53:05.781662 containerd[2021]: time="2025-09-05T23:53:05.779468064Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Sep 5 23:53:05.781662 containerd[2021]: time="2025-09-05T23:53:05.779493840Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Sep 5 23:53:05.781662 containerd[2021]: time="2025-09-05T23:53:05.779527632Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Sep 5 23:53:05.781662 containerd[2021]: time="2025-09-05T23:53:05.779552412Z" level=info msg="NRI interface is disabled by configuration."
Sep 5 23:53:05.781662 containerd[2021]: time="2025-09-05T23:53:05.779577312Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Sep 5 23:53:05.783765 containerd[2021]: time="2025-09-05T23:53:05.783644160Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Sep 5 23:53:05.788254 containerd[2021]: time="2025-09-05T23:53:05.786757644Z" level=info msg="Connect containerd service"
Sep 5 23:53:05.788254 containerd[2021]: time="2025-09-05T23:53:05.786841764Z" level=info msg="using legacy CRI server"
Sep 5 23:53:05.788254 containerd[2021]: time="2025-09-05T23:53:05.786860004Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Sep 5 23:53:05.788254 containerd[2021]: time="2025-09-05T23:53:05.787038936Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Sep 5 23:53:05.793212 containerd[2021]: time="2025-09-05T23:53:05.792558468Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 5 23:53:05.793212 containerd[2021]: time="2025-09-05T23:53:05.792729480Z" level=info msg="Start subscribing containerd event"
Sep 5 23:53:05.793212 containerd[2021]: time="2025-09-05T23:53:05.792806772Z" level=info msg="Start recovering state"
Sep 5 23:53:05.793212 containerd[2021]: time="2025-09-05T23:53:05.792924408Z" level=info msg="Start event monitor"
Sep 5 23:53:05.793212 containerd[2021]: time="2025-09-05T23:53:05.792948588Z" level=info msg="Start snapshots syncer"
Sep 5 23:53:05.793212 containerd[2021]: time="2025-09-05T23:53:05.792968904Z" level=info msg="Start cni network conf syncer for default"
Sep 5 23:53:05.793212 containerd[2021]: time="2025-09-05T23:53:05.792986628Z" level=info msg="Start streaming server"
Sep 5 23:53:05.801195 containerd[2021]: time="2025-09-05T23:53:05.797587044Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Sep 5 23:53:05.801195 containerd[2021]: time="2025-09-05T23:53:05.797728452Z" level=info msg=serving... address=/run/containerd/containerd.sock
Sep 5 23:53:05.797962 systemd[1]: Started containerd.service - containerd container runtime.
Sep 5 23:53:05.812463 containerd[2021]: time="2025-09-05T23:53:05.812380069Z" level=info msg="containerd successfully booted in 0.242160s"
Sep 5 23:53:05.829983 amazon-ssm-agent[2112]: 2025-09-05 23:53:05 INFO [amazon-ssm-agent] using named pipe channel for IPC
Sep 5 23:53:05.930219 amazon-ssm-agent[2112]: 2025-09-05 23:53:05 INFO [amazon-ssm-agent] using named pipe channel for IPC
Sep 5 23:53:06.031779 amazon-ssm-agent[2112]: 2025-09-05 23:53:05 INFO [amazon-ssm-agent] using named pipe channel for IPC
Sep 5 23:53:06.131178 amazon-ssm-agent[2112]: 2025-09-05 23:53:05 INFO [amazon-ssm-agent] amazon-ssm-agent - v3.2.0.0
Sep 5 23:53:06.231341 amazon-ssm-agent[2112]: 2025-09-05 23:53:05 INFO [amazon-ssm-agent] OS: linux, Arch: arm64
Sep 5 23:53:06.331553 amazon-ssm-agent[2112]: 2025-09-05 23:53:05 INFO [amazon-ssm-agent] Starting Core Agent
Sep 5 23:53:06.431828 amazon-ssm-agent[2112]: 2025-09-05 23:53:05 INFO [amazon-ssm-agent] registrar detected. Attempting registration
Sep 5 23:53:06.519678 tar[2007]: linux-arm64/LICENSE
Sep 5 23:53:06.522846 tar[2007]: linux-arm64/README.md
Sep 5 23:53:06.535029 amazon-ssm-agent[2112]: 2025-09-05 23:53:05 INFO [Registrar] Starting registrar module
Sep 5 23:53:06.560231 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Sep 5 23:53:06.635252 amazon-ssm-agent[2112]: 2025-09-05 23:53:05 INFO [EC2Identity] no registration info found for ec2 instance, attempting registration
Sep 5 23:53:06.830429 sshd_keygen[2009]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 5 23:53:06.890885 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 5 23:53:06.907333 systemd[1]: Starting issuegen.service - Generate /run/issue...
Sep 5 23:53:06.915657 systemd[1]: Started sshd@0-172.31.18.129:22-139.178.68.195:52312.service - OpenSSH per-connection server daemon (139.178.68.195:52312).
Sep 5 23:53:06.944470 systemd[1]: issuegen.service: Deactivated successfully.
Sep 5 23:53:06.945345 systemd[1]: Finished issuegen.service - Generate /run/issue.
Sep 5 23:53:06.962780 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Sep 5 23:53:07.007242 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Sep 5 23:53:07.022729 systemd[1]: Started getty@tty1.service - Getty on tty1.
Sep 5 23:53:07.033900 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
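Once containerd reports "serving..." on /run/containerd/containerd.sock and "successfully booted", the daemon is reachable over its gRPC API. A minimal sketch using the official Go client to confirm that, matching the socket path and version (v1.7.21) in the log; this is an illustrative check, not part of the boot sequence:

```go
// containerd_version.go - connect to the socket the daemon just announced
// and print its version. Requires github.com/containerd/containerd in go.mod.
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatalf("connect: %v", err)
	}
	defer client.Close()

	v, err := client.Version(context.Background())
	if err != nil {
		log.Fatalf("version: %v", err)
	}
	// Should report v1.7.21 / revision 174e0d1... on this host.
	fmt.Printf("containerd %s (revision %s)\n", v.Version, v.Revision)
}
```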
Sep 5 23:53:07.037783 systemd[1]: Reached target getty.target - Login Prompts.
Sep 5 23:53:07.105107 amazon-ssm-agent[2112]: 2025-09-05 23:53:07 INFO [EC2Identity] EC2 registration was successful.
Sep 5 23:53:07.138714 amazon-ssm-agent[2112]: 2025-09-05 23:53:07 INFO [CredentialRefresher] credentialRefresher has started
Sep 5 23:53:07.139336 amazon-ssm-agent[2112]: 2025-09-05 23:53:07 INFO [CredentialRefresher] Starting credentials refresher loop
Sep 5 23:53:07.139450 amazon-ssm-agent[2112]: 2025-09-05 23:53:07 INFO EC2RoleProvider Successfully connected with instance profile role credentials
Sep 5 23:53:07.206315 amazon-ssm-agent[2112]: 2025-09-05 23:53:07 INFO [CredentialRefresher] Next credential rotation will be in 31.1749769383 minutes
Sep 5 23:53:07.212780 sshd[2227]: Accepted publickey for core from 139.178.68.195 port 52312 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM
Sep 5 23:53:07.213685 sshd[2227]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 23:53:07.230547 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Sep 5 23:53:07.242651 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Sep 5 23:53:07.249275 systemd-logind[1997]: New session 1 of user core.
Sep 5 23:53:07.272940 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Sep 5 23:53:07.289683 systemd[1]: Starting user@500.service - User Manager for UID 500...
Sep 5 23:53:07.299473 (systemd)[2238]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Sep 5 23:53:07.335533 ntpd[1992]: Listen normally on 6 eth0 [fe80::467:1dff:feb6:7c55%2]:123
Sep 5 23:53:07.336029 ntpd[1992]: 5 Sep 23:53:07 ntpd[1992]: Listen normally on 6 eth0 [fe80::467:1dff:feb6:7c55%2]:123
Sep 5 23:53:07.534529 systemd[2238]: Queued start job for default target default.target.
Sep 5 23:53:07.547228 systemd[2238]: Created slice app.slice - User Application Slice.
Sep 5 23:53:07.547288 systemd[2238]: Reached target paths.target - Paths.
Sep 5 23:53:07.547321 systemd[2238]: Reached target timers.target - Timers.
Sep 5 23:53:07.550114 systemd[2238]: Starting dbus.socket - D-Bus User Message Bus Socket...
Sep 5 23:53:07.580732 systemd[2238]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Sep 5 23:53:07.580989 systemd[2238]: Reached target sockets.target - Sockets.
Sep 5 23:53:07.581037 systemd[2238]: Reached target basic.target - Basic System.
Sep 5 23:53:07.581119 systemd[2238]: Reached target default.target - Main User Target.
Sep 5 23:53:07.581224 systemd[2238]: Startup finished in 263ms.
Sep 5 23:53:07.581517 systemd[1]: Started user@500.service - User Manager for UID 500.
Sep 5 23:53:07.592596 systemd[1]: Started session-1.scope - Session 1 of User core.
Sep 5 23:53:07.757389 systemd[1]: Started sshd@1-172.31.18.129:22-139.178.68.195:52320.service - OpenSSH per-connection server daemon (139.178.68.195:52320).
Sep 5 23:53:07.937988 sshd[2249]: Accepted publickey for core from 139.178.68.195 port 52320 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM
Sep 5 23:53:07.939651 sshd[2249]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 23:53:07.948982 systemd-logind[1997]: New session 2 of user core.
Sep 5 23:53:07.960428 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 5 23:53:08.088529 sshd[2249]: pam_unix(sshd:session): session closed for user core
Sep 5 23:53:08.096655 systemd-logind[1997]: Session 2 logged out. Waiting for processes to exit.
Sep 5 23:53:08.097699 systemd[1]: sshd@1-172.31.18.129:22-139.178.68.195:52320.service: Deactivated successfully.
Sep 5 23:53:08.101102 systemd[1]: session-2.scope: Deactivated successfully.
Sep 5 23:53:08.102770 systemd-logind[1997]: Removed session 2.
Sep 5 23:53:08.131881 systemd[1]: Started sshd@2-172.31.18.129:22-139.178.68.195:52328.service - OpenSSH per-connection server daemon (139.178.68.195:52328).
Sep 5 23:53:08.186918 amazon-ssm-agent[2112]: 2025-09-05 23:53:08 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker is not running, starting worker process
Sep 5 23:53:08.290262 amazon-ssm-agent[2112]: 2025-09-05 23:53:08 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker (pid:2259) started
Sep 5 23:53:08.316256 sshd[2256]: Accepted publickey for core from 139.178.68.195 port 52328 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM
Sep 5 23:53:08.318987 sshd[2256]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 23:53:08.332846 systemd-logind[1997]: New session 3 of user core.
Sep 5 23:53:08.337547 systemd[1]: Started session-3.scope - Session 3 of User core.
Sep 5 23:53:08.388536 amazon-ssm-agent[2112]: 2025-09-05 23:53:08 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] Monitor long running worker health every 60 seconds
Sep 5 23:53:08.474533 sshd[2256]: pam_unix(sshd:session): session closed for user core
Sep 5 23:53:08.480359 systemd[1]: sshd@2-172.31.18.129:22-139.178.68.195:52328.service: Deactivated successfully.
Sep 5 23:53:08.485375 systemd[1]: session-3.scope: Deactivated successfully.
Sep 5 23:53:08.486942 systemd-logind[1997]: Session 3 logged out. Waiting for processes to exit.
Sep 5 23:53:08.490906 systemd-logind[1997]: Removed session 3.
Sep 5 23:53:09.619489 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 5 23:53:09.624677 systemd[1]: Reached target multi-user.target - Multi-User System.
Sep 5 23:53:09.625821 (kubelet)[2277]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 5 23:53:09.629822 systemd[1]: Startup finished in 1.177s (kernel) + 8.830s (initrd) + 11.034s (userspace) = 21.043s.
Sep 5 23:53:11.675587 systemd-resolved[1931]: Clock change detected. Flushing caches.
Sep 5 23:53:11.895376 kubelet[2277]: E0905 23:53:11.895255 2277 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 5 23:53:11.899775 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 5 23:53:11.900114 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 5 23:53:11.901701 systemd[1]: kubelet.service: Consumed 1.432s CPU time.
Sep 5 23:53:18.855012 systemd[1]: Started sshd@3-172.31.18.129:22-139.178.68.195:54332.service - OpenSSH per-connection server daemon (139.178.68.195:54332).
Sep 5 23:53:19.017053 sshd[2290]: Accepted publickey for core from 139.178.68.195 port 54332 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM
Sep 5 23:53:19.019719 sshd[2290]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 23:53:19.027108 systemd-logind[1997]: New session 4 of user core.
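The run.go:72 failure above is the first of several identical kubelet crashes in this log: the unit is enabled at boot, but /var/lib/kubelet/config.yaml has not been written yet (on a node like this it normally appears only after kubeadm or a provisioning script runs). A hedged sketch reproducing just that pre-flight condition:

```go
// kubelet_preflight.go - sketch of the failure condition logged by run.go:72:
// the kubelet cannot start before its config file exists.
package main

import (
	"fmt"
	"os"
)

func main() {
	const path = "/var/lib/kubelet/config.yaml"
	if _, err := os.Stat(path); os.IsNotExist(err) {
		// Same root cause as the logged "command failed" error: the unit
		// starts before any provisioning step has written the file.
		fmt.Printf("failed to load Kubelet config file %s: %v\n", path, err)
		os.Exit(1)
	}
	fmt.Println("kubelet config present")
}
```

systemd then retries the unit on its restart policy, which is why the same error recurs below with an incrementing restart counter.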
Sep 5 23:53:19.038780 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 5 23:53:19.162138 sshd[2290]: pam_unix(sshd:session): session closed for user core
Sep 5 23:53:19.166946 systemd[1]: session-4.scope: Deactivated successfully.
Sep 5 23:53:19.170250 systemd-logind[1997]: Session 4 logged out. Waiting for processes to exit.
Sep 5 23:53:19.170614 systemd[1]: sshd@3-172.31.18.129:22-139.178.68.195:54332.service: Deactivated successfully.
Sep 5 23:53:19.175409 systemd-logind[1997]: Removed session 4.
Sep 5 23:53:19.196692 systemd[1]: Started sshd@4-172.31.18.129:22-139.178.68.195:54338.service - OpenSSH per-connection server daemon (139.178.68.195:54338).
Sep 5 23:53:19.381326 sshd[2297]: Accepted publickey for core from 139.178.68.195 port 54338 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM
Sep 5 23:53:19.383858 sshd[2297]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 23:53:19.392871 systemd-logind[1997]: New session 5 of user core.
Sep 5 23:53:19.402779 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 5 23:53:19.521741 sshd[2297]: pam_unix(sshd:session): session closed for user core
Sep 5 23:53:19.529261 systemd[1]: sshd@4-172.31.18.129:22-139.178.68.195:54338.service: Deactivated successfully.
Sep 5 23:53:19.533440 systemd[1]: session-5.scope: Deactivated successfully.
Sep 5 23:53:19.534895 systemd-logind[1997]: Session 5 logged out. Waiting for processes to exit.
Sep 5 23:53:19.536632 systemd-logind[1997]: Removed session 5.
Sep 5 23:53:19.561066 systemd[1]: Started sshd@5-172.31.18.129:22-139.178.68.195:54340.service - OpenSSH per-connection server daemon (139.178.68.195:54340).
Sep 5 23:53:19.730218 sshd[2304]: Accepted publickey for core from 139.178.68.195 port 54340 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM
Sep 5 23:53:19.732755 sshd[2304]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 23:53:19.740130 systemd-logind[1997]: New session 6 of user core.
Sep 5 23:53:19.752763 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 5 23:53:19.877050 sshd[2304]: pam_unix(sshd:session): session closed for user core
Sep 5 23:53:19.882950 systemd[1]: sshd@5-172.31.18.129:22-139.178.68.195:54340.service: Deactivated successfully.
Sep 5 23:53:19.886417 systemd[1]: session-6.scope: Deactivated successfully.
Sep 5 23:53:19.888734 systemd-logind[1997]: Session 6 logged out. Waiting for processes to exit.
Sep 5 23:53:19.891881 systemd-logind[1997]: Removed session 6.
Sep 5 23:53:19.917008 systemd[1]: Started sshd@6-172.31.18.129:22-139.178.68.195:54352.service - OpenSSH per-connection server daemon (139.178.68.195:54352).
Sep 5 23:53:20.091617 sshd[2311]: Accepted publickey for core from 139.178.68.195 port 54352 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM
Sep 5 23:53:20.094200 sshd[2311]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 23:53:20.100995 systemd-logind[1997]: New session 7 of user core.
Sep 5 23:53:20.109777 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 5 23:53:20.227233 sudo[2314]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 5 23:53:20.227895 sudo[2314]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 5 23:53:20.248415 sudo[2314]: pam_unix(sudo:session): session closed for user root
Sep 5 23:53:20.271897 sshd[2311]: pam_unix(sshd:session): session closed for user core
Sep 5 23:53:20.278178 systemd-logind[1997]: Session 7 logged out. Waiting for processes to exit.
Sep 5 23:53:20.279984 systemd[1]: sshd@6-172.31.18.129:22-139.178.68.195:54352.service: Deactivated successfully.
Sep 5 23:53:20.283543 systemd[1]: session-7.scope: Deactivated successfully.
Sep 5 23:53:20.285112 systemd-logind[1997]: Removed session 7.
Sep 5 23:53:20.307902 systemd[1]: Started sshd@7-172.31.18.129:22-139.178.68.195:38794.service - OpenSSH per-connection server daemon (139.178.68.195:38794).
Sep 5 23:53:20.494124 sshd[2319]: Accepted publickey for core from 139.178.68.195 port 38794 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM
Sep 5 23:53:20.496201 sshd[2319]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 23:53:20.504950 systemd-logind[1997]: New session 8 of user core.
Sep 5 23:53:20.512769 systemd[1]: Started session-8.scope - Session 8 of User core.
Sep 5 23:53:20.617099 sudo[2323]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 5 23:53:20.617784 sudo[2323]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 5 23:53:20.623704 sudo[2323]: pam_unix(sudo:session): session closed for user root
Sep 5 23:53:20.633589 sudo[2322]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules
Sep 5 23:53:20.634197 sudo[2322]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 5 23:53:20.659010 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules...
Sep 5 23:53:20.662757 auditctl[2326]: No rules
Sep 5 23:53:20.664295 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 5 23:53:20.665714 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules.
Sep 5 23:53:20.675155 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Sep 5 23:53:20.717051 augenrules[2344]: No rules
Sep 5 23:53:20.719663 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Sep 5 23:53:20.723119 sudo[2322]: pam_unix(sudo:session): session closed for user root
Sep 5 23:53:20.746637 sshd[2319]: pam_unix(sshd:session): session closed for user core
Sep 5 23:53:20.752441 systemd[1]: sshd@7-172.31.18.129:22-139.178.68.195:38794.service: Deactivated successfully.
Sep 5 23:53:20.755374 systemd[1]: session-8.scope: Deactivated successfully.
Sep 5 23:53:20.758816 systemd-logind[1997]: Session 8 logged out. Waiting for processes to exit.
Sep 5 23:53:20.760483 systemd-logind[1997]: Removed session 8.
Sep 5 23:53:20.780672 systemd[1]: Started sshd@8-172.31.18.129:22-139.178.68.195:38810.service - OpenSSH per-connection server daemon (139.178.68.195:38810).
Sep 5 23:53:20.962470 sshd[2352]: Accepted publickey for core from 139.178.68.195 port 38810 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM
Sep 5 23:53:20.965081 sshd[2352]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 23:53:20.974688 systemd-logind[1997]: New session 9 of user core.
Sep 5 23:53:20.984824 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 5 23:53:21.088657 sudo[2355]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 5 23:53:21.089327 sudo[2355]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 5 23:53:21.578971 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 5 23:53:21.592277 (dockerd)[2370]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 5 23:53:21.990576 dockerd[2370]: time="2025-09-05T23:53:21.990216331Z" level=info msg="Starting up"
Sep 5 23:53:21.995789 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 5 23:53:22.005115 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 5 23:53:22.201076 dockerd[2370]: time="2025-09-05T23:53:22.201021364Z" level=info msg="Loading containers: start."
Sep 5 23:53:22.412562 kernel: Initializing XFRM netlink socket
Sep 5 23:53:22.454412 (udev-worker)[2396]: Network interface NamePolicy= disabled on kernel command line.
Sep 5 23:53:22.508973 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 5 23:53:22.523154 (kubelet)[2458]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 5 23:53:22.571762 systemd-networkd[1930]: docker0: Link UP
Sep 5 23:53:22.603473 kubelet[2458]: E0905 23:53:22.603338 2458 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 5 23:53:22.611343 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 5 23:53:22.611950 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 5 23:53:22.613428 dockerd[2370]: time="2025-09-05T23:53:22.611999478Z" level=info msg="Loading containers: done."
Sep 5 23:53:22.649220 dockerd[2370]: time="2025-09-05T23:53:22.649162506Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 5 23:53:22.649603 dockerd[2370]: time="2025-09-05T23:53:22.649571058Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0
Sep 5 23:53:22.649954 dockerd[2370]: time="2025-09-05T23:53:22.649919358Z" level=info msg="Daemon has completed initialization"
Sep 5 23:53:22.716265 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 5 23:53:22.717175 dockerd[2370]: time="2025-09-05T23:53:22.716129430Z" level=info msg="API listen on /run/docker.sock"
Sep 5 23:53:23.814187 containerd[2021]: time="2025-09-05T23:53:23.814113140Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\""
Sep 5 23:53:24.501487 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount352351043.mount: Deactivated successfully.
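The PullImage entry above starts the sequence of control-plane image pulls (apiserver, controller-manager, scheduler, proxy, coredns, pause, etcd) that fill the next part of the log. Each pull is an ordinary containerd image pull in the k8s.io namespace, ending in the ImageCreate/"Pulled image" events shown below. An illustrative sketch of the same operation with the containerd Go client (the socket path, namespace, and image reference all appear in the log):

```go
// pull_image.go - perform the pull the CRI plugin is logging, by hand.
// Requires github.com/containerd/containerd in go.mod.
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatalf("connect: %v", err)
	}
	defer client.Close()

	// k8s.io is the namespace the kubelet's CRI traffic uses.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
	img, err := client.Pull(ctx, "registry.k8s.io/kube-apiserver:v1.31.12", containerd.WithPullUnpack)
	if err != nil {
		log.Fatalf("pull: %v", err)
	}
	fmt.Printf("pulled %s (%s)\n", img.Name(), img.Target().Digest)
}
```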
Sep 5 23:53:25.874658 containerd[2021]: time="2025-09-05T23:53:25.873828298Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:53:25.876575 containerd[2021]: time="2025-09-05T23:53:25.876352330Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.12: active requests=0, bytes read=25652441"
Sep 5 23:53:25.879027 containerd[2021]: time="2025-09-05T23:53:25.878933734Z" level=info msg="ImageCreate event name:\"sha256:25d00c9505e8a4a7a6c827030f878b50e58bbf63322e01a7d92807bcb4db6b3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:53:25.885566 containerd[2021]: time="2025-09-05T23:53:25.885413518Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:53:25.888016 containerd[2021]: time="2025-09-05T23:53:25.887945962Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.12\" with image id \"sha256:25d00c9505e8a4a7a6c827030f878b50e58bbf63322e01a7d92807bcb4db6b3d\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\", size \"25649241\" in 2.07376357s"
Sep 5 23:53:25.888586 containerd[2021]: time="2025-09-05T23:53:25.888231502Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\" returns image reference \"sha256:25d00c9505e8a4a7a6c827030f878b50e58bbf63322e01a7d92807bcb4db6b3d\""
Sep 5 23:53:25.891980 containerd[2021]: time="2025-09-05T23:53:25.891917062Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\""
Sep 5 23:53:27.345558 containerd[2021]: time="2025-09-05T23:53:27.344206077Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:53:27.346972 containerd[2021]: time="2025-09-05T23:53:27.346903821Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.12: active requests=0, bytes read=22460309"
Sep 5 23:53:27.348051 containerd[2021]: time="2025-09-05T23:53:27.347932605Z" level=info msg="ImageCreate event name:\"sha256:04df324666956d4cb57096c0edff6bfe1d75e71fb8f508dec8818f2842f821e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:53:27.362611 containerd[2021]: time="2025-09-05T23:53:27.362141373Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:53:27.364434 containerd[2021]: time="2025-09-05T23:53:27.363840621Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.12\" with image id \"sha256:04df324666956d4cb57096c0edff6bfe1d75e71fb8f508dec8818f2842f821e1\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\", size \"23997423\" in 1.471661551s"
Sep 5 23:53:27.364434 containerd[2021]: time="2025-09-05T23:53:27.363914373Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\" returns image reference \"sha256:04df324666956d4cb57096c0edff6bfe1d75e71fb8f508dec8818f2842f821e1\""
Sep 5 23:53:27.364818 containerd[2021]: time="2025-09-05T23:53:27.364750125Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\""
Sep 5 23:53:28.679336 containerd[2021]: time="2025-09-05T23:53:28.679262856Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:53:28.681507 containerd[2021]: time="2025-09-05T23:53:28.681398208Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.12: active requests=0, bytes read=17125903"
Sep 5 23:53:28.682576 containerd[2021]: time="2025-09-05T23:53:28.682178016Z" level=info msg="ImageCreate event name:\"sha256:00b0619122c2d4fd3b5e102e9850d8c732e08a386b9c172c409b3a5cd552e07d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:53:28.688581 containerd[2021]: time="2025-09-05T23:53:28.688090404Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:53:28.691029 containerd[2021]: time="2025-09-05T23:53:28.690956112Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.12\" with image id \"sha256:00b0619122c2d4fd3b5e102e9850d8c732e08a386b9c172c409b3a5cd552e07d\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\", size \"18663035\" in 1.326129607s"
Sep 5 23:53:28.691461 containerd[2021]: time="2025-09-05T23:53:28.691234464Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\" returns image reference \"sha256:00b0619122c2d4fd3b5e102e9850d8c732e08a386b9c172c409b3a5cd552e07d\""
Sep 5 23:53:28.692643 containerd[2021]: time="2025-09-05T23:53:28.692207004Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\""
Sep 5 23:53:30.115918 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1799130281.mount: Deactivated successfully.
Sep 5 23:53:30.689866 containerd[2021]: time="2025-09-05T23:53:30.689806874Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:53:30.692651 containerd[2021]: time="2025-09-05T23:53:30.691775462Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.12: active requests=0, bytes read=26916095"
Sep 5 23:53:30.692651 containerd[2021]: time="2025-09-05T23:53:30.691819082Z" level=info msg="ImageCreate event name:\"sha256:25c7652bd0d893b147dce9135dc6a68c37da76f9a20dceec1d520782031b2f36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:53:30.695907 containerd[2021]: time="2025-09-05T23:53:30.695849990Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:53:30.697911 containerd[2021]: time="2025-09-05T23:53:30.697563374Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.12\" with image id \"sha256:25c7652bd0d893b147dce9135dc6a68c37da76f9a20dceec1d520782031b2f36\", repo tag \"registry.k8s.io/kube-proxy:v1.31.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\", size \"26915114\" in 2.005255114s"
Sep 5 23:53:30.697911 containerd[2021]: time="2025-09-05T23:53:30.697626242Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\" returns image reference \"sha256:25c7652bd0d893b147dce9135dc6a68c37da76f9a20dceec1d520782031b2f36\""
Sep 5 23:53:30.698816 containerd[2021]: time="2025-09-05T23:53:30.698476946Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 5 23:53:31.228207 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4221029936.mount: Deactivated successfully.
Sep 5 23:53:32.477422 containerd[2021]: time="2025-09-05T23:53:32.477360855Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:53:32.480071 containerd[2021]: time="2025-09-05T23:53:32.480021867Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951622"
Sep 5 23:53:32.482458 containerd[2021]: time="2025-09-05T23:53:32.482377335Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:53:32.489025 containerd[2021]: time="2025-09-05T23:53:32.488943771Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:53:32.492568 containerd[2021]: time="2025-09-05T23:53:32.491680995Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.793113977s"
Sep 5 23:53:32.492568 containerd[2021]: time="2025-09-05T23:53:32.491752671Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\""
Sep 5 23:53:32.496846 containerd[2021]: time="2025-09-05T23:53:32.496488615Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 5 23:53:32.862289 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 5 23:53:32.878353 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 5 23:53:33.171964 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3375506147.mount: Deactivated successfully.
Sep 5 23:53:33.193556 containerd[2021]: time="2025-09-05T23:53:33.192030686Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:53:33.197308 containerd[2021]: time="2025-09-05T23:53:33.197223986Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703"
Sep 5 23:53:33.198587 containerd[2021]: time="2025-09-05T23:53:33.198501830Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:53:33.208437 containerd[2021]: time="2025-09-05T23:53:33.208338626Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:53:33.210295 containerd[2021]: time="2025-09-05T23:53:33.209957390Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 713.369739ms"
Sep 5 23:53:33.210295 containerd[2021]: time="2025-09-05T23:53:33.210020174Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Sep 5 23:53:33.212144 containerd[2021]: time="2025-09-05T23:53:33.211802498Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Sep 5 23:53:33.257857 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 5 23:53:33.264103 (kubelet)[2661]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 5 23:53:33.347439 kubelet[2661]: E0905 23:53:33.347332 2661 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 5 23:53:33.352473 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 5 23:53:33.352907 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 5 23:53:33.837951 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1960192975.mount: Deactivated successfully.
Sep 5 23:53:35.364689 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 5 23:53:36.168973 containerd[2021]: time="2025-09-05T23:53:36.168872933Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:53:36.171671 containerd[2021]: time="2025-09-05T23:53:36.171603281Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66537161"
Sep 5 23:53:36.173808 containerd[2021]: time="2025-09-05T23:53:36.172585625Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:53:36.180126 containerd[2021]: time="2025-09-05T23:53:36.180046001Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:53:36.183743 containerd[2021]: time="2025-09-05T23:53:36.183666905Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"66535646\" in 2.971796115s"
Sep 5 23:53:36.184038 containerd[2021]: time="2025-09-05T23:53:36.183981701Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\""
Sep 5 23:53:43.498591 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Sep 5 23:53:43.508171 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 5 23:53:43.865000 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 5 23:53:43.881804 (kubelet)[2755]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 5 23:53:43.968934 kubelet[2755]: E0905 23:53:43.968847 2755 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 5 23:53:43.974644 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 5 23:53:43.975210 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 5 23:53:45.235715 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 5 23:53:45.251247 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 5 23:53:45.313869 systemd[1]: Reloading requested from client PID 2769 ('systemctl') (unit session-9.scope)...
Sep 5 23:53:45.313910 systemd[1]: Reloading...
Sep 5 23:53:45.544597 zram_generator::config[2818]: No configuration found.
Sep 5 23:53:45.785763 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 5 23:53:45.972001 systemd[1]: Reloading finished in 657 ms.
Sep 5 23:53:46.069267 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 5 23:53:46.069583 systemd[1]: kubelet.service: Failed with result 'signal'.
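The restart-counter entries above (counters 1, 2, 3 at roughly ten-second intervals) are systemd's Restart= policy re-running the still-misconfigured kubelet until the install script writes its config and reloads the daemon. A deliberately simplified supervision-loop sketch of that behavior; this is an illustration of the pattern, not systemd's actual logic, and the ten-second delay is an assumption based on the spacing of the logged restarts:

```go
// restart_loop.go - simplified analogue of systemd's scheduled restart jobs.
package main

import (
	"fmt"
	"os/exec"
	"time"
)

func main() {
	const restartDelay = 10 * time.Second // assumed RestartSec-style delay
	for counter := 1; ; counter++ {
		// Stand-in for the kubelet invocation in the unit file (hypothetical flags).
		err := exec.Command("/usr/bin/kubelet", "--config", "/var/lib/kubelet/config.yaml").Run()
		fmt.Printf("kubelet exited (err=%v); scheduled restart, counter is at %d\n", err, counter)
		time.Sleep(restartDelay)
	}
}
```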
Sep 5 23:53:46.071683 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 5 23:53:46.080175 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 5 23:53:46.837644 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 5 23:53:46.857150 (kubelet)[2869]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 5 23:53:46.937085 kubelet[2869]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 5 23:53:46.937085 kubelet[2869]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 5 23:53:46.937085 kubelet[2869]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 5 23:53:46.937869 kubelet[2869]: I0905 23:53:46.937259 2869 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 5 23:53:47.648418 kubelet[2869]: I0905 23:53:47.648346 2869 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 5 23:53:47.648418 kubelet[2869]: I0905 23:53:47.648398 2869 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 5 23:53:47.648899 kubelet[2869]: I0905 23:53:47.648848 2869 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 5 23:53:47.702615 kubelet[2869]: E0905 23:53:47.702512 2869 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://172.31.18.129:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 172.31.18.129:6443: connect: connection refused" logger="UnhandledError"
Sep 5 23:53:47.704589 kubelet[2869]: I0905 23:53:47.703797 2869 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 5 23:53:47.717394 kubelet[2869]: E0905 23:53:47.717320 2869 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Sep 5 23:53:47.717394 kubelet[2869]: I0905 23:53:47.717389 2869 server.go:1408] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
Sep 5 23:53:47.724783 kubelet[2869]: I0905 23:53:47.724740 2869 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 5 23:53:47.725658 kubelet[2869]: I0905 23:53:47.725617 2869 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 5 23:53:47.726308 kubelet[2869]: I0905 23:53:47.726254 2869 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 5 23:53:47.727831 kubelet[2869]: I0905 23:53:47.726463 2869 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-18-129","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 5 23:53:47.727831 kubelet[2869]: I0905 23:53:47.726990 2869 topology_manager.go:138] "Creating topology manager with none policy"
Sep 5 23:53:47.727831 kubelet[2869]: I0905 23:53:47.727019 2869 container_manager_linux.go:300] "Creating device plugin manager"
Sep 5 23:53:47.727831 kubelet[2869]: I0905 23:53:47.727395 2869 state_mem.go:36] "Initialized new in-memory state store"
Sep 5 23:53:47.732821 kubelet[2869]: I0905 23:53:47.732767 2869 kubelet.go:408] "Attempting to sync node with API server"
Sep 5 23:53:47.733049 kubelet[2869]: I0905 23:53:47.733024 2869 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 5 23:53:47.733198 kubelet[2869]: I0905 23:53:47.733176 2869 kubelet.go:314] "Adding apiserver pod source"
Sep 5 23:53:47.733483 kubelet[2869]: I0905 23:53:47.733456 2869 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 5 23:53:47.742876 kubelet[2869]: W0905 23:53:47.742759 2869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.31.18.129:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-18-129&limit=500&resourceVersion=0": dial tcp 172.31.18.129:6443: connect: connection refused
Sep 5 23:53:47.743054 kubelet[2869]: E0905 23:53:47.742895 2869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://172.31.18.129:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-18-129&limit=500&resourceVersion=0\": dial tcp 172.31.18.129:6443: connect: connection refused" logger="UnhandledError"
Sep 5 23:53:47.744086 kubelet[2869]: W0905 23:53:47.743996 2869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.31.18.129:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.31.18.129:6443: connect: connection refused
Sep 5 23:53:47.744265 kubelet[2869]: E0905 23:53:47.744107 2869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.31.18.129:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.31.18.129:6443: connect: connection refused" logger="UnhandledError"
Sep 5 23:53:47.744476 kubelet[2869]: I0905 23:53:47.744410 2869 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Sep 5 23:53:47.746571 kubelet[2869]: I0905 23:53:47.745773 2869 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 5 23:53:47.746571 kubelet[2869]: W0905 23:53:47.746161 2869 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 5 23:53:47.749423 kubelet[2869]: I0905 23:53:47.749162 2869 server.go:1274] "Started kubelet"
Sep 5 23:53:47.752716 kubelet[2869]: I0905 23:53:47.752659 2869 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 5 23:53:47.754931 kubelet[2869]: I0905 23:53:47.754887 2869 server.go:449] "Adding debug handlers to kubelet server"
Sep 5 23:53:47.755115 kubelet[2869]: I0905 23:53:47.755027 2869 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 5 23:53:47.756559 kubelet[2869]: I0905 23:53:47.756448 2869 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 5 23:53:47.761798 kubelet[2869]: I0905 23:53:47.761725 2869 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 5 23:53:47.764591 kubelet[2869]: E0905 23:53:47.761558 2869 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.31.18.129:6443/api/v1/namespaces/default/events\": dial tcp 172.31.18.129:6443: connect: connection refused" event="&Event{ObjectMeta:{ip-172-31-18-129.18628814d8fd556b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-172-31-18-129,UID:ip-172-31-18-129,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-172-31-18-129,},FirstTimestamp:2025-09-05 23:53:47.749119339 +0000 UTC m=+0.885091686,LastTimestamp:2025-09-05 23:53:47.749119339 +0000 UTC m=+0.885091686,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-172-31-18-129,}"
Sep 5 23:53:47.766632 kubelet[2869]: I0905 23:53:47.766585 2869 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 5 23:53:47.771379 kubelet[2869]: I0905 23:53:47.771326 2869 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 5
23:53:47.774400 kubelet[2869]: E0905 23:53:47.773284 2869 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ip-172-31-18-129\" not found" Sep 5 23:53:47.775092 kubelet[2869]: I0905 23:53:47.775034 2869 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 5 23:53:47.775214 kubelet[2869]: I0905 23:53:47.775170 2869 reconciler.go:26] "Reconciler: start to sync state" Sep 5 23:53:47.777647 kubelet[2869]: W0905 23:53:47.777534 2869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.31.18.129:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.18.129:6443: connect: connection refused Sep 5 23:53:47.777794 kubelet[2869]: E0905 23:53:47.777664 2869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.31.18.129:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.31.18.129:6443: connect: connection refused" logger="UnhandledError" Sep 5 23:53:47.777874 kubelet[2869]: E0905 23:53:47.777795 2869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.18.129:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-18-129?timeout=10s\": dial tcp 172.31.18.129:6443: connect: connection refused" interval="200ms" Sep 5 23:53:47.779958 kubelet[2869]: I0905 23:53:47.778634 2869 factory.go:221] Registration of the systemd container factory successfully Sep 5 23:53:47.779958 kubelet[2869]: I0905 23:53:47.779206 2869 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 5 23:53:47.782203 kubelet[2869]: I0905 23:53:47.782149 2869 factory.go:221] Registration of the containerd container factory successfully Sep 5 23:53:47.809626 kubelet[2869]: E0905 23:53:47.809572 2869 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 5 23:53:47.812259 kubelet[2869]: I0905 23:53:47.812210 2869 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 5 23:53:47.814598 kubelet[2869]: I0905 23:53:47.814559 2869 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 5 23:53:47.814871 kubelet[2869]: I0905 23:53:47.814849 2869 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 5 23:53:47.815432 kubelet[2869]: I0905 23:53:47.815005 2869 kubelet.go:2321] "Starting kubelet main sync loop" Sep 5 23:53:47.815432 kubelet[2869]: E0905 23:53:47.815083 2869 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 5 23:53:47.817940 kubelet[2869]: W0905 23:53:47.817416 2869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.31.18.129:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.18.129:6443: connect: connection refused Sep 5 23:53:47.817940 kubelet[2869]: E0905 23:53:47.817495 2869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.31.18.129:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.31.18.129:6443: connect: connection refused" logger="UnhandledError" Sep 5 23:53:47.827508 kubelet[2869]: I0905 23:53:47.827472 2869 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 5 23:53:47.828096 kubelet[2869]: I0905 23:53:47.827716 2869 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 5 23:53:47.828096 kubelet[2869]: I0905 23:53:47.827754 2869 state_mem.go:36] "Initialized new in-memory state store" Sep 5 23:53:47.834611 kubelet[2869]: I0905 23:53:47.834399 2869 policy_none.go:49] "None policy: Start" Sep 5 23:53:47.836205 kubelet[2869]: I0905 23:53:47.836155 2869 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 5 23:53:47.836347 kubelet[2869]: I0905 23:53:47.836229 2869 state_mem.go:35] "Initializing new in-memory state store" Sep 5 23:53:47.848609 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 5 23:53:47.866456 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 5 23:53:47.874706 kubelet[2869]: E0905 23:53:47.873746 2869 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ip-172-31-18-129\" not found" Sep 5 23:53:47.874355 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 5 23:53:47.883130 kubelet[2869]: I0905 23:53:47.883075 2869 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 5 23:53:47.883413 kubelet[2869]: I0905 23:53:47.883372 2869 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 5 23:53:47.883593 kubelet[2869]: I0905 23:53:47.883402 2869 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 5 23:53:47.885325 kubelet[2869]: I0905 23:53:47.884864 2869 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 5 23:53:47.887460 kubelet[2869]: E0905 23:53:47.887380 2869 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-172-31-18-129\" not found" Sep 5 23:53:47.935278 systemd[1]: Created slice kubepods-burstable-pod7964b431c5313d3ae842a4ca6ecf102a.slice - libcontainer container kubepods-burstable-pod7964b431c5313d3ae842a4ca6ecf102a.slice. 
Sep 5 23:53:47.951839 systemd[1]: Created slice kubepods-burstable-podc1dbdd00ed7fdddcc76fb0096da8b484.slice - libcontainer container kubepods-burstable-podc1dbdd00ed7fdddcc76fb0096da8b484.slice. Sep 5 23:53:47.974207 systemd[1]: Created slice kubepods-burstable-podac663395d1a7f1f73b7bbd30200fcd83.slice - libcontainer container kubepods-burstable-podac663395d1a7f1f73b7bbd30200fcd83.slice. Sep 5 23:53:47.978784 kubelet[2869]: E0905 23:53:47.978702 2869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.18.129:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-18-129?timeout=10s\": dial tcp 172.31.18.129:6443: connect: connection refused" interval="400ms" Sep 5 23:53:47.986562 kubelet[2869]: I0905 23:53:47.985749 2869 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-18-129" Sep 5 23:53:47.986562 kubelet[2869]: E0905 23:53:47.986511 2869 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.31.18.129:6443/api/v1/nodes\": dial tcp 172.31.18.129:6443: connect: connection refused" node="ip-172-31-18-129" Sep 5 23:53:48.075814 kubelet[2869]: I0905 23:53:48.075721 2869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7964b431c5313d3ae842a4ca6ecf102a-ca-certs\") pod \"kube-controller-manager-ip-172-31-18-129\" (UID: \"7964b431c5313d3ae842a4ca6ecf102a\") " pod="kube-system/kube-controller-manager-ip-172-31-18-129" Sep 5 23:53:48.075814 kubelet[2869]: I0905 23:53:48.075784 2869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7964b431c5313d3ae842a4ca6ecf102a-kubeconfig\") pod \"kube-controller-manager-ip-172-31-18-129\" (UID: \"7964b431c5313d3ae842a4ca6ecf102a\") " pod="kube-system/kube-controller-manager-ip-172-31-18-129" Sep 5 23:53:48.076032 kubelet[2869]: I0905 23:53:48.075824 2869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7964b431c5313d3ae842a4ca6ecf102a-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-18-129\" (UID: \"7964b431c5313d3ae842a4ca6ecf102a\") " pod="kube-system/kube-controller-manager-ip-172-31-18-129" Sep 5 23:53:48.076032 kubelet[2869]: I0905 23:53:48.075866 2869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c1dbdd00ed7fdddcc76fb0096da8b484-kubeconfig\") pod \"kube-scheduler-ip-172-31-18-129\" (UID: \"c1dbdd00ed7fdddcc76fb0096da8b484\") " pod="kube-system/kube-scheduler-ip-172-31-18-129" Sep 5 23:53:48.076032 kubelet[2869]: I0905 23:53:48.075902 2869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ac663395d1a7f1f73b7bbd30200fcd83-ca-certs\") pod \"kube-apiserver-ip-172-31-18-129\" (UID: \"ac663395d1a7f1f73b7bbd30200fcd83\") " pod="kube-system/kube-apiserver-ip-172-31-18-129" Sep 5 23:53:48.076032 kubelet[2869]: I0905 23:53:48.075938 2869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ac663395d1a7f1f73b7bbd30200fcd83-k8s-certs\") pod \"kube-apiserver-ip-172-31-18-129\" (UID: \"ac663395d1a7f1f73b7bbd30200fcd83\") " 
pod="kube-system/kube-apiserver-ip-172-31-18-129" Sep 5 23:53:48.076032 kubelet[2869]: I0905 23:53:48.075975 2869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ac663395d1a7f1f73b7bbd30200fcd83-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-18-129\" (UID: \"ac663395d1a7f1f73b7bbd30200fcd83\") " pod="kube-system/kube-apiserver-ip-172-31-18-129" Sep 5 23:53:48.076280 kubelet[2869]: I0905 23:53:48.076010 2869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7964b431c5313d3ae842a4ca6ecf102a-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-18-129\" (UID: \"7964b431c5313d3ae842a4ca6ecf102a\") " pod="kube-system/kube-controller-manager-ip-172-31-18-129" Sep 5 23:53:48.076280 kubelet[2869]: I0905 23:53:48.076044 2869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7964b431c5313d3ae842a4ca6ecf102a-k8s-certs\") pod \"kube-controller-manager-ip-172-31-18-129\" (UID: \"7964b431c5313d3ae842a4ca6ecf102a\") " pod="kube-system/kube-controller-manager-ip-172-31-18-129" Sep 5 23:53:48.189074 kubelet[2869]: I0905 23:53:48.188927 2869 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-18-129" Sep 5 23:53:48.189580 kubelet[2869]: E0905 23:53:48.189419 2869 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.31.18.129:6443/api/v1/nodes\": dial tcp 172.31.18.129:6443: connect: connection refused" node="ip-172-31-18-129" Sep 5 23:53:48.249396 containerd[2021]: time="2025-09-05T23:53:48.249318017Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-18-129,Uid:7964b431c5313d3ae842a4ca6ecf102a,Namespace:kube-system,Attempt:0,}" Sep 5 23:53:48.269202 containerd[2021]: time="2025-09-05T23:53:48.268836809Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-18-129,Uid:c1dbdd00ed7fdddcc76fb0096da8b484,Namespace:kube-system,Attempt:0,}" Sep 5 23:53:48.281796 containerd[2021]: time="2025-09-05T23:53:48.281728169Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-18-129,Uid:ac663395d1a7f1f73b7bbd30200fcd83,Namespace:kube-system,Attempt:0,}" Sep 5 23:53:48.379315 kubelet[2869]: E0905 23:53:48.379262 2869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.18.129:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-18-129?timeout=10s\": dial tcp 172.31.18.129:6443: connect: connection refused" interval="800ms" Sep 5 23:53:48.592676 kubelet[2869]: I0905 23:53:48.592619 2869 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-18-129" Sep 5 23:53:48.593574 kubelet[2869]: E0905 23:53:48.593498 2869 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.31.18.129:6443/api/v1/nodes\": dial tcp 172.31.18.129:6443: connect: connection refused" node="ip-172-31-18-129" Sep 5 23:53:48.748002 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3355634914.mount: Deactivated successfully. 
Sep 5 23:53:48.762220 containerd[2021]: time="2025-09-05T23:53:48.762138284Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 5 23:53:48.764509 containerd[2021]: time="2025-09-05T23:53:48.764438540Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 5 23:53:48.766423 containerd[2021]: time="2025-09-05T23:53:48.766326296Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269173" Sep 5 23:53:48.768394 containerd[2021]: time="2025-09-05T23:53:48.768343604Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Sep 5 23:53:48.770601 containerd[2021]: time="2025-09-05T23:53:48.770504396Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 5 23:53:48.773625 containerd[2021]: time="2025-09-05T23:53:48.773277752Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 5 23:53:48.775096 containerd[2021]: time="2025-09-05T23:53:48.774964184Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Sep 5 23:53:48.779472 containerd[2021]: time="2025-09-05T23:53:48.779380940Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 5 23:53:48.783831 containerd[2021]: time="2025-09-05T23:53:48.783501596Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 514.553583ms" Sep 5 23:53:48.788691 containerd[2021]: time="2025-09-05T23:53:48.788607140Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 539.155131ms" Sep 5 23:53:48.790219 containerd[2021]: time="2025-09-05T23:53:48.789965804Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 508.117251ms" Sep 5 23:53:48.816506 kubelet[2869]: W0905 23:53:48.816373 2869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.31.18.129:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.18.129:6443: connect: connection refused Sep 5 23:53:48.816506 kubelet[2869]: 
E0905 23:53:48.816460 2869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.31.18.129:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.31.18.129:6443: connect: connection refused" logger="UnhandledError" Sep 5 23:53:48.823820 kubelet[2869]: W0905 23:53:48.820350 2869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.31.18.129:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-18-129&limit=500&resourceVersion=0": dial tcp 172.31.18.129:6443: connect: connection refused Sep 5 23:53:48.823820 kubelet[2869]: E0905 23:53:48.820448 2869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://172.31.18.129:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-18-129&limit=500&resourceVersion=0\": dial tcp 172.31.18.129:6443: connect: connection refused" logger="UnhandledError" Sep 5 23:53:48.930381 kubelet[2869]: W0905 23:53:48.930144 2869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.31.18.129:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.31.18.129:6443: connect: connection refused Sep 5 23:53:48.930381 kubelet[2869]: E0905 23:53:48.930253 2869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.31.18.129:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.31.18.129:6443: connect: connection refused" logger="UnhandledError" Sep 5 23:53:49.021824 containerd[2021]: time="2025-09-05T23:53:49.021663017Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:53:49.022508 containerd[2021]: time="2025-09-05T23:53:49.022406417Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:53:49.022865 containerd[2021]: time="2025-09-05T23:53:49.022712489Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:53:49.027327 containerd[2021]: time="2025-09-05T23:53:49.026766653Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:53:49.032508 containerd[2021]: time="2025-09-05T23:53:49.031705613Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:53:49.032508 containerd[2021]: time="2025-09-05T23:53:49.031813565Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:53:49.032508 containerd[2021]: time="2025-09-05T23:53:49.031842917Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:53:49.032508 containerd[2021]: time="2025-09-05T23:53:49.032005877Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:53:49.036932 containerd[2021]: time="2025-09-05T23:53:49.035132765Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:53:49.036932 containerd[2021]: time="2025-09-05T23:53:49.035229977Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:53:49.036932 containerd[2021]: time="2025-09-05T23:53:49.035268257Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:53:49.036932 containerd[2021]: time="2025-09-05T23:53:49.035430365Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:53:49.090988 systemd[1]: Started cri-containerd-418bc25fb6c33d2b868488c7fe5af49d94c0ec04d979c5e8b5c15278f411e23d.scope - libcontainer container 418bc25fb6c33d2b868488c7fe5af49d94c0ec04d979c5e8b5c15278f411e23d. Sep 5 23:53:49.095473 systemd[1]: Started cri-containerd-5655339c6b5762c1447b80e5e75589c3ce40dd757844fce9795aa0b179be7584.scope - libcontainer container 5655339c6b5762c1447b80e5e75589c3ce40dd757844fce9795aa0b179be7584. Sep 5 23:53:49.114901 systemd[1]: Started cri-containerd-164b4b0e697789abcdd4089b361c88e5fd820f822c802492083c2cf14a3756a5.scope - libcontainer container 164b4b0e697789abcdd4089b361c88e5fd820f822c802492083c2cf14a3756a5. Sep 5 23:53:49.182218 kubelet[2869]: E0905 23:53:49.182055 2869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.18.129:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-18-129?timeout=10s\": dial tcp 172.31.18.129:6443: connect: connection refused" interval="1.6s" Sep 5 23:53:49.208054 containerd[2021]: time="2025-09-05T23:53:49.207746406Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-18-129,Uid:c1dbdd00ed7fdddcc76fb0096da8b484,Namespace:kube-system,Attempt:0,} returns sandbox id \"5655339c6b5762c1447b80e5e75589c3ce40dd757844fce9795aa0b179be7584\"" Sep 5 23:53:49.224556 containerd[2021]: time="2025-09-05T23:53:49.223661910Z" level=info msg="CreateContainer within sandbox \"5655339c6b5762c1447b80e5e75589c3ce40dd757844fce9795aa0b179be7584\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 5 23:53:49.228608 containerd[2021]: time="2025-09-05T23:53:49.228492894Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-18-129,Uid:ac663395d1a7f1f73b7bbd30200fcd83,Namespace:kube-system,Attempt:0,} returns sandbox id \"164b4b0e697789abcdd4089b361c88e5fd820f822c802492083c2cf14a3756a5\"" Sep 5 23:53:49.253540 kubelet[2869]: W0905 23:53:49.253423 2869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.31.18.129:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.18.129:6443: connect: connection refused Sep 5 23:53:49.253767 kubelet[2869]: E0905 23:53:49.253735 2869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.31.18.129:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.31.18.129:6443: connect: connection refused" logger="UnhandledError" Sep 5 23:53:49.254948 containerd[2021]: 
time="2025-09-05T23:53:49.254796606Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-18-129,Uid:7964b431c5313d3ae842a4ca6ecf102a,Namespace:kube-system,Attempt:0,} returns sandbox id \"418bc25fb6c33d2b868488c7fe5af49d94c0ec04d979c5e8b5c15278f411e23d\"" Sep 5 23:53:49.257537 containerd[2021]: time="2025-09-05T23:53:49.255957234Z" level=info msg="CreateContainer within sandbox \"164b4b0e697789abcdd4089b361c88e5fd820f822c802492083c2cf14a3756a5\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 5 23:53:49.269741 containerd[2021]: time="2025-09-05T23:53:49.269666970Z" level=info msg="CreateContainer within sandbox \"418bc25fb6c33d2b868488c7fe5af49d94c0ec04d979c5e8b5c15278f411e23d\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 5 23:53:49.297231 containerd[2021]: time="2025-09-05T23:53:49.297150630Z" level=info msg="CreateContainer within sandbox \"5655339c6b5762c1447b80e5e75589c3ce40dd757844fce9795aa0b179be7584\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"f354ff90bc91ce4426e0250a1d6edeb9d4b64f14245090046ada83f2bd1315e7\"" Sep 5 23:53:49.298155 containerd[2021]: time="2025-09-05T23:53:49.298086750Z" level=info msg="StartContainer for \"f354ff90bc91ce4426e0250a1d6edeb9d4b64f14245090046ada83f2bd1315e7\"" Sep 5 23:53:49.320819 containerd[2021]: time="2025-09-05T23:53:49.320548362Z" level=info msg="CreateContainer within sandbox \"164b4b0e697789abcdd4089b361c88e5fd820f822c802492083c2cf14a3756a5\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"ba771ee0d4d66761dd1890d671d2a9345477126a558165fa3599114ec1a0122c\"" Sep 5 23:53:49.321850 containerd[2021]: time="2025-09-05T23:53:49.321795294Z" level=info msg="StartContainer for \"ba771ee0d4d66761dd1890d671d2a9345477126a558165fa3599114ec1a0122c\"" Sep 5 23:53:49.326443 containerd[2021]: time="2025-09-05T23:53:49.326386087Z" level=info msg="CreateContainer within sandbox \"418bc25fb6c33d2b868488c7fe5af49d94c0ec04d979c5e8b5c15278f411e23d\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"cf491c778f449126347361774265b862e053f56ce2cb4e3191d62721854afc84\"" Sep 5 23:53:49.328419 containerd[2021]: time="2025-09-05T23:53:49.327804043Z" level=info msg="StartContainer for \"cf491c778f449126347361774265b862e053f56ce2cb4e3191d62721854afc84\"" Sep 5 23:53:49.351156 systemd[1]: Started cri-containerd-f354ff90bc91ce4426e0250a1d6edeb9d4b64f14245090046ada83f2bd1315e7.scope - libcontainer container f354ff90bc91ce4426e0250a1d6edeb9d4b64f14245090046ada83f2bd1315e7. Sep 5 23:53:49.399950 kubelet[2869]: I0905 23:53:49.399850 2869 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-18-129" Sep 5 23:53:49.401562 kubelet[2869]: E0905 23:53:49.401433 2869 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.31.18.129:6443/api/v1/nodes\": dial tcp 172.31.18.129:6443: connect: connection refused" node="ip-172-31-18-129" Sep 5 23:53:49.412875 systemd[1]: Started cri-containerd-cf491c778f449126347361774265b862e053f56ce2cb4e3191d62721854afc84.scope - libcontainer container cf491c778f449126347361774265b862e053f56ce2cb4e3191d62721854afc84. Sep 5 23:53:49.428920 systemd[1]: Started cri-containerd-ba771ee0d4d66761dd1890d671d2a9345477126a558165fa3599114ec1a0122c.scope - libcontainer container ba771ee0d4d66761dd1890d671d2a9345477126a558165fa3599114ec1a0122c. 
Sep 5 23:53:49.502291 containerd[2021]: time="2025-09-05T23:53:49.502089319Z" level=info msg="StartContainer for \"f354ff90bc91ce4426e0250a1d6edeb9d4b64f14245090046ada83f2bd1315e7\" returns successfully" Sep 5 23:53:49.536984 containerd[2021]: time="2025-09-05T23:53:49.536817800Z" level=info msg="StartContainer for \"ba771ee0d4d66761dd1890d671d2a9345477126a558165fa3599114ec1a0122c\" returns successfully" Sep 5 23:53:49.592958 containerd[2021]: time="2025-09-05T23:53:49.592779476Z" level=info msg="StartContainer for \"cf491c778f449126347361774265b862e053f56ce2cb4e3191d62721854afc84\" returns successfully" Sep 5 23:53:49.930302 update_engine[1998]: I20250905 23:53:49.927559 1998 update_attempter.cc:509] Updating boot flags... Sep 5 23:53:50.036591 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 37 scanned by (udev-worker) (3152) Sep 5 23:53:51.006480 kubelet[2869]: I0905 23:53:51.004017 2869 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-18-129" Sep 5 23:53:53.309834 kubelet[2869]: E0905 23:53:53.309773 2869 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-172-31-18-129\" not found" node="ip-172-31-18-129" Sep 5 23:53:53.355073 kubelet[2869]: I0905 23:53:53.354991 2869 kubelet_node_status.go:75] "Successfully registered node" node="ip-172-31-18-129" Sep 5 23:53:53.747362 kubelet[2869]: I0905 23:53:53.747206 2869 apiserver.go:52] "Watching apiserver" Sep 5 23:53:53.775717 kubelet[2869]: I0905 23:53:53.775660 2869 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 5 23:53:55.340072 systemd[1]: Reloading requested from client PID 3239 ('systemctl') (unit session-9.scope)... Sep 5 23:53:55.340099 systemd[1]: Reloading... Sep 5 23:53:55.519588 zram_generator::config[3280]: No configuration found. Sep 5 23:53:55.765275 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 5 23:53:55.972724 systemd[1]: Reloading finished in 631 ms. Sep 5 23:53:56.052752 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 5 23:53:56.064453 systemd[1]: kubelet.service: Deactivated successfully. Sep 5 23:53:56.065163 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 5 23:53:56.065375 systemd[1]: kubelet.service: Consumed 1.646s CPU time, 129.2M memory peak, 0B memory swap peak. Sep 5 23:53:56.079757 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 5 23:53:56.407817 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 5 23:53:56.411923 (kubelet)[3339]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 5 23:53:56.488453 kubelet[3339]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 5 23:53:56.488453 kubelet[3339]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 5 23:53:56.488453 kubelet[3339]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 5 23:53:56.489055 kubelet[3339]: I0905 23:53:56.488950 3339 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 5 23:53:56.507241 kubelet[3339]: I0905 23:53:56.507172 3339 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 5 23:53:56.507241 kubelet[3339]: I0905 23:53:56.507230 3339 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 5 23:53:56.509565 kubelet[3339]: I0905 23:53:56.509389 3339 server.go:934] "Client rotation is on, will bootstrap in background" Sep 5 23:53:56.514546 kubelet[3339]: I0905 23:53:56.514106 3339 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 5 23:53:56.520621 kubelet[3339]: I0905 23:53:56.520567 3339 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 5 23:53:56.530572 kubelet[3339]: E0905 23:53:56.529746 3339 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Sep 5 23:53:56.530572 kubelet[3339]: I0905 23:53:56.529808 3339 server.go:1408] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Sep 5 23:53:56.536471 kubelet[3339]: I0905 23:53:56.536420 3339 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 5 23:53:56.536904 kubelet[3339]: I0905 23:53:56.536680 3339 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 5 23:53:56.537001 kubelet[3339]: I0905 23:53:56.536906 3339 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 5 23:53:56.537568 kubelet[3339]: I0905 23:53:56.536954 3339 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-172-31-18-129","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 5 23:53:56.537568 kubelet[3339]: I0905 23:53:56.537249 3339 topology_manager.go:138] "Creating topology manager with none policy" Sep 5 23:53:56.537568 kubelet[3339]: I0905 23:53:56.537270 3339 container_manager_linux.go:300] "Creating device plugin manager" Sep 5 23:53:56.537568 kubelet[3339]: I0905 23:53:56.537358 3339 state_mem.go:36] "Initialized new in-memory state store" Sep 5 23:53:56.539131 kubelet[3339]: I0905 23:53:56.537848 3339 kubelet.go:408] "Attempting to sync node with API server" Sep 5 23:53:56.539131 kubelet[3339]: I0905 23:53:56.537882 3339 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 5 23:53:56.539131 kubelet[3339]: I0905 23:53:56.537914 3339 kubelet.go:314] "Adding apiserver pod source" Sep 5 23:53:56.539131 kubelet[3339]: I0905 23:53:56.537953 3339 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 5 23:53:56.541560 kubelet[3339]: I0905 23:53:56.540806 3339 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Sep 5 23:53:56.544609 kubelet[3339]: I0905 23:53:56.542484 3339 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 5 23:53:56.544609 kubelet[3339]: I0905 23:53:56.544364 3339 server.go:1274] "Started kubelet" Sep 5 23:53:56.552335 kubelet[3339]: I0905 23:53:56.552296 3339 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 5 23:53:56.561687 kubelet[3339]: I0905 23:53:56.561619 3339 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 5 23:53:56.567273 kubelet[3339]: I0905 23:53:56.567232 3339 server.go:449] "Adding debug handlers to kubelet server" Sep 5 23:53:56.578805 kubelet[3339]: I0905 23:53:56.576872 3339 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 5 23:53:56.579357 kubelet[3339]: I0905 23:53:56.579327 3339 server.go:236] "Starting to serve the podresources API" 
endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 5 23:53:56.580005 kubelet[3339]: I0905 23:53:56.579967 3339 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 5 23:53:56.587639 kubelet[3339]: I0905 23:53:56.585886 3339 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 5 23:53:56.591401 kubelet[3339]: E0905 23:53:56.591100 3339 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ip-172-31-18-129\" not found" Sep 5 23:53:56.593389 kubelet[3339]: I0905 23:53:56.593346 3339 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 5 23:53:56.593964 kubelet[3339]: I0905 23:53:56.593935 3339 reconciler.go:26] "Reconciler: start to sync state" Sep 5 23:53:56.621251 kubelet[3339]: I0905 23:53:56.620770 3339 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 5 23:53:56.643272 kubelet[3339]: I0905 23:53:56.643189 3339 factory.go:221] Registration of the containerd container factory successfully Sep 5 23:53:56.643272 kubelet[3339]: I0905 23:53:56.643228 3339 factory.go:221] Registration of the systemd container factory successfully Sep 5 23:53:56.646768 kubelet[3339]: I0905 23:53:56.645204 3339 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 5 23:53:56.652260 kubelet[3339]: I0905 23:53:56.652216 3339 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 5 23:53:56.653017 kubelet[3339]: I0905 23:53:56.652471 3339 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 5 23:53:56.653017 kubelet[3339]: I0905 23:53:56.652544 3339 kubelet.go:2321] "Starting kubelet main sync loop" Sep 5 23:53:56.653017 kubelet[3339]: E0905 23:53:56.652616 3339 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 5 23:53:56.682111 kubelet[3339]: E0905 23:53:56.681971 3339 kubelet.go:1478] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 5 23:53:56.754616 kubelet[3339]: E0905 23:53:56.754502 3339 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 5 23:53:56.794460 kubelet[3339]: I0905 23:53:56.794423 3339 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 5 23:53:56.794711 kubelet[3339]: I0905 23:53:56.794679 3339 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 5 23:53:56.794840 kubelet[3339]: I0905 23:53:56.794820 3339 state_mem.go:36] "Initialized new in-memory state store" Sep 5 23:53:56.795894 kubelet[3339]: I0905 23:53:56.795683 3339 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 5 23:53:56.795894 kubelet[3339]: I0905 23:53:56.795711 3339 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 5 23:53:56.795894 kubelet[3339]: I0905 23:53:56.795745 3339 policy_none.go:49] "None policy: Start" Sep 5 23:53:56.799554 kubelet[3339]: I0905 23:53:56.799068 3339 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 5 23:53:56.799554 kubelet[3339]: I0905 23:53:56.799115 3339 state_mem.go:35] "Initializing new in-memory state store" Sep 5 23:53:56.799554 kubelet[3339]: I0905 23:53:56.799364 3339 state_mem.go:75] "Updated machine memory state" Sep 5 23:53:56.812247 kubelet[3339]: I0905 23:53:56.812191 3339 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 5 23:53:56.813840 kubelet[3339]: I0905 23:53:56.813422 3339 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 5 23:53:56.815649 kubelet[3339]: I0905 23:53:56.814109 3339 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 5 23:53:56.816569 kubelet[3339]: I0905 23:53:56.816306 3339 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 5 23:53:56.933697 kubelet[3339]: I0905 23:53:56.932915 3339 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-18-129" Sep 5 23:53:56.974560 kubelet[3339]: I0905 23:53:56.974343 3339 kubelet_node_status.go:111] "Node was previously registered" node="ip-172-31-18-129" Sep 5 23:53:56.974560 kubelet[3339]: I0905 23:53:56.974453 3339 kubelet_node_status.go:75] "Successfully registered node" node="ip-172-31-18-129" Sep 5 23:53:56.995434 kubelet[3339]: I0905 23:53:56.994926 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c1dbdd00ed7fdddcc76fb0096da8b484-kubeconfig\") pod \"kube-scheduler-ip-172-31-18-129\" (UID: \"c1dbdd00ed7fdddcc76fb0096da8b484\") " pod="kube-system/kube-scheduler-ip-172-31-18-129" Sep 5 23:53:56.995434 kubelet[3339]: I0905 23:53:56.994988 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ac663395d1a7f1f73b7bbd30200fcd83-ca-certs\") pod \"kube-apiserver-ip-172-31-18-129\" (UID: \"ac663395d1a7f1f73b7bbd30200fcd83\") " pod="kube-system/kube-apiserver-ip-172-31-18-129" Sep 5 23:53:56.995434 kubelet[3339]: I0905 23:53:56.995029 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ac663395d1a7f1f73b7bbd30200fcd83-k8s-certs\") pod \"kube-apiserver-ip-172-31-18-129\" (UID: \"ac663395d1a7f1f73b7bbd30200fcd83\") " 
pod="kube-system/kube-apiserver-ip-172-31-18-129" Sep 5 23:53:56.995434 kubelet[3339]: I0905 23:53:56.995068 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ac663395d1a7f1f73b7bbd30200fcd83-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-18-129\" (UID: \"ac663395d1a7f1f73b7bbd30200fcd83\") " pod="kube-system/kube-apiserver-ip-172-31-18-129" Sep 5 23:53:56.995434 kubelet[3339]: I0905 23:53:56.995111 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7964b431c5313d3ae842a4ca6ecf102a-ca-certs\") pod \"kube-controller-manager-ip-172-31-18-129\" (UID: \"7964b431c5313d3ae842a4ca6ecf102a\") " pod="kube-system/kube-controller-manager-ip-172-31-18-129" Sep 5 23:53:56.995848 kubelet[3339]: I0905 23:53:56.995148 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7964b431c5313d3ae842a4ca6ecf102a-k8s-certs\") pod \"kube-controller-manager-ip-172-31-18-129\" (UID: \"7964b431c5313d3ae842a4ca6ecf102a\") " pod="kube-system/kube-controller-manager-ip-172-31-18-129" Sep 5 23:53:56.995848 kubelet[3339]: I0905 23:53:56.995184 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7964b431c5313d3ae842a4ca6ecf102a-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-18-129\" (UID: \"7964b431c5313d3ae842a4ca6ecf102a\") " pod="kube-system/kube-controller-manager-ip-172-31-18-129" Sep 5 23:53:56.995848 kubelet[3339]: I0905 23:53:56.995220 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7964b431c5313d3ae842a4ca6ecf102a-kubeconfig\") pod \"kube-controller-manager-ip-172-31-18-129\" (UID: \"7964b431c5313d3ae842a4ca6ecf102a\") " pod="kube-system/kube-controller-manager-ip-172-31-18-129" Sep 5 23:53:56.995848 kubelet[3339]: I0905 23:53:56.995262 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7964b431c5313d3ae842a4ca6ecf102a-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-18-129\" (UID: \"7964b431c5313d3ae842a4ca6ecf102a\") " pod="kube-system/kube-controller-manager-ip-172-31-18-129" Sep 5 23:53:57.539202 kubelet[3339]: I0905 23:53:57.538855 3339 apiserver.go:52] "Watching apiserver" Sep 5 23:53:57.594214 kubelet[3339]: I0905 23:53:57.594128 3339 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 5 23:53:57.740948 kubelet[3339]: E0905 23:53:57.740643 3339 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ip-172-31-18-129\" already exists" pod="kube-system/kube-apiserver-ip-172-31-18-129" Sep 5 23:53:57.807168 kubelet[3339]: I0905 23:53:57.806789 3339 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ip-172-31-18-129" podStartSLOduration=1.806765585 podStartE2EDuration="1.806765585s" podCreationTimestamp="2025-09-05 23:53:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-05 23:53:57.786873677 +0000 UTC m=+1.367936336" 
watchObservedRunningTime="2025-09-05 23:53:57.806765585 +0000 UTC m=+1.387828208" Sep 5 23:53:57.828565 kubelet[3339]: I0905 23:53:57.826617 3339 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ip-172-31-18-129" podStartSLOduration=1.826595609 podStartE2EDuration="1.826595609s" podCreationTimestamp="2025-09-05 23:53:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-05 23:53:57.807198029 +0000 UTC m=+1.388260664" watchObservedRunningTime="2025-09-05 23:53:57.826595609 +0000 UTC m=+1.407658244" Sep 5 23:53:57.845154 kubelet[3339]: I0905 23:53:57.845080 3339 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ip-172-31-18-129" podStartSLOduration=1.845056961 podStartE2EDuration="1.845056961s" podCreationTimestamp="2025-09-05 23:53:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-05 23:53:57.827053685 +0000 UTC m=+1.408116332" watchObservedRunningTime="2025-09-05 23:53:57.845056961 +0000 UTC m=+1.426119584" Sep 5 23:54:01.608195 kubelet[3339]: I0905 23:54:01.607909 3339 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 5 23:54:01.610658 kubelet[3339]: I0905 23:54:01.609411 3339 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 5 23:54:01.610770 containerd[2021]: time="2025-09-05T23:54:01.609099920Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 5 23:54:02.607843 systemd[1]: Created slice kubepods-besteffort-podff66f6c3_ec1b_4ef6_bd85_7446d7b8a3c7.slice - libcontainer container kubepods-besteffort-podff66f6c3_ec1b_4ef6_bd85_7446d7b8a3c7.slice. 
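
The runtime-config update above pushes a pod CIDR of 192.168.0.0/24 to the container runtime (the original CIDR was empty, so this is the node's first allocation). A minimal sketch of validating such a CIDR and sizing its address space, using only the standard library; this is illustrative, not kubelet code:

    package main

    import (
        "fmt"
        "net"
    )

    func main() {
        // The CIDR value is taken verbatim from the kubelet_network.go entry.
        _, cidr, err := net.ParseCIDR("192.168.0.0/24")
        if err != nil {
            panic(err)
        }
        ones, bits := cidr.Mask.Size()
        fmt.Printf("pod CIDR %s: %d host bits -> up to %d pod IPs on this node\n",
            cidr, bits-ones, 1<<(bits-ones))
    }

For a /24 this prints 8 host bits and up to 256 addresses, which bounds how many pod IPs this node can hand out.
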
Sep 5 23:54:02.637216 kubelet[3339]: I0905 23:54:02.637148 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ff66f6c3-ec1b-4ef6-bd85-7446d7b8a3c7-lib-modules\") pod \"kube-proxy-hpwdt\" (UID: \"ff66f6c3-ec1b-4ef6-bd85-7446d7b8a3c7\") " pod="kube-system/kube-proxy-hpwdt" Sep 5 23:54:02.637216 kubelet[3339]: I0905 23:54:02.637218 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlnsn\" (UniqueName: \"kubernetes.io/projected/ff66f6c3-ec1b-4ef6-bd85-7446d7b8a3c7-kube-api-access-mlnsn\") pod \"kube-proxy-hpwdt\" (UID: \"ff66f6c3-ec1b-4ef6-bd85-7446d7b8a3c7\") " pod="kube-system/kube-proxy-hpwdt" Sep 5 23:54:02.637882 kubelet[3339]: I0905 23:54:02.637281 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/ff66f6c3-ec1b-4ef6-bd85-7446d7b8a3c7-kube-proxy\") pod \"kube-proxy-hpwdt\" (UID: \"ff66f6c3-ec1b-4ef6-bd85-7446d7b8a3c7\") " pod="kube-system/kube-proxy-hpwdt" Sep 5 23:54:02.637882 kubelet[3339]: I0905 23:54:02.637320 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ff66f6c3-ec1b-4ef6-bd85-7446d7b8a3c7-xtables-lock\") pod \"kube-proxy-hpwdt\" (UID: \"ff66f6c3-ec1b-4ef6-bd85-7446d7b8a3c7\") " pod="kube-system/kube-proxy-hpwdt" Sep 5 23:54:02.793942 systemd[1]: Created slice kubepods-besteffort-pod6d1cfcad_ca17_4937_a600_96cb0524911d.slice - libcontainer container kubepods-besteffort-pod6d1cfcad_ca17_4937_a600_96cb0524911d.slice. Sep 5 23:54:02.839120 kubelet[3339]: I0905 23:54:02.839040 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/6d1cfcad-ca17-4937-a600-96cb0524911d-var-lib-calico\") pod \"tigera-operator-58fc44c59b-bszf7\" (UID: \"6d1cfcad-ca17-4937-a600-96cb0524911d\") " pod="tigera-operator/tigera-operator-58fc44c59b-bszf7" Sep 5 23:54:02.839120 kubelet[3339]: I0905 23:54:02.839117 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2db2r\" (UniqueName: \"kubernetes.io/projected/6d1cfcad-ca17-4937-a600-96cb0524911d-kube-api-access-2db2r\") pod \"tigera-operator-58fc44c59b-bszf7\" (UID: \"6d1cfcad-ca17-4937-a600-96cb0524911d\") " pod="tigera-operator/tigera-operator-58fc44c59b-bszf7" Sep 5 23:54:02.920875 containerd[2021]: time="2025-09-05T23:54:02.920076586Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hpwdt,Uid:ff66f6c3-ec1b-4ef6-bd85-7446d7b8a3c7,Namespace:kube-system,Attempt:0,}" Sep 5 23:54:02.984748 containerd[2021]: time="2025-09-05T23:54:02.984545446Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:54:02.984748 containerd[2021]: time="2025-09-05T23:54:02.984693334Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:54:02.985069 containerd[2021]: time="2025-09-05T23:54:02.984979186Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:02.985386 containerd[2021]: time="2025-09-05T23:54:02.985301098Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:03.034895 systemd[1]: Started cri-containerd-d40a9652941c9279336e22c6e567123147f6f4a1e3707d7135b564373c8da5b3.scope - libcontainer container d40a9652941c9279336e22c6e567123147f6f4a1e3707d7135b564373c8da5b3. Sep 5 23:54:03.074457 containerd[2021]: time="2025-09-05T23:54:03.074282455Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hpwdt,Uid:ff66f6c3-ec1b-4ef6-bd85-7446d7b8a3c7,Namespace:kube-system,Attempt:0,} returns sandbox id \"d40a9652941c9279336e22c6e567123147f6f4a1e3707d7135b564373c8da5b3\"" Sep 5 23:54:03.080773 containerd[2021]: time="2025-09-05T23:54:03.080554639Z" level=info msg="CreateContainer within sandbox \"d40a9652941c9279336e22c6e567123147f6f4a1e3707d7135b564373c8da5b3\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 5 23:54:03.101757 containerd[2021]: time="2025-09-05T23:54:03.101700295Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-bszf7,Uid:6d1cfcad-ca17-4937-a600-96cb0524911d,Namespace:tigera-operator,Attempt:0,}" Sep 5 23:54:03.116911 containerd[2021]: time="2025-09-05T23:54:03.116825671Z" level=info msg="CreateContainer within sandbox \"d40a9652941c9279336e22c6e567123147f6f4a1e3707d7135b564373c8da5b3\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"02a62c8fa65226f167acc27792cc23fbffbf9b7f3e3296f2944e8a0bcebdb750\"" Sep 5 23:54:03.119869 containerd[2021]: time="2025-09-05T23:54:03.119722711Z" level=info msg="StartContainer for \"02a62c8fa65226f167acc27792cc23fbffbf9b7f3e3296f2944e8a0bcebdb750\"" Sep 5 23:54:03.180723 containerd[2021]: time="2025-09-05T23:54:03.179315155Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:54:03.180723 containerd[2021]: time="2025-09-05T23:54:03.180483343Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:54:03.180723 containerd[2021]: time="2025-09-05T23:54:03.180513079Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:03.181416 systemd[1]: Started cri-containerd-02a62c8fa65226f167acc27792cc23fbffbf9b7f3e3296f2944e8a0bcebdb750.scope - libcontainer container 02a62c8fa65226f167acc27792cc23fbffbf9b7f3e3296f2944e8a0bcebdb750. Sep 5 23:54:03.183876 containerd[2021]: time="2025-09-05T23:54:03.182157427Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:03.221189 systemd[1]: Started cri-containerd-e1a78c6af4c1c460e87ead1aac6aa097802849f4cf793d2cbe38b8139cd70f73.scope - libcontainer container e1a78c6af4c1c460e87ead1aac6aa097802849f4cf793d2cbe38b8139cd70f73. 
Sep 5 23:54:03.273010 containerd[2021]: time="2025-09-05T23:54:03.272935928Z" level=info msg="StartContainer for \"02a62c8fa65226f167acc27792cc23fbffbf9b7f3e3296f2944e8a0bcebdb750\" returns successfully" Sep 5 23:54:03.318994 containerd[2021]: time="2025-09-05T23:54:03.318868976Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-bszf7,Uid:6d1cfcad-ca17-4937-a600-96cb0524911d,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"e1a78c6af4c1c460e87ead1aac6aa097802849f4cf793d2cbe38b8139cd70f73\"" Sep 5 23:54:03.324555 containerd[2021]: time="2025-09-05T23:54:03.323995484Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 5 23:54:03.772152 kubelet[3339]: I0905 23:54:03.771365 3339 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-hpwdt" podStartSLOduration=1.771340102 podStartE2EDuration="1.771340102s" podCreationTimestamp="2025-09-05 23:54:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-05 23:54:03.770987074 +0000 UTC m=+7.352049709" watchObservedRunningTime="2025-09-05 23:54:03.771340102 +0000 UTC m=+7.352402761" Sep 5 23:54:03.788743 systemd[1]: run-containerd-runc-k8s.io-d40a9652941c9279336e22c6e567123147f6f4a1e3707d7135b564373c8da5b3-runc.StCnXM.mount: Deactivated successfully. Sep 5 23:54:04.778312 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2371959684.mount: Deactivated successfully. Sep 5 23:54:05.504650 containerd[2021]: time="2025-09-05T23:54:05.503950895Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:54:05.506380 containerd[2021]: time="2025-09-05T23:54:05.506328371Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365" Sep 5 23:54:05.508612 containerd[2021]: time="2025-09-05T23:54:05.508076219Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:54:05.513585 containerd[2021]: time="2025-09-05T23:54:05.513488891Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:54:05.515204 containerd[2021]: time="2025-09-05T23:54:05.515149907Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 2.191088795s" Sep 5 23:54:05.515387 containerd[2021]: time="2025-09-05T23:54:05.515340011Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\"" Sep 5 23:54:05.522627 containerd[2021]: time="2025-09-05T23:54:05.522566555Z" level=info msg="CreateContainer within sandbox \"e1a78c6af4c1c460e87ead1aac6aa097802849f4cf793d2cbe38b8139cd70f73\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 5 23:54:05.548914 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2705480896.mount: Deactivated successfully. 
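[editor's note: the "Pulled image ... in 2.191088795s" event above is the handiest journal line for scraping pull metrics. A small sketch of extracting the image reference and duration follows; the regex is mine, written against the message shape seen in this journal (after un-escaping the inner quotes), not any containerd-guaranteed format.]

```go
// Sketch: scrape image name and pull duration from a containerd
// "Pulled image" message as it appears in this journal.
package main

import (
	"fmt"
	"regexp"
	"time"
)

var pulled = regexp.MustCompile(`Pulled image "([^"]+)" .* in ([0-9.]+s)`)

func main() {
	// Sample message reconstructed from the log (inner quotes un-escaped).
	msg := `Pulled image "quay.io/tigera/operator:v1.38.6" with image id "sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f" in 2.191088795s`

	m := pulled.FindStringSubmatch(msg)
	if m == nil {
		fmt.Println("no match")
		return
	}
	d, err := time.ParseDuration(m[2]) // "2.191088795s" parses directly
	if err != nil {
		fmt.Println("bad duration:", err)
		return
	}
	fmt.Printf("image=%s pull_duration=%v\n", m[1], d)
}
```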
Sep 5 23:54:05.553292 containerd[2021]: time="2025-09-05T23:54:05.552972215Z" level=info msg="CreateContainer within sandbox \"e1a78c6af4c1c460e87ead1aac6aa097802849f4cf793d2cbe38b8139cd70f73\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"5677ed74fc3c12e2473aa1e907d4f3244d11a9b7df6657e1fd1a7864e04f1a15\"" Sep 5 23:54:05.555144 containerd[2021]: time="2025-09-05T23:54:05.555097331Z" level=info msg="StartContainer for \"5677ed74fc3c12e2473aa1e907d4f3244d11a9b7df6657e1fd1a7864e04f1a15\"" Sep 5 23:54:05.609832 systemd[1]: Started cri-containerd-5677ed74fc3c12e2473aa1e907d4f3244d11a9b7df6657e1fd1a7864e04f1a15.scope - libcontainer container 5677ed74fc3c12e2473aa1e907d4f3244d11a9b7df6657e1fd1a7864e04f1a15. Sep 5 23:54:05.656179 containerd[2021]: time="2025-09-05T23:54:05.656103384Z" level=info msg="StartContainer for \"5677ed74fc3c12e2473aa1e907d4f3244d11a9b7df6657e1fd1a7864e04f1a15\" returns successfully" Sep 5 23:54:06.673084 kubelet[3339]: I0905 23:54:06.672986 3339 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-bszf7" podStartSLOduration=2.478376818 podStartE2EDuration="4.672965665s" podCreationTimestamp="2025-09-05 23:54:02 +0000 UTC" firstStartedPulling="2025-09-05 23:54:03.322575548 +0000 UTC m=+6.903638171" lastFinishedPulling="2025-09-05 23:54:05.517164407 +0000 UTC m=+9.098227018" observedRunningTime="2025-09-05 23:54:05.78656496 +0000 UTC m=+9.367627619" watchObservedRunningTime="2025-09-05 23:54:06.672965665 +0000 UTC m=+10.254028300" Sep 5 23:54:14.165897 sudo[2355]: pam_unix(sudo:session): session closed for user root Sep 5 23:54:14.190442 sshd[2352]: pam_unix(sshd:session): session closed for user core Sep 5 23:54:14.198172 systemd[1]: sshd@8-172.31.18.129:22-139.178.68.195:38810.service: Deactivated successfully. Sep 5 23:54:14.203314 systemd[1]: session-9.scope: Deactivated successfully. Sep 5 23:54:14.204463 systemd[1]: session-9.scope: Consumed 12.494s CPU time, 149.9M memory peak, 0B memory swap peak. Sep 5 23:54:14.209624 systemd-logind[1997]: Session 9 logged out. Waiting for processes to exit. Sep 5 23:54:14.213479 systemd-logind[1997]: Removed session 9. Sep 5 23:54:26.651287 systemd[1]: Created slice kubepods-besteffort-podeaccab41_7fae_4200_bf3c_b4d9b8e9a194.slice - libcontainer container kubepods-besteffort-podeaccab41_7fae_4200_bf3c_b4d9b8e9a194.slice. 
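[editor's note: in the tigera-operator startup record above, podStartE2EDuration minus podStartSLOduration (4.672965665s - 2.478376818s = 2.194588847s) matches the pull window lastFinishedPulling - firstStartedPulling almost exactly, consistent with the SLO figure excluding image-pull time (my reading of the tracker's output; the kubelet source is authoritative). A short sketch checking that arithmetic with Go's time parsing, using the timestamps copied from the log:]

```go
// Sketch: verify that (podStartE2EDuration - podStartSLOduration)
// equals the image-pull window reported in the same log line.
package main

import (
	"fmt"
	"time"
)

// Layout matching "2025-09-05 23:54:03.322575548 +0000 UTC" as logged.
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func must(t time.Time, err error) time.Time {
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	firstPull := must(time.Parse(layout, "2025-09-05 23:54:03.322575548 +0000 UTC"))
	lastPull := must(time.Parse(layout, "2025-09-05 23:54:05.517164407 +0000 UTC"))

	e2e := 4672965665 * time.Nanosecond // podStartE2EDuration=4.672965665s
	slo := 2478376818 * time.Nanosecond // podStartSLOduration=2.478376818s

	fmt.Println("pull window:", lastPull.Sub(firstPull)) // ~2.194588859s
	fmt.Println("e2e - slo:  ", e2e-slo)                 // 2.194588847s
}
```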
Sep 5 23:54:26.702570 kubelet[3339]: I0905 23:54:26.702257 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eaccab41-7fae-4200-bf3c-b4d9b8e9a194-tigera-ca-bundle\") pod \"calico-typha-5d847755f9-5rfbw\" (UID: \"eaccab41-7fae-4200-bf3c-b4d9b8e9a194\") " pod="calico-system/calico-typha-5d847755f9-5rfbw" Sep 5 23:54:26.702570 kubelet[3339]: I0905 23:54:26.702333 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/eaccab41-7fae-4200-bf3c-b4d9b8e9a194-typha-certs\") pod \"calico-typha-5d847755f9-5rfbw\" (UID: \"eaccab41-7fae-4200-bf3c-b4d9b8e9a194\") " pod="calico-system/calico-typha-5d847755f9-5rfbw" Sep 5 23:54:26.702570 kubelet[3339]: I0905 23:54:26.702380 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmrtc\" (UniqueName: \"kubernetes.io/projected/eaccab41-7fae-4200-bf3c-b4d9b8e9a194-kube-api-access-lmrtc\") pod \"calico-typha-5d847755f9-5rfbw\" (UID: \"eaccab41-7fae-4200-bf3c-b4d9b8e9a194\") " pod="calico-system/calico-typha-5d847755f9-5rfbw" Sep 5 23:54:26.963946 containerd[2021]: time="2025-09-05T23:54:26.963411669Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5d847755f9-5rfbw,Uid:eaccab41-7fae-4200-bf3c-b4d9b8e9a194,Namespace:calico-system,Attempt:0,}" Sep 5 23:54:27.058704 containerd[2021]: time="2025-09-05T23:54:27.053464290Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:54:27.064182 containerd[2021]: time="2025-09-05T23:54:27.060184806Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:54:27.064182 containerd[2021]: time="2025-09-05T23:54:27.063673494Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:27.064705 containerd[2021]: time="2025-09-05T23:54:27.064486038Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:27.105547 kubelet[3339]: I0905 23:54:27.104168 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/98bfe547-aa27-459a-ab76-a1e2797a7553-flexvol-driver-host\") pod \"calico-node-476k5\" (UID: \"98bfe547-aa27-459a-ab76-a1e2797a7553\") " pod="calico-system/calico-node-476k5" Sep 5 23:54:27.105547 kubelet[3339]: I0905 23:54:27.104235 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/98bfe547-aa27-459a-ab76-a1e2797a7553-node-certs\") pod \"calico-node-476k5\" (UID: \"98bfe547-aa27-459a-ab76-a1e2797a7553\") " pod="calico-system/calico-node-476k5" Sep 5 23:54:27.105547 kubelet[3339]: I0905 23:54:27.104276 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/98bfe547-aa27-459a-ab76-a1e2797a7553-policysync\") pod \"calico-node-476k5\" (UID: \"98bfe547-aa27-459a-ab76-a1e2797a7553\") " pod="calico-system/calico-node-476k5" Sep 5 23:54:27.105547 kubelet[3339]: I0905 23:54:27.104315 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/98bfe547-aa27-459a-ab76-a1e2797a7553-var-lib-calico\") pod \"calico-node-476k5\" (UID: \"98bfe547-aa27-459a-ab76-a1e2797a7553\") " pod="calico-system/calico-node-476k5" Sep 5 23:54:27.105547 kubelet[3339]: I0905 23:54:27.104352 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4snpp\" (UniqueName: \"kubernetes.io/projected/98bfe547-aa27-459a-ab76-a1e2797a7553-kube-api-access-4snpp\") pod \"calico-node-476k5\" (UID: \"98bfe547-aa27-459a-ab76-a1e2797a7553\") " pod="calico-system/calico-node-476k5" Sep 5 23:54:27.105926 kubelet[3339]: I0905 23:54:27.104392 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/98bfe547-aa27-459a-ab76-a1e2797a7553-lib-modules\") pod \"calico-node-476k5\" (UID: \"98bfe547-aa27-459a-ab76-a1e2797a7553\") " pod="calico-system/calico-node-476k5" Sep 5 23:54:27.105926 kubelet[3339]: I0905 23:54:27.104425 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/98bfe547-aa27-459a-ab76-a1e2797a7553-cni-bin-dir\") pod \"calico-node-476k5\" (UID: \"98bfe547-aa27-459a-ab76-a1e2797a7553\") " pod="calico-system/calico-node-476k5" Sep 5 23:54:27.105926 kubelet[3339]: I0905 23:54:27.104464 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/98bfe547-aa27-459a-ab76-a1e2797a7553-cni-log-dir\") pod \"calico-node-476k5\" (UID: \"98bfe547-aa27-459a-ab76-a1e2797a7553\") " pod="calico-system/calico-node-476k5" Sep 5 23:54:27.105926 kubelet[3339]: I0905 23:54:27.104642 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/98bfe547-aa27-459a-ab76-a1e2797a7553-cni-net-dir\") pod \"calico-node-476k5\" (UID: \"98bfe547-aa27-459a-ab76-a1e2797a7553\") " pod="calico-system/calico-node-476k5" Sep 5 23:54:27.105926 kubelet[3339]: 
I0905 23:54:27.104698 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98bfe547-aa27-459a-ab76-a1e2797a7553-tigera-ca-bundle\") pod \"calico-node-476k5\" (UID: \"98bfe547-aa27-459a-ab76-a1e2797a7553\") " pod="calico-system/calico-node-476k5" Sep 5 23:54:27.108561 kubelet[3339]: I0905 23:54:27.104750 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/98bfe547-aa27-459a-ab76-a1e2797a7553-var-run-calico\") pod \"calico-node-476k5\" (UID: \"98bfe547-aa27-459a-ab76-a1e2797a7553\") " pod="calico-system/calico-node-476k5" Sep 5 23:54:27.108561 kubelet[3339]: I0905 23:54:27.104800 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/98bfe547-aa27-459a-ab76-a1e2797a7553-xtables-lock\") pod \"calico-node-476k5\" (UID: \"98bfe547-aa27-459a-ab76-a1e2797a7553\") " pod="calico-system/calico-node-476k5" Sep 5 23:54:27.106319 systemd[1]: Created slice kubepods-besteffort-pod98bfe547_aa27_459a_ab76_a1e2797a7553.slice - libcontainer container kubepods-besteffort-pod98bfe547_aa27_459a_ab76_a1e2797a7553.slice. Sep 5 23:54:27.149891 systemd[1]: Started cri-containerd-ea9d7414d36633ab1fb6c5805ba9f5956ed38286cf565dde6712646877fbdc68.scope - libcontainer container ea9d7414d36633ab1fb6c5805ba9f5956ed38286cf565dde6712646877fbdc68. Sep 5 23:54:27.214496 kubelet[3339]: E0905 23:54:27.212771 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.214496 kubelet[3339]: W0905 23:54:27.212819 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.214496 kubelet[3339]: E0905 23:54:27.212858 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.214496 kubelet[3339]: E0905 23:54:27.213279 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.214496 kubelet[3339]: W0905 23:54:27.213302 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.214496 kubelet[3339]: E0905 23:54:27.213329 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.214496 kubelet[3339]: E0905 23:54:27.213693 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.214496 kubelet[3339]: W0905 23:54:27.213712 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.214496 kubelet[3339]: E0905 23:54:27.213734 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:54:27.215092 kubelet[3339]: E0905 23:54:27.214492 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.215092 kubelet[3339]: W0905 23:54:27.214585 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.215092 kubelet[3339]: E0905 23:54:27.214616 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.218566 kubelet[3339]: E0905 23:54:27.215433 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.218566 kubelet[3339]: W0905 23:54:27.215466 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.218566 kubelet[3339]: E0905 23:54:27.215494 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.218566 kubelet[3339]: E0905 23:54:27.216797 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.218566 kubelet[3339]: W0905 23:54:27.216823 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.218566 kubelet[3339]: E0905 23:54:27.216850 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.219015 kubelet[3339]: E0905 23:54:27.218602 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.219015 kubelet[3339]: W0905 23:54:27.218626 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.219015 kubelet[3339]: E0905 23:54:27.218652 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.219168 kubelet[3339]: E0905 23:54:27.219105 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.219168 kubelet[3339]: W0905 23:54:27.219122 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.219168 kubelet[3339]: E0905 23:54:27.219146 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:54:27.220415 kubelet[3339]: E0905 23:54:27.219473 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.220415 kubelet[3339]: W0905 23:54:27.219510 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.220415 kubelet[3339]: E0905 23:54:27.219607 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.220415 kubelet[3339]: E0905 23:54:27.219922 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.220415 kubelet[3339]: W0905 23:54:27.219941 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.220415 kubelet[3339]: E0905 23:54:27.219962 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.220415 kubelet[3339]: E0905 23:54:27.220240 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.220415 kubelet[3339]: W0905 23:54:27.220256 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.220415 kubelet[3339]: E0905 23:54:27.220276 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.220978 kubelet[3339]: E0905 23:54:27.220699 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.220978 kubelet[3339]: W0905 23:54:27.220718 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.220978 kubelet[3339]: E0905 23:54:27.220739 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.222237 kubelet[3339]: E0905 23:54:27.221742 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.222237 kubelet[3339]: W0905 23:54:27.221771 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.222237 kubelet[3339]: E0905 23:54:27.221801 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:54:27.222400 kubelet[3339]: E0905 23:54:27.222301 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.222400 kubelet[3339]: W0905 23:54:27.222321 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.222400 kubelet[3339]: E0905 23:54:27.222345 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.223439 kubelet[3339]: E0905 23:54:27.222843 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.223439 kubelet[3339]: W0905 23:54:27.222877 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.223439 kubelet[3339]: E0905 23:54:27.222907 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.226617 kubelet[3339]: E0905 23:54:27.226055 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.226617 kubelet[3339]: W0905 23:54:27.226548 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.226617 kubelet[3339]: E0905 23:54:27.226600 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.228965 kubelet[3339]: E0905 23:54:27.228910 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.228965 kubelet[3339]: W0905 23:54:27.228946 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.228965 kubelet[3339]: E0905 23:54:27.228991 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.229627 kubelet[3339]: E0905 23:54:27.229418 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.229627 kubelet[3339]: W0905 23:54:27.229448 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.229627 kubelet[3339]: E0905 23:54:27.229484 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:54:27.230120 kubelet[3339]: E0905 23:54:27.229842 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.230120 kubelet[3339]: W0905 23:54:27.229874 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.230120 kubelet[3339]: E0905 23:54:27.229897 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.230431 kubelet[3339]: E0905 23:54:27.230163 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.230431 kubelet[3339]: W0905 23:54:27.230179 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.230431 kubelet[3339]: E0905 23:54:27.230338 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.231063 kubelet[3339]: E0905 23:54:27.230579 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.231063 kubelet[3339]: W0905 23:54:27.230595 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.231063 kubelet[3339]: E0905 23:54:27.230834 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.231063 kubelet[3339]: E0905 23:54:27.230872 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.231063 kubelet[3339]: W0905 23:54:27.230886 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.231063 kubelet[3339]: E0905 23:54:27.230986 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:54:27.232106 kubelet[3339]: E0905 23:54:27.231266 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.232106 kubelet[3339]: W0905 23:54:27.231281 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.232106 kubelet[3339]: E0905 23:54:27.231632 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.232106 kubelet[3339]: E0905 23:54:27.231635 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.232106 kubelet[3339]: W0905 23:54:27.231647 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.232106 kubelet[3339]: E0905 23:54:27.231668 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.233508 kubelet[3339]: E0905 23:54:27.233122 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.233508 kubelet[3339]: W0905 23:54:27.233155 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.233508 kubelet[3339]: E0905 23:54:27.233201 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.234189 kubelet[3339]: E0905 23:54:27.233681 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.234189 kubelet[3339]: W0905 23:54:27.233701 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.234189 kubelet[3339]: E0905 23:54:27.233738 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.234912 kubelet[3339]: E0905 23:54:27.234676 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.234912 kubelet[3339]: W0905 23:54:27.234712 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.234912 kubelet[3339]: E0905 23:54:27.234880 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:54:27.235535 kubelet[3339]: E0905 23:54:27.235142 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.235535 kubelet[3339]: W0905 23:54:27.235173 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.235535 kubelet[3339]: E0905 23:54:27.235305 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.235535 kubelet[3339]: E0905 23:54:27.235477 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.235535 kubelet[3339]: W0905 23:54:27.235491 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.236653 kubelet[3339]: E0905 23:54:27.235901 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.237157 kubelet[3339]: E0905 23:54:27.237114 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.237442 kubelet[3339]: W0905 23:54:27.237194 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.237582 kubelet[3339]: E0905 23:54:27.237465 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.238600 kubelet[3339]: E0905 23:54:27.238551 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.238600 kubelet[3339]: W0905 23:54:27.238590 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.238773 kubelet[3339]: E0905 23:54:27.238637 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.240570 kubelet[3339]: E0905 23:54:27.239342 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.240570 kubelet[3339]: W0905 23:54:27.239423 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.240570 kubelet[3339]: E0905 23:54:27.239455 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:54:27.240570 kubelet[3339]: E0905 23:54:27.240167 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.240570 kubelet[3339]: W0905 23:54:27.240191 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.240570 kubelet[3339]: E0905 23:54:27.240217 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.246539 kubelet[3339]: E0905 23:54:27.245289 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.246539 kubelet[3339]: W0905 23:54:27.245331 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.246539 kubelet[3339]: E0905 23:54:27.245364 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.250285 kubelet[3339]: E0905 23:54:27.249659 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.250285 kubelet[3339]: W0905 23:54:27.249691 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.250285 kubelet[3339]: E0905 23:54:27.249721 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.322699 containerd[2021]: time="2025-09-05T23:54:27.322626931Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5d847755f9-5rfbw,Uid:eaccab41-7fae-4200-bf3c-b4d9b8e9a194,Namespace:calico-system,Attempt:0,} returns sandbox id \"ea9d7414d36633ab1fb6c5805ba9f5956ed38286cf565dde6712646877fbdc68\"" Sep 5 23:54:27.328840 containerd[2021]: time="2025-09-05T23:54:27.328761499Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 5 23:54:27.417852 containerd[2021]: time="2025-09-05T23:54:27.417611744Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-476k5,Uid:98bfe547-aa27-459a-ab76-a1e2797a7553,Namespace:calico-system,Attempt:0,}" Sep 5 23:54:27.445611 kubelet[3339]: E0905 23:54:27.442326 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xr256" podUID="c05df7ab-c8cc-4af6-b78b-e2da00b65212" Sep 5 23:54:27.494267 containerd[2021]: time="2025-09-05T23:54:27.492623480Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:54:27.494267 containerd[2021]: time="2025-09-05T23:54:27.492717440Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:54:27.494267 containerd[2021]: time="2025-09-05T23:54:27.492743684Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:27.494267 containerd[2021]: time="2025-09-05T23:54:27.492900740Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:27.499213 kubelet[3339]: E0905 23:54:27.494828 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.499213 kubelet[3339]: W0905 23:54:27.494856 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.499213 kubelet[3339]: E0905 23:54:27.494885 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.499213 kubelet[3339]: E0905 23:54:27.496793 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.499213 kubelet[3339]: W0905 23:54:27.496816 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.499213 kubelet[3339]: E0905 23:54:27.496844 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.499213 kubelet[3339]: E0905 23:54:27.497762 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.499213 kubelet[3339]: W0905 23:54:27.497785 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.499213 kubelet[3339]: E0905 23:54:27.497814 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.499213 kubelet[3339]: E0905 23:54:27.498128 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.499810 kubelet[3339]: W0905 23:54:27.498145 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.499810 kubelet[3339]: E0905 23:54:27.498168 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:54:27.499810 kubelet[3339]: E0905 23:54:27.498465 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.499810 kubelet[3339]: W0905 23:54:27.498483 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.499810 kubelet[3339]: E0905 23:54:27.498509 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.499810 kubelet[3339]: E0905 23:54:27.498862 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.499810 kubelet[3339]: W0905 23:54:27.498880 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.499810 kubelet[3339]: E0905 23:54:27.498900 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.504251 kubelet[3339]: E0905 23:54:27.504112 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.504419 kubelet[3339]: W0905 23:54:27.504284 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.504419 kubelet[3339]: E0905 23:54:27.504326 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.504881 kubelet[3339]: E0905 23:54:27.504826 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.504881 kubelet[3339]: W0905 23:54:27.504848 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.504881 kubelet[3339]: E0905 23:54:27.504872 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.507044 kubelet[3339]: E0905 23:54:27.506997 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.507044 kubelet[3339]: W0905 23:54:27.507040 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.507606 kubelet[3339]: E0905 23:54:27.507075 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:54:27.507911 kubelet[3339]: E0905 23:54:27.507869 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.507911 kubelet[3339]: W0905 23:54:27.507904 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.508028 kubelet[3339]: E0905 23:54:27.507935 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.510631 kubelet[3339]: E0905 23:54:27.510504 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.510631 kubelet[3339]: W0905 23:54:27.510620 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.511427 kubelet[3339]: E0905 23:54:27.510655 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.513104 kubelet[3339]: E0905 23:54:27.512709 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.513104 kubelet[3339]: W0905 23:54:27.512749 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.513104 kubelet[3339]: E0905 23:54:27.512783 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.515065 kubelet[3339]: E0905 23:54:27.514302 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.515065 kubelet[3339]: W0905 23:54:27.514390 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.515065 kubelet[3339]: E0905 23:54:27.514425 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.516574 kubelet[3339]: E0905 23:54:27.515102 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.516574 kubelet[3339]: W0905 23:54:27.515139 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.516574 kubelet[3339]: E0905 23:54:27.515198 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:54:27.516574 kubelet[3339]: E0905 23:54:27.515710 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.516574 kubelet[3339]: W0905 23:54:27.515732 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.516574 kubelet[3339]: E0905 23:54:27.515785 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.518297 kubelet[3339]: E0905 23:54:27.516469 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.518297 kubelet[3339]: W0905 23:54:27.517414 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.518297 kubelet[3339]: E0905 23:54:27.517447 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.519913 kubelet[3339]: E0905 23:54:27.519564 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.519913 kubelet[3339]: W0905 23:54:27.519611 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.519913 kubelet[3339]: E0905 23:54:27.519888 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.521195 kubelet[3339]: E0905 23:54:27.521145 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.521195 kubelet[3339]: W0905 23:54:27.521189 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.522058 kubelet[3339]: E0905 23:54:27.521221 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:54:27.522968 kubelet[3339]: E0905 23:54:27.522920 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:54:27.522968 kubelet[3339]: W0905 23:54:27.522962 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:54:27.523479 kubelet[3339]: E0905 23:54:27.522996 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 5 23:54:27.524282 kubelet[3339]: E0905 23:54:27.524232 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:54:27.524282 kubelet[3339]: W0905 23:54:27.524269 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:54:27.524639 kubelet[3339]: E0905 23:54:27.524300 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:54:27.526652 kubelet[3339]: I0905 23:54:27.526382 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c05df7ab-c8cc-4af6-b78b-e2da00b65212-socket-dir\") pod \"csi-node-driver-xr256\" (UID: \"c05df7ab-c8cc-4af6-b78b-e2da00b65212\") " pod="calico-system/csi-node-driver-xr256"
Sep 5 23:54:27.528817 kubelet[3339]: I0905 23:54:27.528334 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/c05df7ab-c8cc-4af6-b78b-e2da00b65212-varrun\") pod \"csi-node-driver-xr256\" (UID: \"c05df7ab-c8cc-4af6-b78b-e2da00b65212\") " pod="calico-system/csi-node-driver-xr256"
Sep 5 23:54:27.536327 kubelet[3339]: I0905 23:54:27.536125 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-929gw\" (UniqueName: \"kubernetes.io/projected/c05df7ab-c8cc-4af6-b78b-e2da00b65212-kube-api-access-929gw\") pod \"csi-node-driver-xr256\" (UID: \"c05df7ab-c8cc-4af6-b78b-e2da00b65212\") " pod="calico-system/csi-node-driver-xr256"
Sep 5 23:54:27.542879 kubelet[3339]: I0905 23:54:27.542636 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c05df7ab-c8cc-4af6-b78b-e2da00b65212-registration-dir\") pod \"csi-node-driver-xr256\" (UID: \"c05df7ab-c8cc-4af6-b78b-e2da00b65212\") " pod="calico-system/csi-node-driver-xr256"
Sep 5 23:54:27.549992 kubelet[3339]: I0905 23:54:27.549730 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c05df7ab-c8cc-4af6-b78b-e2da00b65212-kubelet-dir\") pod \"csi-node-driver-xr256\" (UID: \"c05df7ab-c8cc-4af6-b78b-e2da00b65212\") " pod="calico-system/csi-node-driver-xr256"
Sep 5 23:54:27.575828 systemd[1]: Started cri-containerd-013e02d5dcd76fcb0086826df17c0ce2839a6cbed4114e4658dfe3091adbd74e.scope - libcontainer container 013e02d5dcd76fcb0086826df17c0ce2839a6cbed4114e4658dfe3091adbd74e.
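
The FlexVolume probe failure above is one problem reported three ways: the driver binary at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds does not exist yet, so the exec fails ("executable file not found in $PATH"), the call therefore produces empty output, and unmarshalling "" as JSON fails with "unexpected end of JSON input". The kubelet emits the same triplet each time it rescans its plugin directory, which is why it recurs throughout this window (here and again around 23:54:30-31); the flexvol-driver container started from the pod2daemon-flexvol image further down appears to be what eventually installs that binary. Under the FlexVolume convention, the directory name encodes vendor~driver (vendor nodeagent, driver uds), and the kubelet execs the binary with a subcommand such as init and parses its stdout as a JSON status object. A minimal sketch in Go of a driver that would satisfy the init probe, assuming only the generic FlexVolume status format ("Success" / "Not supported") and nothing specific to Calico's real uds driver:

package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// driverStatus mirrors the JSON object the kubelet expects on stdout.
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	cmd := ""
	if len(os.Args) > 1 {
		cmd = os.Args[1]
	}
	switch cmd {
	case "init":
		// An empty stdout is exactly what produces "unexpected end of JSON
		// input" above; init must print a status object. attach:false tells
		// the kubelet not to route attach/detach calls to this driver.
		out, _ := json.Marshal(driverStatus{
			Status:       "Success",
			Capabilities: map[string]bool{"attach": false},
		})
		fmt.Println(string(out))
	default:
		out, _ := json.Marshal(driverStatus{
			Status:  "Not supported",
			Message: "unsupported driver call: " + cmd,
		})
		fmt.Println(string(out))
		os.Exit(1)
	}
}
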
Sep 5 23:54:27.778654 containerd[2021]: time="2025-09-05T23:54:27.778581262Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-476k5,Uid:98bfe547-aa27-459a-ab76-a1e2797a7553,Namespace:calico-system,Attempt:0,} returns sandbox id \"013e02d5dcd76fcb0086826df17c0ce2839a6cbed4114e4658dfe3091adbd74e\""
Sep 5 23:54:28.655157 kubelet[3339]: E0905 23:54:28.653871 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xr256" podUID="c05df7ab-c8cc-4af6-b78b-e2da00b65212"
Sep 5 23:54:28.908767 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4101814495.mount: Deactivated successfully.
Sep 5 23:54:29.864466 containerd[2021]: time="2025-09-05T23:54:29.864398904Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:54:29.865994 containerd[2021]: time="2025-09-05T23:54:29.865942428Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775"
Sep 5 23:54:29.867576 containerd[2021]: time="2025-09-05T23:54:29.866899428Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:54:29.871569 containerd[2021]: time="2025-09-05T23:54:29.871126860Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:54:29.874245 containerd[2021]: time="2025-09-05T23:54:29.874058736Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 2.545234861s"
Sep 5 23:54:29.874245 containerd[2021]: time="2025-09-05T23:54:29.874116000Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\""
Sep 5 23:54:29.876947 containerd[2021]: time="2025-09-05T23:54:29.876666708Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 5 23:54:29.913153 containerd[2021]: time="2025-09-05T23:54:29.912780756Z" level=info msg="CreateContainer within sandbox \"ea9d7414d36633ab1fb6c5805ba9f5956ed38286cf565dde6712646877fbdc68\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 5 23:54:29.936735 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2246616296.mount: Deactivated successfully.
Sep 5 23:54:29.941135 containerd[2021]: time="2025-09-05T23:54:29.940357608Z" level=info msg="CreateContainer within sandbox \"ea9d7414d36633ab1fb6c5805ba9f5956ed38286cf565dde6712646877fbdc68\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"ff4dcc3ce1d2baea44c992c7028d064487b2c3173701bdd2c6270133d9dc4bdf\""
Sep 5 23:54:29.941795 containerd[2021]: time="2025-09-05T23:54:29.941727324Z" level=info msg="StartContainer for \"ff4dcc3ce1d2baea44c992c7028d064487b2c3173701bdd2c6270133d9dc4bdf\""
Sep 5 23:54:29.999853 systemd[1]: Started cri-containerd-ff4dcc3ce1d2baea44c992c7028d064487b2c3173701bdd2c6270133d9dc4bdf.scope - libcontainer container ff4dcc3ce1d2baea44c992c7028d064487b2c3173701bdd2c6270133d9dc4bdf.
Sep 5 23:54:30.067564 containerd[2021]: time="2025-09-05T23:54:30.067250469Z" level=info msg="StartContainer for \"ff4dcc3ce1d2baea44c992c7028d064487b2c3173701bdd2c6270133d9dc4bdf\" returns successfully"
Sep 5 23:54:30.653132 kubelet[3339]: E0905 23:54:30.653051 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xr256" podUID="c05df7ab-c8cc-4af6-b78b-e2da00b65212"
Sep 5 23:54:30.896753 kubelet[3339]: I0905 23:54:30.894917 3339 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5d847755f9-5rfbw" podStartSLOduration=2.347209868 podStartE2EDuration="4.894895381s" podCreationTimestamp="2025-09-05 23:54:26 +0000 UTC" firstStartedPulling="2025-09-05 23:54:27.327893395 +0000 UTC m=+30.908956018" lastFinishedPulling="2025-09-05 23:54:29.87557892 +0000 UTC m=+33.456641531" observedRunningTime="2025-09-05 23:54:30.894764425 +0000 UTC m=+34.475827060" watchObservedRunningTime="2025-09-05 23:54:30.894895381 +0000 UTC m=+34.475958004"
Sep 5 23:54:30.958132 kubelet[3339]: E0905 23:54:30.957667 3339 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:54:30.958132 kubelet[3339]: W0905 23:54:30.957706 3339 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:54:30.958132 kubelet[3339]: E0905 23:54:30.957740 3339 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
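
The "Observed pod startup duration" entry above carries enough data to check how podStartSLOduration relates to podStartE2EDuration: the figures are consistent with the SLO value being the end-to-end startup time minus the time spent pulling images. A quick verification using the monotonic-clock offsets (the m=+... values) printed in the entry; the constants are taken from the log, and the subtraction rule is an inference from the numbers rather than a statement about the kubelet's source:

package main

import "fmt"

func main() {
	const (
		e2eSeconds          = 4.894895381  // podStartE2EDuration
		firstStartedPulling = 30.908956018 // m=+ offset of firstStartedPulling
		lastFinishedPulling = 33.456641531 // m=+ offset of lastFinishedPulling
	)
	// End-to-end startup duration minus the image-pull window.
	slo := e2eSeconds - (lastFinishedPulling - firstStartedPulling)
	fmt.Printf("podStartSLOduration = %.9f\n", slo) // 2.347209868, matching the log
}
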
Sep 5 23:54:31.172594 containerd[2021]: time="2025-09-05T23:54:31.171910042Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:54:31.174602 containerd[2021]: time="2025-09-05T23:54:31.174216130Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814"
Sep 5 23:54:31.178550 containerd[2021]: time="2025-09-05T23:54:31.176757502Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:54:31.182051 containerd[2021]: time="2025-09-05T23:54:31.181992634Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:54:31.183486 containerd[2021]: time="2025-09-05T23:54:31.183418450Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.30669497s"
Sep 5 23:54:31.183612 containerd[2021]: time="2025-09-05T23:54:31.183484642Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\""
Sep 5 23:54:31.189941 containerd[2021]: time="2025-09-05T23:54:31.189826102Z" level=info msg="CreateContainer within sandbox \"013e02d5dcd76fcb0086826df17c0ce2839a6cbed4114e4658dfe3091adbd74e\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Sep 5 23:54:31.219285 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3747425376.mount: Deactivated successfully.
Sep 5 23:54:31.224702 containerd[2021]: time="2025-09-05T23:54:31.224640911Z" level=info msg="CreateContainer within sandbox \"013e02d5dcd76fcb0086826df17c0ce2839a6cbed4114e4658dfe3091adbd74e\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"f984f8d7ad23043f286f0f53896cf87c8946820e95e0799706301c5e6013e51f\""
Sep 5 23:54:31.226807 containerd[2021]: time="2025-09-05T23:54:31.226732403Z" level=info msg="StartContainer for \"f984f8d7ad23043f286f0f53896cf87c8946820e95e0799706301c5e6013e51f\""
Sep 5 23:54:31.297140 systemd[1]: Started cri-containerd-f984f8d7ad23043f286f0f53896cf87c8946820e95e0799706301c5e6013e51f.scope - libcontainer container f984f8d7ad23043f286f0f53896cf87c8946820e95e0799706301c5e6013e51f.
Sep 5 23:54:31.356497 containerd[2021]: time="2025-09-05T23:54:31.356342363Z" level=info msg="StartContainer for \"f984f8d7ad23043f286f0f53896cf87c8946820e95e0799706301c5e6013e51f\" returns successfully"
Sep 5 23:54:31.385962 systemd[1]: cri-containerd-f984f8d7ad23043f286f0f53896cf87c8946820e95e0799706301c5e6013e51f.scope: Deactivated successfully.
Sep 5 23:54:31.441884 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f984f8d7ad23043f286f0f53896cf87c8946820e95e0799706301c5e6013e51f-rootfs.mount: Deactivated successfully.
Sep 5 23:54:31.786908 containerd[2021]: time="2025-09-05T23:54:31.786828253Z" level=info msg="shim disconnected" id=f984f8d7ad23043f286f0f53896cf87c8946820e95e0799706301c5e6013e51f namespace=k8s.io
Sep 5 23:54:31.787177 containerd[2021]: time="2025-09-05T23:54:31.786966733Z" level=warning msg="cleaning up after shim disconnected" id=f984f8d7ad23043f286f0f53896cf87c8946820e95e0799706301c5e6013e51f namespace=k8s.io
Sep 5 23:54:31.787177 containerd[2021]: time="2025-09-05T23:54:31.786992545Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 5 23:54:31.886751 containerd[2021]: time="2025-09-05T23:54:31.886190282Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\""
Sep 5 23:54:32.655669 kubelet[3339]: E0905 23:54:32.653721 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xr256" podUID="c05df7ab-c8cc-4af6-b78b-e2da00b65212"
Sep 5 23:54:34.653630 kubelet[3339]: E0905 23:54:34.653106 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xr256" podUID="c05df7ab-c8cc-4af6-b78b-e2da00b65212"
Sep 5 23:54:34.826140 containerd[2021]: time="2025-09-05T23:54:34.826071761Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:54:34.828777 containerd[2021]: time="2025-09-05T23:54:34.828688937Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477"
Sep 5 23:54:34.829573 containerd[2021]: time="2025-09-05T23:54:34.829290545Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:54:34.833866 containerd[2021]: time="2025-09-05T23:54:34.833777093Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:54:34.836581 containerd[2021]: time="2025-09-05T23:54:34.835745453Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 2.949492099s" Sep 5 23:54:34.836581 containerd[2021]: time="2025-09-05T23:54:34.835809617Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 5 23:54:34.843838 containerd[2021]: time="2025-09-05T23:54:34.843585845Z" level=info msg="CreateContainer within sandbox \"013e02d5dcd76fcb0086826df17c0ce2839a6cbed4114e4658dfe3091adbd74e\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 5 23:54:34.867635 containerd[2021]: time="2025-09-05T23:54:34.867454157Z" level=info msg="CreateContainer within sandbox \"013e02d5dcd76fcb0086826df17c0ce2839a6cbed4114e4658dfe3091adbd74e\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"cab69e0de241eb9b2d3a0ab8db5e9a3feab0d9175e7dff4a6ef884352d227b35\"" Sep 5 23:54:34.870942 containerd[2021]: time="2025-09-05T23:54:34.870873677Z" level=info msg="StartContainer for \"cab69e0de241eb9b2d3a0ab8db5e9a3feab0d9175e7dff4a6ef884352d227b35\"" Sep 5 23:54:34.872851 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2255688386.mount: Deactivated successfully. Sep 5 23:54:34.932179 systemd[1]: Started cri-containerd-cab69e0de241eb9b2d3a0ab8db5e9a3feab0d9175e7dff4a6ef884352d227b35.scope - libcontainer container cab69e0de241eb9b2d3a0ab8db5e9a3feab0d9175e7dff4a6ef884352d227b35. Sep 5 23:54:35.001337 containerd[2021]: time="2025-09-05T23:54:35.001100857Z" level=info msg="StartContainer for \"cab69e0de241eb9b2d3a0ab8db5e9a3feab0d9175e7dff4a6ef884352d227b35\" returns successfully" Sep 5 23:54:36.062086 containerd[2021]: time="2025-09-05T23:54:36.062008287Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 5 23:54:36.066483 systemd[1]: cri-containerd-cab69e0de241eb9b2d3a0ab8db5e9a3feab0d9175e7dff4a6ef884352d227b35.scope: Deactivated successfully. Sep 5 23:54:36.108745 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cab69e0de241eb9b2d3a0ab8db5e9a3feab0d9175e7dff4a6ef884352d227b35-rootfs.mount: Deactivated successfully. Sep 5 23:54:36.140638 kubelet[3339]: I0905 23:54:36.139611 3339 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 5 23:54:36.227144 systemd[1]: Created slice kubepods-burstable-pod0851a585_ee51_4d36_80e5_364195a5c349.slice - libcontainer container kubepods-burstable-pod0851a585_ee51_4d36_80e5_364195a5c349.slice. 
Sep 5 23:54:36.246387 kubelet[3339]: I0905 23:54:36.241376 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r8zn\" (UniqueName: \"kubernetes.io/projected/0851a585-ee51-4d36-80e5-364195a5c349-kube-api-access-2r8zn\") pod \"coredns-7c65d6cfc9-kgk25\" (UID: \"0851a585-ee51-4d36-80e5-364195a5c349\") " pod="kube-system/coredns-7c65d6cfc9-kgk25" Sep 5 23:54:36.246387 kubelet[3339]: I0905 23:54:36.241447 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0851a585-ee51-4d36-80e5-364195a5c349-config-volume\") pod \"coredns-7c65d6cfc9-kgk25\" (UID: \"0851a585-ee51-4d36-80e5-364195a5c349\") " pod="kube-system/coredns-7c65d6cfc9-kgk25" Sep 5 23:54:36.260178 systemd[1]: Created slice kubepods-besteffort-pod91054442_1176_471c_b32d_508eba32633a.slice - libcontainer container kubepods-besteffort-pod91054442_1176_471c_b32d_508eba32633a.slice. Sep 5 23:54:36.284300 systemd[1]: Created slice kubepods-besteffort-podad610cb3_9cf1_433c_bfe6_870f9da7a8a7.slice - libcontainer container kubepods-besteffort-podad610cb3_9cf1_433c_bfe6_870f9da7a8a7.slice. Sep 5 23:54:36.305464 systemd[1]: Created slice kubepods-besteffort-poddd5f7f5e_b0cb_44ef_aa90_0246eeadd9ea.slice - libcontainer container kubepods-besteffort-poddd5f7f5e_b0cb_44ef_aa90_0246eeadd9ea.slice. Sep 5 23:54:36.323969 systemd[1]: Created slice kubepods-besteffort-pod7eabc093_0edd_4719_902f_c28a617adb0c.slice - libcontainer container kubepods-besteffort-pod7eabc093_0edd_4719_902f_c28a617adb0c.slice. Sep 5 23:54:36.345689 kubelet[3339]: I0905 23:54:36.342850 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69rsp\" (UniqueName: \"kubernetes.io/projected/91054442-1176-471c-b32d-508eba32633a-kube-api-access-69rsp\") pod \"whisker-5bcc458d6d-445tw\" (UID: \"91054442-1176-471c-b32d-508eba32633a\") " pod="calico-system/whisker-5bcc458d6d-445tw" Sep 5 23:54:36.345689 kubelet[3339]: I0905 23:54:36.343663 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbhdf\" (UniqueName: \"kubernetes.io/projected/08545de3-6592-4965-ae61-4807250e2870-kube-api-access-sbhdf\") pod \"coredns-7c65d6cfc9-6vhqh\" (UID: \"08545de3-6592-4965-ae61-4807250e2870\") " pod="kube-system/coredns-7c65d6cfc9-6vhqh" Sep 5 23:54:36.345689 kubelet[3339]: I0905 23:54:36.343773 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lckp\" (UniqueName: \"kubernetes.io/projected/a4bb29ad-8766-47fe-9303-a89374119066-kube-api-access-4lckp\") pod \"calico-kube-controllers-5f9f4498b8-wxd6q\" (UID: \"a4bb29ad-8766-47fe-9303-a89374119066\") " pod="calico-system/calico-kube-controllers-5f9f4498b8-wxd6q" Sep 5 23:54:36.345689 kubelet[3339]: I0905 23:54:36.343842 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/dd5f7f5e-b0cb-44ef-aa90-0246eeadd9ea-calico-apiserver-certs\") pod \"calico-apiserver-5f77cc885-km8wb\" (UID: \"dd5f7f5e-b0cb-44ef-aa90-0246eeadd9ea\") " pod="calico-apiserver/calico-apiserver-5f77cc885-km8wb" Sep 5 23:54:36.345689 kubelet[3339]: I0905 23:54:36.343909 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvl99\" (UniqueName: 
\"kubernetes.io/projected/7eabc093-0edd-4719-902f-c28a617adb0c-kube-api-access-qvl99\") pod \"goldmane-7988f88666-s9vkn\" (UID: \"7eabc093-0edd-4719-902f-c28a617adb0c\") " pod="calico-system/goldmane-7988f88666-s9vkn" Sep 5 23:54:36.348062 kubelet[3339]: I0905 23:54:36.343965 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4bb29ad-8766-47fe-9303-a89374119066-tigera-ca-bundle\") pod \"calico-kube-controllers-5f9f4498b8-wxd6q\" (UID: \"a4bb29ad-8766-47fe-9303-a89374119066\") " pod="calico-system/calico-kube-controllers-5f9f4498b8-wxd6q" Sep 5 23:54:36.348062 kubelet[3339]: I0905 23:54:36.344039 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/08545de3-6592-4965-ae61-4807250e2870-config-volume\") pod \"coredns-7c65d6cfc9-6vhqh\" (UID: \"08545de3-6592-4965-ae61-4807250e2870\") " pod="kube-system/coredns-7c65d6cfc9-6vhqh" Sep 5 23:54:36.348062 kubelet[3339]: I0905 23:54:36.345433 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7eabc093-0edd-4719-902f-c28a617adb0c-goldmane-ca-bundle\") pod \"goldmane-7988f88666-s9vkn\" (UID: \"7eabc093-0edd-4719-902f-c28a617adb0c\") " pod="calico-system/goldmane-7988f88666-s9vkn" Sep 5 23:54:36.348062 kubelet[3339]: I0905 23:54:36.346014 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tljkd\" (UniqueName: \"kubernetes.io/projected/dd5f7f5e-b0cb-44ef-aa90-0246eeadd9ea-kube-api-access-tljkd\") pod \"calico-apiserver-5f77cc885-km8wb\" (UID: \"dd5f7f5e-b0cb-44ef-aa90-0246eeadd9ea\") " pod="calico-apiserver/calico-apiserver-5f77cc885-km8wb" Sep 5 23:54:36.348062 kubelet[3339]: I0905 23:54:36.346119 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7eabc093-0edd-4719-902f-c28a617adb0c-config\") pod \"goldmane-7988f88666-s9vkn\" (UID: \"7eabc093-0edd-4719-902f-c28a617adb0c\") " pod="calico-system/goldmane-7988f88666-s9vkn" Sep 5 23:54:36.346577 systemd[1]: Created slice kubepods-burstable-pod08545de3_6592_4965_ae61_4807250e2870.slice - libcontainer container kubepods-burstable-pod08545de3_6592_4965_ae61_4807250e2870.slice. 
Sep 5 23:54:36.348471 kubelet[3339]: I0905 23:54:36.346210 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ad610cb3-9cf1-433c-bfe6-870f9da7a8a7-calico-apiserver-certs\") pod \"calico-apiserver-5f77cc885-9q8qs\" (UID: \"ad610cb3-9cf1-433c-bfe6-870f9da7a8a7\") " pod="calico-apiserver/calico-apiserver-5f77cc885-9q8qs" Sep 5 23:54:36.348471 kubelet[3339]: I0905 23:54:36.346299 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91054442-1176-471c-b32d-508eba32633a-whisker-ca-bundle\") pod \"whisker-5bcc458d6d-445tw\" (UID: \"91054442-1176-471c-b32d-508eba32633a\") " pod="calico-system/whisker-5bcc458d6d-445tw" Sep 5 23:54:36.348471 kubelet[3339]: I0905 23:54:36.346485 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d84v9\" (UniqueName: \"kubernetes.io/projected/ad610cb3-9cf1-433c-bfe6-870f9da7a8a7-kube-api-access-d84v9\") pod \"calico-apiserver-5f77cc885-9q8qs\" (UID: \"ad610cb3-9cf1-433c-bfe6-870f9da7a8a7\") " pod="calico-apiserver/calico-apiserver-5f77cc885-9q8qs" Sep 5 23:54:36.348471 kubelet[3339]: I0905 23:54:36.347538 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/91054442-1176-471c-b32d-508eba32633a-whisker-backend-key-pair\") pod \"whisker-5bcc458d6d-445tw\" (UID: \"91054442-1176-471c-b32d-508eba32633a\") " pod="calico-system/whisker-5bcc458d6d-445tw" Sep 5 23:54:36.348471 kubelet[3339]: I0905 23:54:36.348128 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/7eabc093-0edd-4719-902f-c28a617adb0c-goldmane-key-pair\") pod \"goldmane-7988f88666-s9vkn\" (UID: \"7eabc093-0edd-4719-902f-c28a617adb0c\") " pod="calico-system/goldmane-7988f88666-s9vkn" Sep 5 23:54:36.362173 systemd[1]: Created slice kubepods-besteffort-poda4bb29ad_8766_47fe_9303_a89374119066.slice - libcontainer container kubepods-besteffort-poda4bb29ad_8766_47fe_9303_a89374119066.slice. 
Sep 5 23:54:36.538546 containerd[2021]: time="2025-09-05T23:54:36.537682049Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-kgk25,Uid:0851a585-ee51-4d36-80e5-364195a5c349,Namespace:kube-system,Attempt:0,}" Sep 5 23:54:36.596466 containerd[2021]: time="2025-09-05T23:54:36.596281817Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f77cc885-9q8qs,Uid:ad610cb3-9cf1-433c-bfe6-870f9da7a8a7,Namespace:calico-apiserver,Attempt:0,}" Sep 5 23:54:36.620556 containerd[2021]: time="2025-09-05T23:54:36.620356361Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f77cc885-km8wb,Uid:dd5f7f5e-b0cb-44ef-aa90-0246eeadd9ea,Namespace:calico-apiserver,Attempt:0,}" Sep 5 23:54:36.637712 containerd[2021]: time="2025-09-05T23:54:36.637315398Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-s9vkn,Uid:7eabc093-0edd-4719-902f-c28a617adb0c,Namespace:calico-system,Attempt:0,}" Sep 5 23:54:36.640920 containerd[2021]: time="2025-09-05T23:54:36.640845774Z" level=info msg="shim disconnected" id=cab69e0de241eb9b2d3a0ab8db5e9a3feab0d9175e7dff4a6ef884352d227b35 namespace=k8s.io Sep 5 23:54:36.641570 containerd[2021]: time="2025-09-05T23:54:36.641307918Z" level=warning msg="cleaning up after shim disconnected" id=cab69e0de241eb9b2d3a0ab8db5e9a3feab0d9175e7dff4a6ef884352d227b35 namespace=k8s.io Sep 5 23:54:36.641570 containerd[2021]: time="2025-09-05T23:54:36.641345490Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 5 23:54:36.667979 containerd[2021]: time="2025-09-05T23:54:36.667691814Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-6vhqh,Uid:08545de3-6592-4965-ae61-4807250e2870,Namespace:kube-system,Attempt:0,}" Sep 5 23:54:36.671688 containerd[2021]: time="2025-09-05T23:54:36.671277474Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5f9f4498b8-wxd6q,Uid:a4bb29ad-8766-47fe-9303-a89374119066,Namespace:calico-system,Attempt:0,}" Sep 5 23:54:36.675084 systemd[1]: Created slice kubepods-besteffort-podc05df7ab_c8cc_4af6_b78b_e2da00b65212.slice - libcontainer container kubepods-besteffort-podc05df7ab_c8cc_4af6_b78b_e2da00b65212.slice. 
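Annotation: every RunPodSandbox issued above is about to fail the same way. The Calico CNI plugin reads /var/lib/calico/nodename, a file the calico/node container writes at startup, to learn which Calico node resource it belongs to; until that container is running, the stat fails and the plugin surfaces the exact hint repeated in the errors below. A sketch of that guard (the file path and hint text come from the log; the surrounding function is illustrative, not Calico's source):

```go
// nodename_check.go — sketch of the guard behind the repeated
// "stat /var/lib/calico/nodename" errors that follow.
package main

import (
	"fmt"
	"os"
	"strings"
)

const nodenameFile = "/var/lib/calico/nodename"

// nodename returns the node name calico/node recorded at startup, or an
// error matching the log's hint when the file does not exist yet.
func nodename() (string, error) {
	b, err := os.ReadFile(nodenameFile)
	if err != nil {
		return "", fmt.Errorf("%w: check that the calico/node container is running and has mounted /var/lib/calico/", err)
	}
	return strings.TrimSpace(string(b)), nil
}

func main() {
	name, err := nodename()
	if err != nil {
		fmt.Println("CNI ADD/DEL would fail here:", err)
		return
	}
	fmt.Println("node:", name)
}
```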
Sep 5 23:54:36.686838 containerd[2021]: time="2025-09-05T23:54:36.686779506Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xr256,Uid:c05df7ab-c8cc-4af6-b78b-e2da00b65212,Namespace:calico-system,Attempt:0,}" Sep 5 23:54:36.873472 containerd[2021]: time="2025-09-05T23:54:36.872717071Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5bcc458d6d-445tw,Uid:91054442-1176-471c-b32d-508eba32633a,Namespace:calico-system,Attempt:0,}" Sep 5 23:54:36.936127 containerd[2021]: time="2025-09-05T23:54:36.936049315Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 5 23:54:37.269932 containerd[2021]: time="2025-09-05T23:54:37.269856749Z" level=error msg="Failed to destroy network for sandbox \"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.275973 containerd[2021]: time="2025-09-05T23:54:37.275882777Z" level=error msg="encountered an error cleaning up failed sandbox \"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.276134 containerd[2021]: time="2025-09-05T23:54:37.276009233Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-kgk25,Uid:0851a585-ee51-4d36-80e5-364195a5c349,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.276716 kubelet[3339]: E0905 23:54:37.276581 3339 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.276716 kubelet[3339]: E0905 23:54:37.276685 3339 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-kgk25" Sep 5 23:54:37.279128 kubelet[3339]: E0905 23:54:37.276721 3339 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-kgk25" Sep 5 23:54:37.279128 kubelet[3339]: E0905 23:54:37.276800 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"coredns-7c65d6cfc9-kgk25_kube-system(0851a585-ee51-4d36-80e5-364195a5c349)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-kgk25_kube-system(0851a585-ee51-4d36-80e5-364195a5c349)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-kgk25" podUID="0851a585-ee51-4d36-80e5-364195a5c349" Sep 5 23:54:37.277346 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5-shm.mount: Deactivated successfully. Sep 5 23:54:37.284198 containerd[2021]: time="2025-09-05T23:54:37.283580405Z" level=error msg="Failed to destroy network for sandbox \"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.290510 containerd[2021]: time="2025-09-05T23:54:37.289209917Z" level=error msg="encountered an error cleaning up failed sandbox \"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.290248 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233-shm.mount: Deactivated successfully. 
Sep 5 23:54:37.295594 containerd[2021]: time="2025-09-05T23:54:37.295362785Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f77cc885-9q8qs,Uid:ad610cb3-9cf1-433c-bfe6-870f9da7a8a7,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.297492 kubelet[3339]: E0905 23:54:37.295758 3339 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.297492 kubelet[3339]: E0905 23:54:37.295839 3339 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5f77cc885-9q8qs" Sep 5 23:54:37.297492 kubelet[3339]: E0905 23:54:37.295874 3339 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5f77cc885-9q8qs" Sep 5 23:54:37.297764 kubelet[3339]: E0905 23:54:37.295936 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5f77cc885-9q8qs_calico-apiserver(ad610cb3-9cf1-433c-bfe6-870f9da7a8a7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5f77cc885-9q8qs_calico-apiserver(ad610cb3-9cf1-433c-bfe6-870f9da7a8a7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5f77cc885-9q8qs" podUID="ad610cb3-9cf1-433c-bfe6-870f9da7a8a7" Sep 5 23:54:37.298853 containerd[2021]: time="2025-09-05T23:54:37.298741037Z" level=error msg="Failed to destroy network for sandbox \"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.307692 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9-shm.mount: Deactivated successfully. 
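Annotation: the same message shows up twice per failure because the CRI is gRPC. containerd logs the error locally, then returns it as a plain Go error, which grpc-go wraps as codes.Unknown on the wire; kubelet therefore relogs it prefixed with "rpc error: code = Unknown desc = ...". A minimal sketch of that wrapping, assuming google.golang.org/grpc:

```go
// cri_err.go — sketch of why kubelet sees "rpc error: code = Unknown".
package main

import (
	"errors"
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

func main() {
	// What containerd's RunPodSandbox handler effectively returns:
	handlerErr := errors.New(`failed to setup network for sandbox "8f3b...": plugin type="calico" failed (add): stat /var/lib/calico/nodename: no such file or directory`)

	// grpc-go converts any non-status error to codes.Unknown;
	// status.Convert models what the kubelet-side client receives.
	st := status.Convert(handlerErr)
	fmt.Println(st.Code() == codes.Unknown) // true
	fmt.Println(status.Error(st.Code(), st.Message()))
	// rpc error: code = Unknown desc = failed to setup network for sandbox ...
}
```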
Sep 5 23:54:37.311540 containerd[2021]: time="2025-09-05T23:54:37.311445893Z" level=error msg="encountered an error cleaning up failed sandbox \"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.315136 containerd[2021]: time="2025-09-05T23:54:37.314486153Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-6vhqh,Uid:08545de3-6592-4965-ae61-4807250e2870,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.316549 kubelet[3339]: E0905 23:54:37.316003 3339 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.317692 kubelet[3339]: E0905 23:54:37.316691 3339 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-6vhqh" Sep 5 23:54:37.317692 kubelet[3339]: E0905 23:54:37.317106 3339 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-6vhqh" Sep 5 23:54:37.319598 kubelet[3339]: E0905 23:54:37.318003 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-6vhqh_kube-system(08545de3-6592-4965-ae61-4807250e2870)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-6vhqh_kube-system(08545de3-6592-4965-ae61-4807250e2870)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-6vhqh" podUID="08545de3-6592-4965-ae61-4807250e2870" Sep 5 23:54:37.320015 containerd[2021]: time="2025-09-05T23:54:37.317299313Z" level=error msg="Failed to destroy network for sandbox \"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Sep 5 23:54:37.330697 containerd[2021]: time="2025-09-05T23:54:37.329911541Z" level=error msg="encountered an error cleaning up failed sandbox \"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.330697 containerd[2021]: time="2025-09-05T23:54:37.330019049Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xr256,Uid:c05df7ab-c8cc-4af6-b78b-e2da00b65212,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.331471 containerd[2021]: time="2025-09-05T23:54:37.331417541Z" level=error msg="Failed to destroy network for sandbox \"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.332010 kubelet[3339]: E0905 23:54:37.331753 3339 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.332010 kubelet[3339]: E0905 23:54:37.331845 3339 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-xr256" Sep 5 23:54:37.332010 kubelet[3339]: E0905 23:54:37.331883 3339 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-xr256" Sep 5 23:54:37.331942 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6-shm.mount: Deactivated successfully. 
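Annotation: "Error syncing pod, skipping" is not fatal. The pod workers requeue each pod and retry CreatePodSandbox with exponential backoff until the CNI comes up, which is why the same pods reappear throughout this log. A dependency-free sketch of that retry shape; the 10s-doubling-to-5m parameters are an assumption for illustration, not kubelet's exact policy:

```go
// backoff.go — sketch of the retry loop implied by the repeated
// "Error syncing pod, skipping" lines.
package main

import (
	"errors"
	"fmt"
	"time"
)

func syncPod(attempt int) error {
	if attempt < 3 {
		return errors.New(`failed to "CreatePodSandbox": cni plugin not initialized`)
	}
	return nil // pretend Calico's node container came up
}

func main() {
	delay := 10 * time.Second
	const max = 5 * time.Minute
	for attempt := 0; ; attempt++ {
		if err := syncPod(attempt); err == nil {
			fmt.Println("pod synced after", attempt, "retries")
			return
		} else {
			fmt.Printf("Error syncing pod, skipping: %v; retry in %v\n", err, delay)
		}
		time.Sleep(delay) // shrink these durations when experimenting
		if delay *= 2; delay > max {
			delay = max
		}
	}
}
```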
Sep 5 23:54:37.336581 containerd[2021]: time="2025-09-05T23:54:37.336169241Z" level=error msg="encountered an error cleaning up failed sandbox \"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.336581 containerd[2021]: time="2025-09-05T23:54:37.336289145Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-s9vkn,Uid:7eabc093-0edd-4719-902f-c28a617adb0c,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.337425 kubelet[3339]: E0905 23:54:37.335961 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-xr256_calico-system(c05df7ab-c8cc-4af6-b78b-e2da00b65212)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-xr256_calico-system(c05df7ab-c8cc-4af6-b78b-e2da00b65212)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-xr256" podUID="c05df7ab-c8cc-4af6-b78b-e2da00b65212" Sep 5 23:54:37.340125 kubelet[3339]: E0905 23:54:37.340022 3339 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.340125 kubelet[3339]: E0905 23:54:37.340105 3339 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-s9vkn" Sep 5 23:54:37.340358 kubelet[3339]: E0905 23:54:37.340144 3339 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-s9vkn" Sep 5 23:54:37.340358 kubelet[3339]: E0905 23:54:37.340214 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-s9vkn_calico-system(7eabc093-0edd-4719-902f-c28a617adb0c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"goldmane-7988f88666-s9vkn_calico-system(7eabc093-0edd-4719-902f-c28a617adb0c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-s9vkn" podUID="7eabc093-0edd-4719-902f-c28a617adb0c" Sep 5 23:54:37.358086 containerd[2021]: time="2025-09-05T23:54:37.357992321Z" level=error msg="Failed to destroy network for sandbox \"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.360107 containerd[2021]: time="2025-09-05T23:54:37.360026549Z" level=error msg="encountered an error cleaning up failed sandbox \"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.360398 containerd[2021]: time="2025-09-05T23:54:37.360162317Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f77cc885-km8wb,Uid:dd5f7f5e-b0cb-44ef-aa90-0246eeadd9ea,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.360887 kubelet[3339]: E0905 23:54:37.360819 3339 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.361121 kubelet[3339]: E0905 23:54:37.360911 3339 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5f77cc885-km8wb" Sep 5 23:54:37.361121 kubelet[3339]: E0905 23:54:37.360946 3339 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5f77cc885-km8wb" Sep 5 23:54:37.361121 kubelet[3339]: E0905 23:54:37.361033 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"calico-apiserver-5f77cc885-km8wb_calico-apiserver(dd5f7f5e-b0cb-44ef-aa90-0246eeadd9ea)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5f77cc885-km8wb_calico-apiserver(dd5f7f5e-b0cb-44ef-aa90-0246eeadd9ea)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5f77cc885-km8wb" podUID="dd5f7f5e-b0cb-44ef-aa90-0246eeadd9ea" Sep 5 23:54:37.377634 containerd[2021]: time="2025-09-05T23:54:37.377345009Z" level=error msg="Failed to destroy network for sandbox \"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.378351 containerd[2021]: time="2025-09-05T23:54:37.378135929Z" level=error msg="encountered an error cleaning up failed sandbox \"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.378351 containerd[2021]: time="2025-09-05T23:54:37.378212813Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5f9f4498b8-wxd6q,Uid:a4bb29ad-8766-47fe-9303-a89374119066,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.379359 kubelet[3339]: E0905 23:54:37.378739 3339 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.379359 kubelet[3339]: E0905 23:54:37.378849 3339 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5f9f4498b8-wxd6q" Sep 5 23:54:37.379359 kubelet[3339]: E0905 23:54:37.378918 3339 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5f9f4498b8-wxd6q" Sep 5 23:54:37.380631 
kubelet[3339]: E0905 23:54:37.379037 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5f9f4498b8-wxd6q_calico-system(a4bb29ad-8766-47fe-9303-a89374119066)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5f9f4498b8-wxd6q_calico-system(a4bb29ad-8766-47fe-9303-a89374119066)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5f9f4498b8-wxd6q" podUID="a4bb29ad-8766-47fe-9303-a89374119066" Sep 5 23:54:37.406333 containerd[2021]: time="2025-09-05T23:54:37.406230593Z" level=error msg="Failed to destroy network for sandbox \"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.407260 containerd[2021]: time="2025-09-05T23:54:37.407155073Z" level=error msg="encountered an error cleaning up failed sandbox \"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.407260 containerd[2021]: time="2025-09-05T23:54:37.407311133Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5bcc458d6d-445tw,Uid:91054442-1176-471c-b32d-508eba32633a,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.408362 kubelet[3339]: E0905 23:54:37.407737 3339 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:37.408362 kubelet[3339]: E0905 23:54:37.407809 3339 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5bcc458d6d-445tw" Sep 5 23:54:37.408362 kubelet[3339]: E0905 23:54:37.407841 3339 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" pod="calico-system/whisker-5bcc458d6d-445tw" Sep 5 23:54:37.408710 kubelet[3339]: E0905 23:54:37.407923 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5bcc458d6d-445tw_calico-system(91054442-1176-471c-b32d-508eba32633a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5bcc458d6d-445tw_calico-system(91054442-1176-471c-b32d-508eba32633a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5bcc458d6d-445tw" podUID="91054442-1176-471c-b32d-508eba32633a" Sep 5 23:54:37.924734 kubelet[3339]: I0905 23:54:37.924688 3339 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Sep 5 23:54:37.928584 containerd[2021]: time="2025-09-05T23:54:37.927762236Z" level=info msg="StopPodSandbox for \"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\"" Sep 5 23:54:37.928584 containerd[2021]: time="2025-09-05T23:54:37.928128704Z" level=info msg="Ensure that sandbox 95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6 in task-service has been cleanup successfully" Sep 5 23:54:37.929831 kubelet[3339]: I0905 23:54:37.929716 3339 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Sep 5 23:54:37.933234 containerd[2021]: time="2025-09-05T23:54:37.933171884Z" level=info msg="StopPodSandbox for \"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\"" Sep 5 23:54:37.933974 containerd[2021]: time="2025-09-05T23:54:37.933929192Z" level=info msg="Ensure that sandbox f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f in task-service has been cleanup successfully" Sep 5 23:54:37.936957 kubelet[3339]: I0905 23:54:37.936880 3339 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Sep 5 23:54:37.941347 containerd[2021]: time="2025-09-05T23:54:37.941298476Z" level=info msg="StopPodSandbox for \"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\"" Sep 5 23:54:37.942795 containerd[2021]: time="2025-09-05T23:54:37.942509792Z" level=info msg="Ensure that sandbox 8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5 in task-service has been cleanup successfully" Sep 5 23:54:37.946273 kubelet[3339]: I0905 23:54:37.946168 3339 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Sep 5 23:54:37.947780 containerd[2021]: time="2025-09-05T23:54:37.947377196Z" level=info msg="StopPodSandbox for \"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\"" Sep 5 23:54:37.955229 containerd[2021]: time="2025-09-05T23:54:37.954958304Z" level=info msg="Ensure that sandbox b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9 in task-service has been cleanup successfully" Sep 5 23:54:37.956815 kubelet[3339]: I0905 23:54:37.956759 3339 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Sep 5 23:54:37.960613 containerd[2021]: time="2025-09-05T23:54:37.960393512Z" level=info msg="StopPodSandbox for \"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\"" Sep 5 23:54:37.966617 containerd[2021]: time="2025-09-05T23:54:37.966471524Z" level=info msg="Ensure that sandbox 3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586 in task-service has been cleanup successfully" Sep 5 23:54:37.978315 kubelet[3339]: I0905 23:54:37.976488 3339 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Sep 5 23:54:37.990939 containerd[2021]: time="2025-09-05T23:54:37.990591908Z" level=info msg="StopPodSandbox for \"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\"" Sep 5 23:54:37.993589 containerd[2021]: time="2025-09-05T23:54:37.993503228Z" level=info msg="Ensure that sandbox e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd in task-service has been cleanup successfully" Sep 5 23:54:38.006773 kubelet[3339]: I0905 23:54:38.006111 3339 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Sep 5 23:54:38.019925 containerd[2021]: time="2025-09-05T23:54:38.019843480Z" level=info msg="StopPodSandbox for \"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\"" Sep 5 23:54:38.020343 containerd[2021]: time="2025-09-05T23:54:38.020274052Z" level=info msg="Ensure that sandbox b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233 in task-service has been cleanup successfully" Sep 5 23:54:38.085540 kubelet[3339]: I0905 23:54:38.084239 3339 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Sep 5 23:54:38.090763 containerd[2021]: time="2025-09-05T23:54:38.090697493Z" level=info msg="StopPodSandbox for \"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\"" Sep 5 23:54:38.091043 containerd[2021]: time="2025-09-05T23:54:38.090995381Z" level=info msg="Ensure that sandbox e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8 in task-service has been cleanup successfully" Sep 5 23:54:38.108944 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8-shm.mount: Deactivated successfully. Sep 5 23:54:38.109152 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586-shm.mount: Deactivated successfully. Sep 5 23:54:38.109288 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd-shm.mount: Deactivated successfully. Sep 5 23:54:38.109446 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f-shm.mount: Deactivated successfully. 
Sep 5 23:54:38.141264 containerd[2021]: time="2025-09-05T23:54:38.141177053Z" level=error msg="StopPodSandbox for \"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\" failed" error="failed to destroy network for sandbox \"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:38.141631 kubelet[3339]: E0905 23:54:38.141485 3339 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Sep 5 23:54:38.141800 kubelet[3339]: E0905 23:54:38.141693 3339 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6"} Sep 5 23:54:38.141883 kubelet[3339]: E0905 23:54:38.141846 3339 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c05df7ab-c8cc-4af6-b78b-e2da00b65212\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 5 23:54:38.141998 kubelet[3339]: E0905 23:54:38.141888 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c05df7ab-c8cc-4af6-b78b-e2da00b65212\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-xr256" podUID="c05df7ab-c8cc-4af6-b78b-e2da00b65212" Sep 5 23:54:38.166628 containerd[2021]: time="2025-09-05T23:54:38.164563325Z" level=error msg="StopPodSandbox for \"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\" failed" error="failed to destroy network for sandbox \"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:38.167705 kubelet[3339]: E0905 23:54:38.167622 3339 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Sep 5 23:54:38.167835 kubelet[3339]: E0905 23:54:38.167704 3339 kuberuntime_manager.go:1479] "Failed 
to stop sandbox" podSandboxID={"Type":"containerd","ID":"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f"} Sep 5 23:54:38.167835 kubelet[3339]: E0905 23:54:38.167766 3339 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7eabc093-0edd-4719-902f-c28a617adb0c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 5 23:54:38.168020 kubelet[3339]: E0905 23:54:38.167823 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7eabc093-0edd-4719-902f-c28a617adb0c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-s9vkn" podUID="7eabc093-0edd-4719-902f-c28a617adb0c" Sep 5 23:54:38.200848 containerd[2021]: time="2025-09-05T23:54:38.199807253Z" level=error msg="StopPodSandbox for \"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\" failed" error="failed to destroy network for sandbox \"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:38.201799 kubelet[3339]: E0905 23:54:38.200787 3339 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Sep 5 23:54:38.201799 kubelet[3339]: E0905 23:54:38.201747 3339 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5"} Sep 5 23:54:38.201984 kubelet[3339]: E0905 23:54:38.201899 3339 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0851a585-ee51-4d36-80e5-364195a5c349\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 5 23:54:38.202115 kubelet[3339]: E0905 23:54:38.202012 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0851a585-ee51-4d36-80e5-364195a5c349\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-kgk25" podUID="0851a585-ee51-4d36-80e5-364195a5c349" Sep 5 23:54:38.213876 containerd[2021]: time="2025-09-05T23:54:38.213770141Z" level=error msg="StopPodSandbox for \"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\" failed" error="failed to destroy network for sandbox \"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:38.214594 kubelet[3339]: E0905 23:54:38.214333 3339 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Sep 5 23:54:38.214594 kubelet[3339]: E0905 23:54:38.214405 3339 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586"} Sep 5 23:54:38.214594 kubelet[3339]: E0905 23:54:38.214459 3339 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"a4bb29ad-8766-47fe-9303-a89374119066\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 5 23:54:38.215430 kubelet[3339]: E0905 23:54:38.214498 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"a4bb29ad-8766-47fe-9303-a89374119066\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5f9f4498b8-wxd6q" podUID="a4bb29ad-8766-47fe-9303-a89374119066" Sep 5 23:54:38.225938 containerd[2021]: time="2025-09-05T23:54:38.225857225Z" level=error msg="StopPodSandbox for \"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\" failed" error="failed to destroy network for sandbox \"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:38.226221 kubelet[3339]: E0905 23:54:38.226154 3339 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Sep 5 23:54:38.226307 kubelet[3339]: E0905 23:54:38.226233 3339 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233"} Sep 5 23:54:38.226307 kubelet[3339]: E0905 23:54:38.226296 3339 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ad610cb3-9cf1-433c-bfe6-870f9da7a8a7\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 5 23:54:38.226491 kubelet[3339]: E0905 23:54:38.226337 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ad610cb3-9cf1-433c-bfe6-870f9da7a8a7\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5f77cc885-9q8qs" podUID="ad610cb3-9cf1-433c-bfe6-870f9da7a8a7" Sep 5 23:54:38.226854 containerd[2021]: time="2025-09-05T23:54:38.226651301Z" level=error msg="StopPodSandbox for \"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\" failed" error="failed to destroy network for sandbox \"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:38.227336 kubelet[3339]: E0905 23:54:38.227023 3339 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Sep 5 23:54:38.227336 kubelet[3339]: E0905 23:54:38.227083 3339 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8"} Sep 5 23:54:38.227336 kubelet[3339]: E0905 23:54:38.227150 3339 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"91054442-1176-471c-b32d-508eba32633a\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 5 23:54:38.227336 kubelet[3339]: E0905 23:54:38.227191 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for 
\"91054442-1176-471c-b32d-508eba32633a\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5bcc458d6d-445tw" podUID="91054442-1176-471c-b32d-508eba32633a" Sep 5 23:54:38.231162 containerd[2021]: time="2025-09-05T23:54:38.230947949Z" level=error msg="StopPodSandbox for \"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\" failed" error="failed to destroy network for sandbox \"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:38.232547 kubelet[3339]: E0905 23:54:38.231512 3339 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Sep 5 23:54:38.232547 kubelet[3339]: E0905 23:54:38.231621 3339 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9"} Sep 5 23:54:38.232547 kubelet[3339]: E0905 23:54:38.231674 3339 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"08545de3-6592-4965-ae61-4807250e2870\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 5 23:54:38.232547 kubelet[3339]: E0905 23:54:38.231713 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"08545de3-6592-4965-ae61-4807250e2870\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-6vhqh" podUID="08545de3-6592-4965-ae61-4807250e2870" Sep 5 23:54:38.250024 containerd[2021]: time="2025-09-05T23:54:38.249439314Z" level=error msg="StopPodSandbox for \"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\" failed" error="failed to destroy network for sandbox \"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:54:38.250156 kubelet[3339]: E0905 23:54:38.249781 3339 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: 
code = Unknown desc = failed to destroy network for sandbox \"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Sep 5 23:54:38.250156 kubelet[3339]: E0905 23:54:38.249844 3339 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd"} Sep 5 23:54:38.250156 kubelet[3339]: E0905 23:54:38.249897 3339 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"dd5f7f5e-b0cb-44ef-aa90-0246eeadd9ea\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 5 23:54:38.250156 kubelet[3339]: E0905 23:54:38.249942 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"dd5f7f5e-b0cb-44ef-aa90-0246eeadd9ea\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5f77cc885-km8wb" podUID="dd5f7f5e-b0cb-44ef-aa90-0246eeadd9ea" Sep 5 23:54:43.178448 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3248497456.mount: Deactivated successfully. 
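The seven StopPodSandbox failures above share one root cause: on the delete path the Calico CNI plugin stats /var/lib/calico/nodename, a file that only exists once the calico/node container is running and has mounted /var/lib/calico/ (the error text says as much). Until calico-node starts a few seconds later, every teardown fails and kubelet keeps retrying each pod. A minimal Go sketch of that guard, with a hypothetical function name rather than Calico's actual source:

```go
package main

import (
	"fmt"
	"os"
)

const nodenameFile = "/var/lib/calico/nodename"

// readNodename mirrors the precondition implied by the log: the CNI
// plugin cannot resolve which Calico node it belongs to until
// calico-node has written this file.
func readNodename() (string, error) {
	b, err := os.ReadFile(nodenameFile)
	if os.IsNotExist(err) {
		return "", fmt.Errorf("stat %s: no such file or directory: "+
			"check that the calico/node container is running and has mounted /var/lib/calico/",
			nodenameFile)
	}
	if err != nil {
		return "", err
	}
	return string(b), nil
}

func main() {
	if _, err := readNodename(); err != nil {
		fmt.Println("delete failed:", err) // kubelet retries the sandbox teardown
	}
}
```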
Sep 5 23:54:43.243280 containerd[2021]: time="2025-09-05T23:54:43.242694250Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:54:43.245805 containerd[2021]: time="2025-09-05T23:54:43.245723566Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 5 23:54:43.249571 containerd[2021]: time="2025-09-05T23:54:43.248448370Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:54:43.253192 containerd[2021]: time="2025-09-05T23:54:43.253129210Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:54:43.254693 containerd[2021]: time="2025-09-05T23:54:43.254620438Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 6.318394231s" Sep 5 23:54:43.254866 containerd[2021]: time="2025-09-05T23:54:43.254696554Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 5 23:54:43.298915 containerd[2021]: time="2025-09-05T23:54:43.298833887Z" level=info msg="CreateContainer within sandbox \"013e02d5dcd76fcb0086826df17c0ce2839a6cbed4114e4658dfe3091adbd74e\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 5 23:54:43.363082 containerd[2021]: time="2025-09-05T23:54:43.362973395Z" level=info msg="CreateContainer within sandbox \"013e02d5dcd76fcb0086826df17c0ce2839a6cbed4114e4658dfe3091adbd74e\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"9e16f9bf37a4d1c84d54611020423880706f5a22f6780ea32832c9058820b91b\"" Sep 5 23:54:43.365291 containerd[2021]: time="2025-09-05T23:54:43.365160083Z" level=info msg="StartContainer for \"9e16f9bf37a4d1c84d54611020423880706f5a22f6780ea32832c9058820b91b\"" Sep 5 23:54:43.437576 systemd[1]: Started cri-containerd-9e16f9bf37a4d1c84d54611020423880706f5a22f6780ea32832c9058820b91b.scope - libcontainer container 9e16f9bf37a4d1c84d54611020423880706f5a22f6780ea32832c9058820b91b. Sep 5 23:54:43.521369 containerd[2021]: time="2025-09-05T23:54:43.521308812Z" level=info msg="StartContainer for \"9e16f9bf37a4d1c84d54611020423880706f5a22f6780ea32832c9058820b91b\" returns successfully" Sep 5 23:54:43.771499 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 5 23:54:43.771763 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved.
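Two things worth noting in the entries above: the pull moved 151,100,319 bytes in 6.318394231 s, an effective rate of roughly 22.8 MiB/s, and the WireGuard module loading moments after StartContainer is consistent with calico-node probing for WireGuard support at startup (my inference; the log does not say so explicitly). A quick check of the arithmetic:

```go
package main

import "fmt"

func main() {
	// Figures copied from the "Pulled image ... in 6.318394231s" entry above.
	const imageBytes = 151100319.0
	const pullSeconds = 6.318394231

	fmt.Printf("effective pull rate: %.1f MiB/s\n",
		imageBytes/pullSeconds/(1<<20)) // ≈ 22.8
}
```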
Sep 5 23:54:43.987300 containerd[2021]: time="2025-09-05T23:54:43.987220922Z" level=info msg="StopPodSandbox for \"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\"" Sep 5 23:54:44.241001 kubelet[3339]: I0905 23:54:44.235532 3339 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-476k5" podStartSLOduration=1.760640231 podStartE2EDuration="17.235485179s" podCreationTimestamp="2025-09-05 23:54:27 +0000 UTC" firstStartedPulling="2025-09-05 23:54:27.782189146 +0000 UTC m=+31.363251769" lastFinishedPulling="2025-09-05 23:54:43.257034106 +0000 UTC m=+46.838096717" observedRunningTime="2025-09-05 23:54:44.202768235 +0000 UTC m=+47.783830942" watchObservedRunningTime="2025-09-05 23:54:44.235485179 +0000 UTC m=+47.816547802" Sep 5 23:54:44.437654 containerd[2021]: 2025-09-05 23:54:44.248 [INFO][4581] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Sep 5 23:54:44.437654 containerd[2021]: 2025-09-05 23:54:44.249 [INFO][4581] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" iface="eth0" netns="/var/run/netns/cni-75a2e824-999d-8483-4396-d3fc25d361d5" Sep 5 23:54:44.437654 containerd[2021]: 2025-09-05 23:54:44.251 [INFO][4581] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" iface="eth0" netns="/var/run/netns/cni-75a2e824-999d-8483-4396-d3fc25d361d5" Sep 5 23:54:44.437654 containerd[2021]: 2025-09-05 23:54:44.255 [INFO][4581] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" iface="eth0" netns="/var/run/netns/cni-75a2e824-999d-8483-4396-d3fc25d361d5" Sep 5 23:54:44.437654 containerd[2021]: 2025-09-05 23:54:44.255 [INFO][4581] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Sep 5 23:54:44.437654 containerd[2021]: 2025-09-05 23:54:44.255 [INFO][4581] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Sep 5 23:54:44.437654 containerd[2021]: 2025-09-05 23:54:44.391 [INFO][4609] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" HandleID="k8s-pod-network.e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Workload="ip--172--31--18--129-k8s-whisker--5bcc458d6d--445tw-eth0" Sep 5 23:54:44.437654 containerd[2021]: 2025-09-05 23:54:44.392 [INFO][4609] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:44.437654 containerd[2021]: 2025-09-05 23:54:44.393 [INFO][4609] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:44.437654 containerd[2021]: 2025-09-05 23:54:44.416 [WARNING][4609] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" HandleID="k8s-pod-network.e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Workload="ip--172--31--18--129-k8s-whisker--5bcc458d6d--445tw-eth0" Sep 5 23:54:44.437654 containerd[2021]: 2025-09-05 23:54:44.417 [INFO][4609] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" HandleID="k8s-pod-network.e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Workload="ip--172--31--18--129-k8s-whisker--5bcc458d6d--445tw-eth0" Sep 5 23:54:44.437654 containerd[2021]: 2025-09-05 23:54:44.423 [INFO][4609] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:44.437654 containerd[2021]: 2025-09-05 23:54:44.433 [INFO][4581] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Sep 5 23:54:44.443313 containerd[2021]: time="2025-09-05T23:54:44.437880612Z" level=info msg="TearDown network for sandbox \"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\" successfully" Sep 5 23:54:44.443313 containerd[2021]: time="2025-09-05T23:54:44.437923452Z" level=info msg="StopPodSandbox for \"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\" returns successfully" Sep 5 23:54:44.445106 systemd[1]: run-netns-cni\x2d75a2e824\x2d999d\x2d8483\x2d4396\x2dd3fc25d361d5.mount: Deactivated successfully. Sep 5 23:54:44.524568 kubelet[3339]: I0905 23:54:44.524085 3339 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/91054442-1176-471c-b32d-508eba32633a-whisker-backend-key-pair\") pod \"91054442-1176-471c-b32d-508eba32633a\" (UID: \"91054442-1176-471c-b32d-508eba32633a\") " Sep 5 23:54:44.524568 kubelet[3339]: I0905 23:54:44.524164 3339 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91054442-1176-471c-b32d-508eba32633a-whisker-ca-bundle\") pod \"91054442-1176-471c-b32d-508eba32633a\" (UID: \"91054442-1176-471c-b32d-508eba32633a\") " Sep 5 23:54:44.524568 kubelet[3339]: I0905 23:54:44.524320 3339 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69rsp\" (UniqueName: \"kubernetes.io/projected/91054442-1176-471c-b32d-508eba32633a-kube-api-access-69rsp\") pod \"91054442-1176-471c-b32d-508eba32633a\" (UID: \"91054442-1176-471c-b32d-508eba32633a\") " Sep 5 23:54:44.528564 kubelet[3339]: I0905 23:54:44.528124 3339 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91054442-1176-471c-b32d-508eba32633a-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "91054442-1176-471c-b32d-508eba32633a" (UID: "91054442-1176-471c-b32d-508eba32633a"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 5 23:54:44.533688 kubelet[3339]: I0905 23:54:44.532801 3339 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91054442-1176-471c-b32d-508eba32633a-kube-api-access-69rsp" (OuterVolumeSpecName: "kube-api-access-69rsp") pod "91054442-1176-471c-b32d-508eba32633a" (UID: "91054442-1176-471c-b32d-508eba32633a"). InnerVolumeSpecName "kube-api-access-69rsp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 5 23:54:44.537452 systemd[1]: var-lib-kubelet-pods-91054442\x2d1176\x2d471c\x2db32d\x2d508eba32633a-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d69rsp.mount: Deactivated successfully. Sep 5 23:54:44.543893 systemd[1]: var-lib-kubelet-pods-91054442\x2d1176\x2d471c\x2db32d\x2d508eba32633a-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 5 23:54:44.544422 kubelet[3339]: I0905 23:54:44.544348 3339 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91054442-1176-471c-b32d-508eba32633a-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "91054442-1176-471c-b32d-508eba32633a" (UID: "91054442-1176-471c-b32d-508eba32633a"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 5 23:54:44.624791 kubelet[3339]: I0905 23:54:44.624667 3339 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69rsp\" (UniqueName: \"kubernetes.io/projected/91054442-1176-471c-b32d-508eba32633a-kube-api-access-69rsp\") on node \"ip-172-31-18-129\" DevicePath \"\"" Sep 5 23:54:44.624791 kubelet[3339]: I0905 23:54:44.624724 3339 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/91054442-1176-471c-b32d-508eba32633a-whisker-backend-key-pair\") on node \"ip-172-31-18-129\" DevicePath \"\"" Sep 5 23:54:44.624791 kubelet[3339]: I0905 23:54:44.624754 3339 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91054442-1176-471c-b32d-508eba32633a-whisker-ca-bundle\") on node \"ip-172-31-18-129\" DevicePath \"\"" Sep 5 23:54:44.669745 systemd[1]: Removed slice kubepods-besteffort-pod91054442_1176_471c_b32d_508eba32633a.slice - libcontainer container kubepods-besteffort-pod91054442_1176_471c_b32d_508eba32633a.slice. Sep 5 23:54:45.272821 systemd[1]: Created slice kubepods-besteffort-pode10c3009_95ec_4fc2_8f2e_e788b81ab6e5.slice - libcontainer container kubepods-besteffort-pode10c3009_95ec_4fc2_8f2e_e788b81ab6e5.slice. 
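The teardown above runs in the order kubelet's reconciler enforces for the removed whisker pod: unmount the projected service-account token, the TLS key pair, and the CA bundle, mark each volume detached, then drop the pod's besteffort cgroup slice. The mount units systemd deactivates in between carry escaped paths ("\x2d" for "-", "\x7e" for "~", bare "-" for "/"). A small decoder sketch of that escaping, mine rather than a systemd API:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// unescapeMountUnit reverses the systemd unit-name escaping seen in the log.
func unescapeMountUnit(unit string) string {
	name := strings.TrimSuffix(unit, ".mount")
	var b strings.Builder
	for i := 0; i < len(name); i++ {
		switch {
		case strings.HasPrefix(name[i:], "\\x") && i+4 <= len(name):
			if v, err := strconv.ParseUint(name[i+2:i+4], 16, 8); err == nil {
				b.WriteByte(byte(v)) // \x2d -> '-', \x7e -> '~'
				i += 3
				continue
			}
			b.WriteByte(name[i])
		case name[i] == '-':
			b.WriteByte('/') // bare '-' encodes the path separator
		default:
			b.WriteByte(name[i])
		}
	}
	return "/" + b.String()
}

func main() {
	unit := `var-lib-kubelet-pods-91054442\x2d1176\x2d471c\x2db32d\x2d508eba32633a-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d69rsp.mount`
	fmt.Println(unescapeMountUnit(unit))
	// /var/lib/kubelet/pods/91054442-1176-471c-b32d-508eba32633a/volumes/kubernetes.io~projected/kube-api-access-69rsp
}
```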
Sep 5 23:54:45.329911 kubelet[3339]: I0905 23:54:45.329830 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e10c3009-95ec-4fc2-8f2e-e788b81ab6e5-whisker-backend-key-pair\") pod \"whisker-745fdbc6d7-rz6db\" (UID: \"e10c3009-95ec-4fc2-8f2e-e788b81ab6e5\") " pod="calico-system/whisker-745fdbc6d7-rz6db" Sep 5 23:54:45.329911 kubelet[3339]: I0905 23:54:45.329914 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj85f\" (UniqueName: \"kubernetes.io/projected/e10c3009-95ec-4fc2-8f2e-e788b81ab6e5-kube-api-access-gj85f\") pod \"whisker-745fdbc6d7-rz6db\" (UID: \"e10c3009-95ec-4fc2-8f2e-e788b81ab6e5\") " pod="calico-system/whisker-745fdbc6d7-rz6db" Sep 5 23:54:45.330557 kubelet[3339]: I0905 23:54:45.329969 3339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e10c3009-95ec-4fc2-8f2e-e788b81ab6e5-whisker-ca-bundle\") pod \"whisker-745fdbc6d7-rz6db\" (UID: \"e10c3009-95ec-4fc2-8f2e-e788b81ab6e5\") " pod="calico-system/whisker-745fdbc6d7-rz6db" Sep 5 23:54:45.583301 containerd[2021]: time="2025-09-05T23:54:45.582373238Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-745fdbc6d7-rz6db,Uid:e10c3009-95ec-4fc2-8f2e-e788b81ab6e5,Namespace:calico-system,Attempt:0,}" Sep 5 23:54:45.869590 (udev-worker)[4560]: Network interface NamePolicy= disabled on kernel command line. Sep 5 23:54:45.872751 systemd-networkd[1930]: cali276b13c3c74: Link UP Sep 5 23:54:45.873267 systemd-networkd[1930]: cali276b13c3c74: Gained carrier Sep 5 23:54:45.916470 containerd[2021]: 2025-09-05 23:54:45.672 [INFO][4653] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 5 23:54:45.916470 containerd[2021]: 2025-09-05 23:54:45.706 [INFO][4653] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--18--129-k8s-whisker--745fdbc6d7--rz6db-eth0 whisker-745fdbc6d7- calico-system e10c3009-95ec-4fc2-8f2e-e788b81ab6e5 901 0 2025-09-05 23:54:45 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:745fdbc6d7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ip-172-31-18-129 whisker-745fdbc6d7-rz6db eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali276b13c3c74 [] [] }} ContainerID="728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e" Namespace="calico-system" Pod="whisker-745fdbc6d7-rz6db" WorkloadEndpoint="ip--172--31--18--129-k8s-whisker--745fdbc6d7--rz6db-" Sep 5 23:54:45.916470 containerd[2021]: 2025-09-05 23:54:45.706 [INFO][4653] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e" Namespace="calico-system" Pod="whisker-745fdbc6d7-rz6db" WorkloadEndpoint="ip--172--31--18--129-k8s-whisker--745fdbc6d7--rz6db-eth0" Sep 5 23:54:45.916470 containerd[2021]: 2025-09-05 23:54:45.778 [INFO][4692] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e" HandleID="k8s-pod-network.728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e" Workload="ip--172--31--18--129-k8s-whisker--745fdbc6d7--rz6db-eth0" Sep 5 23:54:45.916470 containerd[2021]: 
2025-09-05 23:54:45.781 [INFO][4692] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e" HandleID="k8s-pod-network.728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e" Workload="ip--172--31--18--129-k8s-whisker--745fdbc6d7--rz6db-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b900), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-18-129", "pod":"whisker-745fdbc6d7-rz6db", "timestamp":"2025-09-05 23:54:45.778926555 +0000 UTC"}, Hostname:"ip-172-31-18-129", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 23:54:45.916470 containerd[2021]: 2025-09-05 23:54:45.781 [INFO][4692] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:45.916470 containerd[2021]: 2025-09-05 23:54:45.782 [INFO][4692] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:45.916470 containerd[2021]: 2025-09-05 23:54:45.782 [INFO][4692] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-18-129' Sep 5 23:54:45.916470 containerd[2021]: 2025-09-05 23:54:45.797 [INFO][4692] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e" host="ip-172-31-18-129" Sep 5 23:54:45.916470 containerd[2021]: 2025-09-05 23:54:45.808 [INFO][4692] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-18-129" Sep 5 23:54:45.916470 containerd[2021]: 2025-09-05 23:54:45.816 [INFO][4692] ipam/ipam.go 511: Trying affinity for 192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:45.916470 containerd[2021]: 2025-09-05 23:54:45.819 [INFO][4692] ipam/ipam.go 158: Attempting to load block cidr=192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:45.916470 containerd[2021]: 2025-09-05 23:54:45.823 [INFO][4692] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:45.916470 containerd[2021]: 2025-09-05 23:54:45.823 [INFO][4692] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.105.192/26 handle="k8s-pod-network.728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e" host="ip-172-31-18-129" Sep 5 23:54:45.916470 containerd[2021]: 2025-09-05 23:54:45.826 [INFO][4692] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e Sep 5 23:54:45.916470 containerd[2021]: 2025-09-05 23:54:45.834 [INFO][4692] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.105.192/26 handle="k8s-pod-network.728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e" host="ip-172-31-18-129" Sep 5 23:54:45.916470 containerd[2021]: 2025-09-05 23:54:45.846 [INFO][4692] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.105.193/26] block=192.168.105.192/26 handle="k8s-pod-network.728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e" host="ip-172-31-18-129" Sep 5 23:54:45.916470 containerd[2021]: 2025-09-05 23:54:45.847 [INFO][4692] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.105.193/26] handle="k8s-pod-network.728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e" host="ip-172-31-18-129" Sep 5 23:54:45.916470 containerd[2021]: 2025-09-05 23:54:45.847 [INFO][4692] ipam/ipam_plugin.go 374: Released 
host-wide IPAM lock. Sep 5 23:54:45.916470 containerd[2021]: 2025-09-05 23:54:45.847 [INFO][4692] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.193/26] IPv6=[] ContainerID="728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e" HandleID="k8s-pod-network.728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e" Workload="ip--172--31--18--129-k8s-whisker--745fdbc6d7--rz6db-eth0" Sep 5 23:54:45.917977 containerd[2021]: 2025-09-05 23:54:45.853 [INFO][4653] cni-plugin/k8s.go 418: Populated endpoint ContainerID="728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e" Namespace="calico-system" Pod="whisker-745fdbc6d7-rz6db" WorkloadEndpoint="ip--172--31--18--129-k8s-whisker--745fdbc6d7--rz6db-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-whisker--745fdbc6d7--rz6db-eth0", GenerateName:"whisker-745fdbc6d7-", Namespace:"calico-system", SelfLink:"", UID:"e10c3009-95ec-4fc2-8f2e-e788b81ab6e5", ResourceVersion:"901", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"745fdbc6d7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"", Pod:"whisker-745fdbc6d7-rz6db", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.105.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali276b13c3c74", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:45.917977 containerd[2021]: 2025-09-05 23:54:45.853 [INFO][4653] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.105.193/32] ContainerID="728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e" Namespace="calico-system" Pod="whisker-745fdbc6d7-rz6db" WorkloadEndpoint="ip--172--31--18--129-k8s-whisker--745fdbc6d7--rz6db-eth0" Sep 5 23:54:45.917977 containerd[2021]: 2025-09-05 23:54:45.853 [INFO][4653] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali276b13c3c74 ContainerID="728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e" Namespace="calico-system" Pod="whisker-745fdbc6d7-rz6db" WorkloadEndpoint="ip--172--31--18--129-k8s-whisker--745fdbc6d7--rz6db-eth0" Sep 5 23:54:45.917977 containerd[2021]: 2025-09-05 23:54:45.876 [INFO][4653] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e" Namespace="calico-system" Pod="whisker-745fdbc6d7-rz6db" WorkloadEndpoint="ip--172--31--18--129-k8s-whisker--745fdbc6d7--rz6db-eth0" Sep 5 23:54:45.917977 containerd[2021]: 2025-09-05 23:54:45.878 [INFO][4653] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e" Namespace="calico-system" Pod="whisker-745fdbc6d7-rz6db" 
WorkloadEndpoint="ip--172--31--18--129-k8s-whisker--745fdbc6d7--rz6db-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-whisker--745fdbc6d7--rz6db-eth0", GenerateName:"whisker-745fdbc6d7-", Namespace:"calico-system", SelfLink:"", UID:"e10c3009-95ec-4fc2-8f2e-e788b81ab6e5", ResourceVersion:"901", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"745fdbc6d7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e", Pod:"whisker-745fdbc6d7-rz6db", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.105.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali276b13c3c74", MAC:"92:b9:a4:a9:46:20", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:45.917977 containerd[2021]: 2025-09-05 23:54:45.905 [INFO][4653] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e" Namespace="calico-system" Pod="whisker-745fdbc6d7-rz6db" WorkloadEndpoint="ip--172--31--18--129-k8s-whisker--745fdbc6d7--rz6db-eth0" Sep 5 23:54:45.958991 containerd[2021]: time="2025-09-05T23:54:45.958778824Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:54:45.958991 containerd[2021]: time="2025-09-05T23:54:45.958921048Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:54:45.963539 containerd[2021]: time="2025-09-05T23:54:45.958959904Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:45.967821 containerd[2021]: time="2025-09-05T23:54:45.966704440Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:46.037097 systemd[1]: Started cri-containerd-728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e.scope - libcontainer container 728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e. 
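The ipam/ipam.go trace above is the whole allocation in miniature: take the host-wide lock, confirm this node's affinity for block 192.168.105.192/26, claim the first free address (192.168.105.193), write the block back, release the lock. A /26 holds 2^(32-26) = 64 addresses, .192 through .255. A sketch of the block arithmetic using net/netip; the first-fit walk is a simplification, since the real allocator tracks per-address handles:

```go
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	// Block and address taken from the IPAM log entries above.
	block := netip.MustParsePrefix("192.168.105.192/26") // 64 addresses: .192-.255
	claimed := netip.MustParseAddr("192.168.105.193")

	fmt.Println(block.Contains(claimed)) // true

	// First-fit walk, skipping the network address itself.
	for a := block.Addr().Next(); block.Contains(a); a = a.Next() {
		fmt.Println("first candidate:", a) // 192.168.105.193
		break
	}
}
```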
Sep 5 23:54:46.208354 containerd[2021]: time="2025-09-05T23:54:46.204211021Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-745fdbc6d7-rz6db,Uid:e10c3009-95ec-4fc2-8f2e-e788b81ab6e5,Namespace:calico-system,Attempt:0,} returns sandbox id \"728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e\"" Sep 5 23:54:46.239246 containerd[2021]: time="2025-09-05T23:54:46.235790017Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 5 23:54:46.666719 kubelet[3339]: I0905 23:54:46.666596 3339 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91054442-1176-471c-b32d-508eba32633a" path="/var/lib/kubelet/pods/91054442-1176-471c-b32d-508eba32633a/volumes" Sep 5 23:54:46.811581 kernel: bpftool[4860]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Sep 5 23:54:47.151395 systemd-networkd[1930]: vxlan.calico: Link UP Sep 5 23:54:47.151411 systemd-networkd[1930]: vxlan.calico: Gained carrier Sep 5 23:54:47.165767 systemd-networkd[1930]: cali276b13c3c74: Gained IPv6LL Sep 5 23:54:47.219452 (udev-worker)[4562]: Network interface NamePolicy= disabled on kernel command line. Sep 5 23:54:48.092184 containerd[2021]: time="2025-09-05T23:54:48.089795006Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:54:48.093098 containerd[2021]: time="2025-09-05T23:54:48.093033770Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 5 23:54:48.095243 containerd[2021]: time="2025-09-05T23:54:48.095186882Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:54:48.105442 containerd[2021]: time="2025-09-05T23:54:48.105329474Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:54:48.109094 containerd[2021]: time="2025-09-05T23:54:48.109000898Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 1.873139529s" Sep 5 23:54:48.109480 containerd[2021]: time="2025-09-05T23:54:48.109436726Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 5 23:54:48.140791 containerd[2021]: time="2025-09-05T23:54:48.140308671Z" level=info msg="CreateContainer within sandbox \"728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 5 23:54:48.201978 containerd[2021]: time="2025-09-05T23:54:48.201915267Z" level=info msg="CreateContainer within sandbox \"728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"042757e9ac95c2cbc4789083942b2be60b269bd52dad084fcf1556e624025208\"" Sep 5 23:54:48.203399 containerd[2021]: time="2025-09-05T23:54:48.203238447Z" level=info msg="StartContainer for 
\"042757e9ac95c2cbc4789083942b2be60b269bd52dad084fcf1556e624025208\"" Sep 5 23:54:48.253899 systemd-networkd[1930]: vxlan.calico: Gained IPv6LL Sep 5 23:54:48.279899 systemd[1]: Started cri-containerd-042757e9ac95c2cbc4789083942b2be60b269bd52dad084fcf1556e624025208.scope - libcontainer container 042757e9ac95c2cbc4789083942b2be60b269bd52dad084fcf1556e624025208. Sep 5 23:54:48.354143 containerd[2021]: time="2025-09-05T23:54:48.353968492Z" level=info msg="StartContainer for \"042757e9ac95c2cbc4789083942b2be60b269bd52dad084fcf1556e624025208\" returns successfully" Sep 5 23:54:48.356802 containerd[2021]: time="2025-09-05T23:54:48.356578300Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 5 23:54:49.665591 containerd[2021]: time="2025-09-05T23:54:49.665427234Z" level=info msg="StopPodSandbox for \"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\"" Sep 5 23:54:50.053044 containerd[2021]: 2025-09-05 23:54:49.959 [INFO][4988] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Sep 5 23:54:50.053044 containerd[2021]: 2025-09-05 23:54:49.959 [INFO][4988] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" iface="eth0" netns="/var/run/netns/cni-ee44fd56-932a-21b2-3fa9-ebc843e2959d" Sep 5 23:54:50.053044 containerd[2021]: 2025-09-05 23:54:49.960 [INFO][4988] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" iface="eth0" netns="/var/run/netns/cni-ee44fd56-932a-21b2-3fa9-ebc843e2959d" Sep 5 23:54:50.053044 containerd[2021]: 2025-09-05 23:54:49.962 [INFO][4988] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" iface="eth0" netns="/var/run/netns/cni-ee44fd56-932a-21b2-3fa9-ebc843e2959d" Sep 5 23:54:50.053044 containerd[2021]: 2025-09-05 23:54:49.962 [INFO][4988] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Sep 5 23:54:50.053044 containerd[2021]: 2025-09-05 23:54:49.962 [INFO][4988] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Sep 5 23:54:50.053044 containerd[2021]: 2025-09-05 23:54:50.020 [INFO][4995] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" HandleID="k8s-pod-network.e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0" Sep 5 23:54:50.053044 containerd[2021]: 2025-09-05 23:54:50.020 [INFO][4995] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:50.053044 containerd[2021]: 2025-09-05 23:54:50.020 [INFO][4995] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:50.053044 containerd[2021]: 2025-09-05 23:54:50.037 [WARNING][4995] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" HandleID="k8s-pod-network.e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0" Sep 5 23:54:50.053044 containerd[2021]: 2025-09-05 23:54:50.038 [INFO][4995] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" HandleID="k8s-pod-network.e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0" Sep 5 23:54:50.053044 containerd[2021]: 2025-09-05 23:54:50.042 [INFO][4995] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:50.053044 containerd[2021]: 2025-09-05 23:54:50.047 [INFO][4988] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Sep 5 23:54:50.056441 containerd[2021]: time="2025-09-05T23:54:50.055392448Z" level=info msg="TearDown network for sandbox \"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\" successfully" Sep 5 23:54:50.056441 containerd[2021]: time="2025-09-05T23:54:50.055442392Z" level=info msg="StopPodSandbox for \"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\" returns successfully" Sep 5 23:54:50.059017 containerd[2021]: time="2025-09-05T23:54:50.057983644Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f77cc885-km8wb,Uid:dd5f7f5e-b0cb-44ef-aa90-0246eeadd9ea,Namespace:calico-apiserver,Attempt:1,}" Sep 5 23:54:50.061249 systemd[1]: run-netns-cni\x2dee44fd56\x2d932a\x2d21b2\x2d3fa9\x2debc843e2959d.mount: Deactivated successfully. Sep 5 23:54:50.439112 systemd-networkd[1930]: cali869289ca11c: Link UP Sep 5 23:54:50.444096 systemd-networkd[1930]: cali869289ca11c: Gained carrier Sep 5 23:54:50.492653 containerd[2021]: 2025-09-05 23:54:50.213 [INFO][5002] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0 calico-apiserver-5f77cc885- calico-apiserver dd5f7f5e-b0cb-44ef-aa90-0246eeadd9ea 923 0 2025-09-05 23:54:16 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5f77cc885 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-18-129 calico-apiserver-5f77cc885-km8wb eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali869289ca11c [] [] }} ContainerID="0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd" Namespace="calico-apiserver" Pod="calico-apiserver-5f77cc885-km8wb" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-" Sep 5 23:54:50.492653 containerd[2021]: 2025-09-05 23:54:50.214 [INFO][5002] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd" Namespace="calico-apiserver" Pod="calico-apiserver-5f77cc885-km8wb" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0" Sep 5 23:54:50.492653 containerd[2021]: 2025-09-05 23:54:50.325 [INFO][5015] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd" 
HandleID="k8s-pod-network.0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0" Sep 5 23:54:50.492653 containerd[2021]: 2025-09-05 23:54:50.329 [INFO][5015] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd" HandleID="k8s-pod-network.0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c680), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-18-129", "pod":"calico-apiserver-5f77cc885-km8wb", "timestamp":"2025-09-05 23:54:50.325439657 +0000 UTC"}, Hostname:"ip-172-31-18-129", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 23:54:50.492653 containerd[2021]: 2025-09-05 23:54:50.330 [INFO][5015] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:50.492653 containerd[2021]: 2025-09-05 23:54:50.330 [INFO][5015] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:50.492653 containerd[2021]: 2025-09-05 23:54:50.331 [INFO][5015] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-18-129' Sep 5 23:54:50.492653 containerd[2021]: 2025-09-05 23:54:50.356 [INFO][5015] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd" host="ip-172-31-18-129" Sep 5 23:54:50.492653 containerd[2021]: 2025-09-05 23:54:50.366 [INFO][5015] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-18-129" Sep 5 23:54:50.492653 containerd[2021]: 2025-09-05 23:54:50.374 [INFO][5015] ipam/ipam.go 511: Trying affinity for 192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:50.492653 containerd[2021]: 2025-09-05 23:54:50.378 [INFO][5015] ipam/ipam.go 158: Attempting to load block cidr=192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:50.492653 containerd[2021]: 2025-09-05 23:54:50.383 [INFO][5015] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:50.492653 containerd[2021]: 2025-09-05 23:54:50.383 [INFO][5015] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.105.192/26 handle="k8s-pod-network.0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd" host="ip-172-31-18-129" Sep 5 23:54:50.492653 containerd[2021]: 2025-09-05 23:54:50.386 [INFO][5015] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd Sep 5 23:54:50.492653 containerd[2021]: 2025-09-05 23:54:50.394 [INFO][5015] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.105.192/26 handle="k8s-pod-network.0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd" host="ip-172-31-18-129" Sep 5 23:54:50.492653 containerd[2021]: 2025-09-05 23:54:50.413 [INFO][5015] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.105.194/26] block=192.168.105.192/26 handle="k8s-pod-network.0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd" host="ip-172-31-18-129" Sep 5 23:54:50.492653 containerd[2021]: 2025-09-05 23:54:50.413 [INFO][5015] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: 
[192.168.105.194/26] handle="k8s-pod-network.0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd" host="ip-172-31-18-129" Sep 5 23:54:50.492653 containerd[2021]: 2025-09-05 23:54:50.413 [INFO][5015] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:50.492653 containerd[2021]: 2025-09-05 23:54:50.414 [INFO][5015] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.194/26] IPv6=[] ContainerID="0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd" HandleID="k8s-pod-network.0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0" Sep 5 23:54:50.495979 containerd[2021]: 2025-09-05 23:54:50.428 [INFO][5002] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd" Namespace="calico-apiserver" Pod="calico-apiserver-5f77cc885-km8wb" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0", GenerateName:"calico-apiserver-5f77cc885-", Namespace:"calico-apiserver", SelfLink:"", UID:"dd5f7f5e-b0cb-44ef-aa90-0246eeadd9ea", ResourceVersion:"923", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f77cc885", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"", Pod:"calico-apiserver-5f77cc885-km8wb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.105.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali869289ca11c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:50.495979 containerd[2021]: 2025-09-05 23:54:50.429 [INFO][5002] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.105.194/32] ContainerID="0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd" Namespace="calico-apiserver" Pod="calico-apiserver-5f77cc885-km8wb" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0" Sep 5 23:54:50.495979 containerd[2021]: 2025-09-05 23:54:50.429 [INFO][5002] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali869289ca11c ContainerID="0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd" Namespace="calico-apiserver" Pod="calico-apiserver-5f77cc885-km8wb" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0" Sep 5 23:54:50.495979 containerd[2021]: 2025-09-05 23:54:50.447 [INFO][5002] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd" 
Namespace="calico-apiserver" Pod="calico-apiserver-5f77cc885-km8wb" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0" Sep 5 23:54:50.495979 containerd[2021]: 2025-09-05 23:54:50.448 [INFO][5002] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd" Namespace="calico-apiserver" Pod="calico-apiserver-5f77cc885-km8wb" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0", GenerateName:"calico-apiserver-5f77cc885-", Namespace:"calico-apiserver", SelfLink:"", UID:"dd5f7f5e-b0cb-44ef-aa90-0246eeadd9ea", ResourceVersion:"923", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f77cc885", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd", Pod:"calico-apiserver-5f77cc885-km8wb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.105.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali869289ca11c", MAC:"da:71:3c:86:0a:11", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:50.495979 containerd[2021]: 2025-09-05 23:54:50.481 [INFO][5002] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd" Namespace="calico-apiserver" Pod="calico-apiserver-5f77cc885-km8wb" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0" Sep 5 23:54:50.551210 containerd[2021]: time="2025-09-05T23:54:50.550737799Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:54:50.554053 containerd[2021]: time="2025-09-05T23:54:50.553331767Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:54:50.554053 containerd[2021]: time="2025-09-05T23:54:50.553389007Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:50.554053 containerd[2021]: time="2025-09-05T23:54:50.553577803Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:50.608301 systemd[1]: Started cri-containerd-0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd.scope - libcontainer container 0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd. 
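Note how this second sandbox reuses the node's affine /26 and takes the next free address, 192.168.105.194, and how the endpoint is first populated with an empty MAC, then updated to da:71:3c:86:0a:11 once the veth exists ("Added Mac, interface name, and active container ID"). That value has the locally-administered bit set, which is what you expect from a generated veth MAC rather than burned-in hardware. A two-line check:

```go
package main

import (
	"fmt"
	"net"
)

func main() {
	// MAC from the updated WorkloadEndpoint above.
	mac, err := net.ParseMAC("da:71:3c:86:0a:11")
	if err != nil {
		panic(err)
	}
	fmt.Println("locally administered:", mac[0]&0x02 != 0) // true
	fmt.Println("unicast:", mac[0]&0x01 == 0)              // true
}
```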
Sep 5 23:54:50.657230 containerd[2021]: time="2025-09-05T23:54:50.656703907Z" level=info msg="StopPodSandbox for \"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\"" Sep 5 23:54:50.799180 containerd[2021]: time="2025-09-05T23:54:50.799114172Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f77cc885-km8wb,Uid:dd5f7f5e-b0cb-44ef-aa90-0246eeadd9ea,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd\"" Sep 5 23:54:50.913152 containerd[2021]: 2025-09-05 23:54:50.818 [INFO][5074] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Sep 5 23:54:50.913152 containerd[2021]: 2025-09-05 23:54:50.820 [INFO][5074] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" iface="eth0" netns="/var/run/netns/cni-6c483a18-399c-bc2f-a919-68df9c7d6838" Sep 5 23:54:50.913152 containerd[2021]: 2025-09-05 23:54:50.822 [INFO][5074] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" iface="eth0" netns="/var/run/netns/cni-6c483a18-399c-bc2f-a919-68df9c7d6838" Sep 5 23:54:50.913152 containerd[2021]: 2025-09-05 23:54:50.823 [INFO][5074] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" iface="eth0" netns="/var/run/netns/cni-6c483a18-399c-bc2f-a919-68df9c7d6838" Sep 5 23:54:50.913152 containerd[2021]: 2025-09-05 23:54:50.825 [INFO][5074] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Sep 5 23:54:50.913152 containerd[2021]: 2025-09-05 23:54:50.825 [INFO][5074] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Sep 5 23:54:50.913152 containerd[2021]: 2025-09-05 23:54:50.872 [INFO][5086] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" HandleID="k8s-pod-network.b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0" Sep 5 23:54:50.913152 containerd[2021]: 2025-09-05 23:54:50.872 [INFO][5086] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:50.913152 containerd[2021]: 2025-09-05 23:54:50.873 [INFO][5086] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:50.913152 containerd[2021]: 2025-09-05 23:54:50.892 [WARNING][5086] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" HandleID="k8s-pod-network.b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0" Sep 5 23:54:50.913152 containerd[2021]: 2025-09-05 23:54:50.893 [INFO][5086] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" HandleID="k8s-pod-network.b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0" Sep 5 23:54:50.913152 containerd[2021]: 2025-09-05 23:54:50.898 [INFO][5086] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:50.913152 containerd[2021]: 2025-09-05 23:54:50.904 [INFO][5074] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Sep 5 23:54:50.917012 containerd[2021]: time="2025-09-05T23:54:50.916705232Z" level=info msg="TearDown network for sandbox \"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\" successfully" Sep 5 23:54:50.917345 containerd[2021]: time="2025-09-05T23:54:50.916754288Z" level=info msg="StopPodSandbox for \"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\" returns successfully" Sep 5 23:54:50.921091 containerd[2021]: time="2025-09-05T23:54:50.921024788Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-6vhqh,Uid:08545de3-6592-4965-ae61-4807250e2870,Namespace:kube-system,Attempt:1,}" Sep 5 23:54:50.925467 systemd[1]: run-netns-cni\x2d6c483a18\x2d399c\x2dbc2f\x2da919\x2d68df9c7d6838.mount: Deactivated successfully. Sep 5 23:54:51.443924 systemd-networkd[1930]: cali0a6a1f850df: Link UP Sep 5 23:54:51.444361 systemd-networkd[1930]: cali0a6a1f850df: Gained carrier Sep 5 23:54:51.460652 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1148042389.mount: Deactivated successfully. 
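The StopPodSandbox for b403d3b6… above shows how tolerant the teardown path is: the workload's veth is already gone ("Nothing to do"), and the IPAM release first tries the handle ID, logs the WARNING "Asked to release address but it doesn't exist. Ignoring", then falls back to releasing by workload ID, so a repeated or racing CNI DEL cannot fail the kubelet. systemd then unmounts the sandbox's netns bind mount (the run-netns-cni-… mount unit). A sketch of that tolerant two-step release pattern; ErrNotFound, releaseByHandle and releaseByWorkload are stand-in names for illustration, not Calico's real identifiers:

    package main

    import (
        "errors"
        "log"
    )

    // Stand-ins for the datastore calls; not Calico's real identifiers.
    var ErrNotFound = errors.New("allocation not found")

    func releaseByHandle(handleID string) error     { return ErrNotFound }
    func releaseByWorkload(workloadID string) error { return nil }

    // releaseAddress mirrors the order seen in the log: try the handle
    // first, warn-and-ignore if nothing is allocated, then fall back to
    // releasing by workload ID, which is equally tolerant.
    func releaseAddress(handleID, workloadID string) error {
        if err := releaseByHandle(handleID); err != nil {
            if errors.Is(err, ErrNotFound) {
                log.Printf("asked to release %s but it doesn't exist; ignoring", handleID)
                return releaseByWorkload(workloadID)
            }
            return err
        }
        return nil
    }

    func main() {
        if err := releaseAddress("k8s-pod-network.b403d3b6", "coredns-7c65d6cfc9-6vhqh"); err != nil {
            log.Fatal(err)
        }
    }

The same warn-and-continue sequence repeats for every sandbox torn down later in this log (f57327…, 95a0f0…, b38fe6…, 374155…), which is consistent with the pods being recreated with Attempt:1 sandboxes.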
Sep 5 23:54:51.489507 containerd[2021]: 2025-09-05 23:54:51.210 [INFO][5092] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0 coredns-7c65d6cfc9- kube-system 08545de3-6592-4965-ae61-4807250e2870 929 0 2025-09-05 23:54:02 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-18-129 coredns-7c65d6cfc9-6vhqh eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali0a6a1f850df [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6vhqh" WorkloadEndpoint="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-" Sep 5 23:54:51.489507 containerd[2021]: 2025-09-05 23:54:51.210 [INFO][5092] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6vhqh" WorkloadEndpoint="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0" Sep 5 23:54:51.489507 containerd[2021]: 2025-09-05 23:54:51.326 [INFO][5104] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae" HandleID="k8s-pod-network.307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0" Sep 5 23:54:51.489507 containerd[2021]: 2025-09-05 23:54:51.326 [INFO][5104] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae" HandleID="k8s-pod-network.307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000385870), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-18-129", "pod":"coredns-7c65d6cfc9-6vhqh", "timestamp":"2025-09-05 23:54:51.325724982 +0000 UTC"}, Hostname:"ip-172-31-18-129", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 23:54:51.489507 containerd[2021]: 2025-09-05 23:54:51.328 [INFO][5104] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:51.489507 containerd[2021]: 2025-09-05 23:54:51.329 [INFO][5104] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 5 23:54:51.489507 containerd[2021]: 2025-09-05 23:54:51.329 [INFO][5104] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-18-129' Sep 5 23:54:51.489507 containerd[2021]: 2025-09-05 23:54:51.350 [INFO][5104] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae" host="ip-172-31-18-129" Sep 5 23:54:51.489507 containerd[2021]: 2025-09-05 23:54:51.360 [INFO][5104] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-18-129" Sep 5 23:54:51.489507 containerd[2021]: 2025-09-05 23:54:51.370 [INFO][5104] ipam/ipam.go 511: Trying affinity for 192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:51.489507 containerd[2021]: 2025-09-05 23:54:51.373 [INFO][5104] ipam/ipam.go 158: Attempting to load block cidr=192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:51.489507 containerd[2021]: 2025-09-05 23:54:51.386 [INFO][5104] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:51.489507 containerd[2021]: 2025-09-05 23:54:51.386 [INFO][5104] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.105.192/26 handle="k8s-pod-network.307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae" host="ip-172-31-18-129" Sep 5 23:54:51.489507 containerd[2021]: 2025-09-05 23:54:51.392 [INFO][5104] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae Sep 5 23:54:51.489507 containerd[2021]: 2025-09-05 23:54:51.409 [INFO][5104] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.105.192/26 handle="k8s-pod-network.307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae" host="ip-172-31-18-129" Sep 5 23:54:51.489507 containerd[2021]: 2025-09-05 23:54:51.425 [INFO][5104] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.105.195/26] block=192.168.105.192/26 handle="k8s-pod-network.307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae" host="ip-172-31-18-129" Sep 5 23:54:51.489507 containerd[2021]: 2025-09-05 23:54:51.426 [INFO][5104] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.105.195/26] handle="k8s-pod-network.307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae" host="ip-172-31-18-129" Sep 5 23:54:51.489507 containerd[2021]: 2025-09-05 23:54:51.427 [INFO][5104] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
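The ipam/ipam.go entries above are one complete auto-assignment performed under the host-wide IPAM lock: look up the node's block affinities, try the affine block 192.168.105.192/26, load it, claim the next free address (192.168.105.195 for coredns-7c65d6cfc9-6vhqh), create a handle named after the sandbox, and write the block back before the lock is released. The request itself is logged verbatim as ipam.AutoAssignArgs; here it is trimmed to the fields this request actually carries, with simplified types (the full definition lives in Calico's libcalico-go IPAM package):

    package sketch

    // Trimmed approximation of the ipam.AutoAssignArgs logged above.
    type AutoAssignArgs struct {
        Num4             int               // 1: one IPv4 address wanted
        Num6             int               // 0: no IPv6
        HandleID         *string           // "k8s-pod-network.<sandboxID>"
        Attrs            map[string]string // namespace, node, pod, timestamp
        Hostname         string            // "ip-172-31-18-129"
        MaxBlocksPerHost int               // 0: no per-host cap on blocks
        IntendedUse      string            // "Workload" (a typed string upstream)
    }

Because every claim runs under that single host-wide lock, assignments on this node are strictly serialized, which is why the goldmane and csi-node-driver requests later in the log come out in order as .196 and .197.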
Sep 5 23:54:51.489507 containerd[2021]: 2025-09-05 23:54:51.428 [INFO][5104] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.195/26] IPv6=[] ContainerID="307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae" HandleID="k8s-pod-network.307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0" Sep 5 23:54:51.493536 containerd[2021]: 2025-09-05 23:54:51.434 [INFO][5092] cni-plugin/k8s.go 418: Populated endpoint ContainerID="307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6vhqh" WorkloadEndpoint="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"08545de3-6592-4965-ae61-4807250e2870", ResourceVersion:"929", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"", Pod:"coredns-7c65d6cfc9-6vhqh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.105.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0a6a1f850df", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:51.493536 containerd[2021]: 2025-09-05 23:54:51.434 [INFO][5092] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.105.195/32] ContainerID="307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6vhqh" WorkloadEndpoint="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0" Sep 5 23:54:51.493536 containerd[2021]: 2025-09-05 23:54:51.434 [INFO][5092] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0a6a1f850df ContainerID="307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6vhqh" WorkloadEndpoint="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0" Sep 5 23:54:51.493536 containerd[2021]: 2025-09-05 23:54:51.443 [INFO][5092] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6vhqh" 
WorkloadEndpoint="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0" Sep 5 23:54:51.493536 containerd[2021]: 2025-09-05 23:54:51.447 [INFO][5092] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6vhqh" WorkloadEndpoint="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"08545de3-6592-4965-ae61-4807250e2870", ResourceVersion:"929", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae", Pod:"coredns-7c65d6cfc9-6vhqh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.105.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0a6a1f850df", MAC:"fa:ac:b3:15:7f:c7", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:51.493536 containerd[2021]: 2025-09-05 23:54:51.483 [INFO][5092] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6vhqh" WorkloadEndpoint="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0" Sep 5 23:54:51.521906 containerd[2021]: time="2025-09-05T23:54:51.521832787Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:54:51.528770 containerd[2021]: time="2025-09-05T23:54:51.528433675Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 5 23:54:51.530099 containerd[2021]: time="2025-09-05T23:54:51.530047111Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:54:51.557455 containerd[2021]: time="2025-09-05T23:54:51.556126700Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:54:51.563321 containerd[2021]: time="2025-09-05T23:54:51.562776848Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 3.206116228s" Sep 5 23:54:51.563644 containerd[2021]: time="2025-09-05T23:54:51.563569904Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 5 23:54:51.568018 containerd[2021]: time="2025-09-05T23:54:51.565405784Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:54:51.568018 containerd[2021]: time="2025-09-05T23:54:51.565483184Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:54:51.568018 containerd[2021]: time="2025-09-05T23:54:51.565508048Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:51.568018 containerd[2021]: time="2025-09-05T23:54:51.566585216Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:51.571995 containerd[2021]: time="2025-09-05T23:54:51.571917944Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 5 23:54:51.573888 containerd[2021]: time="2025-09-05T23:54:51.573610640Z" level=info msg="CreateContainer within sandbox \"728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 5 23:54:51.618900 systemd[1]: Started cri-containerd-307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae.scope - libcontainer container 307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae. Sep 5 23:54:51.623246 containerd[2021]: time="2025-09-05T23:54:51.623179592Z" level=info msg="CreateContainer within sandbox \"728a73f6c9c815c22cf57d1a2f8ff6f6ac7f65d5e47a12cf3f20e792fa5bc48e\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"dc3b89d1ec244be147b2b70fc3ba005bffb883bfa9cb3862edf39c83335b8b89\"" Sep 5 23:54:51.624629 containerd[2021]: time="2025-09-05T23:54:51.624585464Z" level=info msg="StartContainer for \"dc3b89d1ec244be147b2b70fc3ba005bffb883bfa9cb3862edf39c83335b8b89\"" Sep 5 23:54:51.655060 containerd[2021]: time="2025-09-05T23:54:51.655008956Z" level=info msg="StopPodSandbox for \"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\"" Sep 5 23:54:51.660248 containerd[2021]: time="2025-09-05T23:54:51.658303052Z" level=info msg="StopPodSandbox for \"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\"" Sep 5 23:54:51.798140 systemd[1]: Started cri-containerd-dc3b89d1ec244be147b2b70fc3ba005bffb883bfa9cb3862edf39c83335b8b89.scope - libcontainer container dc3b89d1ec244be147b2b70fc3ba005bffb883bfa9cb3862edf39c83335b8b89. 
Sep 5 23:54:51.821244 containerd[2021]: time="2025-09-05T23:54:51.821188545Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-6vhqh,Uid:08545de3-6592-4965-ae61-4807250e2870,Namespace:kube-system,Attempt:1,} returns sandbox id \"307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae\"" Sep 5 23:54:51.840614 containerd[2021]: time="2025-09-05T23:54:51.839727501Z" level=info msg="CreateContainer within sandbox \"307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 5 23:54:51.914636 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3816148682.mount: Deactivated successfully. Sep 5 23:54:51.945936 containerd[2021]: time="2025-09-05T23:54:51.945841150Z" level=info msg="CreateContainer within sandbox \"307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5b714c7bd508d2ba626f2810867e5a8a2b44f6d43b8bff32fb91ac6f9b3a1016\"" Sep 5 23:54:51.948428 containerd[2021]: time="2025-09-05T23:54:51.948365218Z" level=info msg="StartContainer for \"5b714c7bd508d2ba626f2810867e5a8a2b44f6d43b8bff32fb91ac6f9b3a1016\"" Sep 5 23:54:52.030931 systemd-networkd[1930]: cali869289ca11c: Gained IPv6LL Sep 5 23:54:52.039589 containerd[2021]: time="2025-09-05T23:54:52.039477078Z" level=info msg="StartContainer for \"dc3b89d1ec244be147b2b70fc3ba005bffb883bfa9cb3862edf39c83335b8b89\" returns successfully" Sep 5 23:54:52.043600 systemd[1]: Started cri-containerd-5b714c7bd508d2ba626f2810867e5a8a2b44f6d43b8bff32fb91ac6f9b3a1016.scope - libcontainer container 5b714c7bd508d2ba626f2810867e5a8a2b44f6d43b8bff32fb91ac6f9b3a1016. Sep 5 23:54:52.104056 containerd[2021]: 2025-09-05 23:54:51.855 [INFO][5187] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Sep 5 23:54:52.104056 containerd[2021]: 2025-09-05 23:54:51.856 [INFO][5187] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" iface="eth0" netns="/var/run/netns/cni-c386f7e0-e934-14a2-d8cd-b8fe5d3ba5a0" Sep 5 23:54:52.104056 containerd[2021]: 2025-09-05 23:54:51.857 [INFO][5187] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" iface="eth0" netns="/var/run/netns/cni-c386f7e0-e934-14a2-d8cd-b8fe5d3ba5a0" Sep 5 23:54:52.104056 containerd[2021]: 2025-09-05 23:54:51.858 [INFO][5187] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" iface="eth0" netns="/var/run/netns/cni-c386f7e0-e934-14a2-d8cd-b8fe5d3ba5a0" Sep 5 23:54:52.104056 containerd[2021]: 2025-09-05 23:54:51.858 [INFO][5187] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Sep 5 23:54:52.104056 containerd[2021]: 2025-09-05 23:54:51.858 [INFO][5187] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Sep 5 23:54:52.104056 containerd[2021]: 2025-09-05 23:54:52.021 [INFO][5220] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" HandleID="k8s-pod-network.f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Workload="ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0" Sep 5 23:54:52.104056 containerd[2021]: 2025-09-05 23:54:52.021 [INFO][5220] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:52.104056 containerd[2021]: 2025-09-05 23:54:52.022 [INFO][5220] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:52.104056 containerd[2021]: 2025-09-05 23:54:52.066 [WARNING][5220] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" HandleID="k8s-pod-network.f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Workload="ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0" Sep 5 23:54:52.104056 containerd[2021]: 2025-09-05 23:54:52.066 [INFO][5220] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" HandleID="k8s-pod-network.f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Workload="ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0" Sep 5 23:54:52.104056 containerd[2021]: 2025-09-05 23:54:52.080 [INFO][5220] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:52.104056 containerd[2021]: 2025-09-05 23:54:52.090 [INFO][5187] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Sep 5 23:54:52.107983 containerd[2021]: time="2025-09-05T23:54:52.104487354Z" level=info msg="TearDown network for sandbox \"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\" successfully" Sep 5 23:54:52.107983 containerd[2021]: time="2025-09-05T23:54:52.104647110Z" level=info msg="StopPodSandbox for \"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\" returns successfully" Sep 5 23:54:52.110767 containerd[2021]: time="2025-09-05T23:54:52.110711706Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-s9vkn,Uid:7eabc093-0edd-4719-902f-c28a617adb0c,Namespace:calico-system,Attempt:1,}" Sep 5 23:54:52.157561 containerd[2021]: time="2025-09-05T23:54:52.156143455Z" level=info msg="StartContainer for \"5b714c7bd508d2ba626f2810867e5a8a2b44f6d43b8bff32fb91ac6f9b3a1016\" returns successfully" Sep 5 23:54:52.163067 containerd[2021]: 2025-09-05 23:54:51.936 [INFO][5183] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Sep 5 23:54:52.163067 containerd[2021]: 2025-09-05 23:54:51.936 [INFO][5183] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" iface="eth0" netns="/var/run/netns/cni-c6f9b71d-20f9-16e1-1312-10b6f2c47513" Sep 5 23:54:52.163067 containerd[2021]: 2025-09-05 23:54:51.937 [INFO][5183] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" iface="eth0" netns="/var/run/netns/cni-c6f9b71d-20f9-16e1-1312-10b6f2c47513" Sep 5 23:54:52.163067 containerd[2021]: 2025-09-05 23:54:51.941 [INFO][5183] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" iface="eth0" netns="/var/run/netns/cni-c6f9b71d-20f9-16e1-1312-10b6f2c47513" Sep 5 23:54:52.163067 containerd[2021]: 2025-09-05 23:54:51.943 [INFO][5183] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Sep 5 23:54:52.163067 containerd[2021]: 2025-09-05 23:54:51.944 [INFO][5183] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Sep 5 23:54:52.163067 containerd[2021]: 2025-09-05 23:54:52.118 [INFO][5229] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" HandleID="k8s-pod-network.95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Workload="ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0" Sep 5 23:54:52.163067 containerd[2021]: 2025-09-05 23:54:52.118 [INFO][5229] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:52.163067 containerd[2021]: 2025-09-05 23:54:52.118 [INFO][5229] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:52.163067 containerd[2021]: 2025-09-05 23:54:52.142 [WARNING][5229] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" HandleID="k8s-pod-network.95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Workload="ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0" Sep 5 23:54:52.163067 containerd[2021]: 2025-09-05 23:54:52.143 [INFO][5229] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" HandleID="k8s-pod-network.95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Workload="ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0" Sep 5 23:54:52.163067 containerd[2021]: 2025-09-05 23:54:52.145 [INFO][5229] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:52.163067 containerd[2021]: 2025-09-05 23:54:52.154 [INFO][5183] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Sep 5 23:54:52.167651 containerd[2021]: time="2025-09-05T23:54:52.167122231Z" level=info msg="TearDown network for sandbox \"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\" successfully" Sep 5 23:54:52.167651 containerd[2021]: time="2025-09-05T23:54:52.167190895Z" level=info msg="StopPodSandbox for \"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\" returns successfully" Sep 5 23:54:52.175443 containerd[2021]: time="2025-09-05T23:54:52.175351123Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xr256,Uid:c05df7ab-c8cc-4af6-b78b-e2da00b65212,Namespace:calico-system,Attempt:1,}" Sep 5 23:54:52.353715 kubelet[3339]: I0905 23:54:52.351429 3339 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-745fdbc6d7-rz6db" podStartSLOduration=2.008826853 podStartE2EDuration="7.35140592s" podCreationTimestamp="2025-09-05 23:54:45 +0000 UTC" firstStartedPulling="2025-09-05 23:54:46.224099845 +0000 UTC m=+49.805162456" lastFinishedPulling="2025-09-05 23:54:51.566678912 +0000 UTC m=+55.147741523" observedRunningTime="2025-09-05 23:54:52.263306011 +0000 UTC m=+55.844368646" watchObservedRunningTime="2025-09-05 23:54:52.35140592 +0000 UTC m=+55.932468543" Sep 5 23:54:52.641248 systemd-networkd[1930]: calib54d883f6ab: Link UP Sep 5 23:54:52.649400 systemd-networkd[1930]: calib54d883f6ab: Gained carrier Sep 5 23:54:52.679291 containerd[2021]: time="2025-09-05T23:54:52.679123569Z" level=info msg="StopPodSandbox for \"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\"" Sep 5 23:54:52.689913 containerd[2021]: time="2025-09-05T23:54:52.689689809Z" level=info msg="StopPodSandbox for \"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\"" Sep 5 23:54:52.690828 systemd[1]: run-netns-cni\x2dc6f9b71d\x2d20f9\x2d16e1\x2d1312\x2d10b6f2c47513.mount: Deactivated successfully. Sep 5 23:54:52.691066 systemd[1]: run-netns-cni\x2dc386f7e0\x2de934\x2d14a2\x2dd8cd\x2db8fe5d3ba5a0.mount: Deactivated successfully. 
Sep 5 23:54:52.755785 kubelet[3339]: I0905 23:54:52.755273 3339 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-6vhqh" podStartSLOduration=50.755248378 podStartE2EDuration="50.755248378s" podCreationTimestamp="2025-09-05 23:54:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-05 23:54:52.359784032 +0000 UTC m=+55.940846655" watchObservedRunningTime="2025-09-05 23:54:52.755248378 +0000 UTC m=+56.336311001" Sep 5 23:54:52.765328 containerd[2021]: 2025-09-05 23:54:52.384 [INFO][5278] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0 goldmane-7988f88666- calico-system 7eabc093-0edd-4719-902f-c28a617adb0c 941 0 2025-09-05 23:54:28 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ip-172-31-18-129 goldmane-7988f88666-s9vkn eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calib54d883f6ab [] [] }} ContainerID="8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52" Namespace="calico-system" Pod="goldmane-7988f88666-s9vkn" WorkloadEndpoint="ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-" Sep 5 23:54:52.765328 containerd[2021]: 2025-09-05 23:54:52.386 [INFO][5278] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52" Namespace="calico-system" Pod="goldmane-7988f88666-s9vkn" WorkloadEndpoint="ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0" Sep 5 23:54:52.765328 containerd[2021]: 2025-09-05 23:54:52.506 [INFO][5306] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52" HandleID="k8s-pod-network.8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52" Workload="ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0" Sep 5 23:54:52.765328 containerd[2021]: 2025-09-05 23:54:52.508 [INFO][5306] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52" HandleID="k8s-pod-network.8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52" Workload="ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3790), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-18-129", "pod":"goldmane-7988f88666-s9vkn", "timestamp":"2025-09-05 23:54:52.506898476 +0000 UTC"}, Hostname:"ip-172-31-18-129", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 23:54:52.765328 containerd[2021]: 2025-09-05 23:54:52.508 [INFO][5306] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:52.765328 containerd[2021]: 2025-09-05 23:54:52.508 [INFO][5306] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 5 23:54:52.765328 containerd[2021]: 2025-09-05 23:54:52.508 [INFO][5306] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-18-129' Sep 5 23:54:52.765328 containerd[2021]: 2025-09-05 23:54:52.528 [INFO][5306] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52" host="ip-172-31-18-129" Sep 5 23:54:52.765328 containerd[2021]: 2025-09-05 23:54:52.540 [INFO][5306] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-18-129" Sep 5 23:54:52.765328 containerd[2021]: 2025-09-05 23:54:52.552 [INFO][5306] ipam/ipam.go 511: Trying affinity for 192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:52.765328 containerd[2021]: 2025-09-05 23:54:52.561 [INFO][5306] ipam/ipam.go 158: Attempting to load block cidr=192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:52.765328 containerd[2021]: 2025-09-05 23:54:52.566 [INFO][5306] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:52.765328 containerd[2021]: 2025-09-05 23:54:52.567 [INFO][5306] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.105.192/26 handle="k8s-pod-network.8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52" host="ip-172-31-18-129" Sep 5 23:54:52.765328 containerd[2021]: 2025-09-05 23:54:52.569 [INFO][5306] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52 Sep 5 23:54:52.765328 containerd[2021]: 2025-09-05 23:54:52.585 [INFO][5306] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.105.192/26 handle="k8s-pod-network.8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52" host="ip-172-31-18-129" Sep 5 23:54:52.765328 containerd[2021]: 2025-09-05 23:54:52.610 [INFO][5306] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.105.196/26] block=192.168.105.192/26 handle="k8s-pod-network.8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52" host="ip-172-31-18-129" Sep 5 23:54:52.765328 containerd[2021]: 2025-09-05 23:54:52.610 [INFO][5306] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.105.196/26] handle="k8s-pod-network.8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52" host="ip-172-31-18-129" Sep 5 23:54:52.765328 containerd[2021]: 2025-09-05 23:54:52.611 [INFO][5306] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 5 23:54:52.765328 containerd[2021]: 2025-09-05 23:54:52.612 [INFO][5306] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.196/26] IPv6=[] ContainerID="8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52" HandleID="k8s-pod-network.8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52" Workload="ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0" Sep 5 23:54:52.769024 containerd[2021]: 2025-09-05 23:54:52.617 [INFO][5278] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52" Namespace="calico-system" Pod="goldmane-7988f88666-s9vkn" WorkloadEndpoint="ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"7eabc093-0edd-4719-902f-c28a617adb0c", ResourceVersion:"941", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"", Pod:"goldmane-7988f88666-s9vkn", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.105.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib54d883f6ab", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:52.769024 containerd[2021]: 2025-09-05 23:54:52.618 [INFO][5278] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.105.196/32] ContainerID="8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52" Namespace="calico-system" Pod="goldmane-7988f88666-s9vkn" WorkloadEndpoint="ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0" Sep 5 23:54:52.769024 containerd[2021]: 2025-09-05 23:54:52.618 [INFO][5278] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib54d883f6ab ContainerID="8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52" Namespace="calico-system" Pod="goldmane-7988f88666-s9vkn" WorkloadEndpoint="ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0" Sep 5 23:54:52.769024 containerd[2021]: 2025-09-05 23:54:52.652 [INFO][5278] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52" Namespace="calico-system" Pod="goldmane-7988f88666-s9vkn" WorkloadEndpoint="ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0" Sep 5 23:54:52.769024 containerd[2021]: 2025-09-05 23:54:52.673 [INFO][5278] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52" Namespace="calico-system" Pod="goldmane-7988f88666-s9vkn" 
WorkloadEndpoint="ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"7eabc093-0edd-4719-902f-c28a617adb0c", ResourceVersion:"941", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52", Pod:"goldmane-7988f88666-s9vkn", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.105.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib54d883f6ab", MAC:"fe:81:60:47:1c:98", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:52.769024 containerd[2021]: 2025-09-05 23:54:52.750 [INFO][5278] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52" Namespace="calico-system" Pod="goldmane-7988f88666-s9vkn" WorkloadEndpoint="ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0" Sep 5 23:54:52.862136 systemd-networkd[1930]: cali0a6a1f850df: Gained IPv6LL Sep 5 23:54:52.908444 systemd-networkd[1930]: cali547c28ba3fc: Link UP Sep 5 23:54:52.927711 systemd-networkd[1930]: cali547c28ba3fc: Gained carrier Sep 5 23:54:53.011660 containerd[2021]: time="2025-09-05T23:54:53.011477851Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:54:53.014768 containerd[2021]: time="2025-09-05T23:54:53.011612335Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:54:53.014768 containerd[2021]: time="2025-09-05T23:54:53.011639935Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:53.014768 containerd[2021]: time="2025-09-05T23:54:53.013366591Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:53.125252 systemd[1]: Started cri-containerd-8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52.scope - libcontainer container 8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52. 
Sep 5 23:54:53.138608 containerd[2021]: 2025-09-05 23:54:52.389 [INFO][5287] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0 csi-node-driver- calico-system c05df7ab-c8cc-4af6-b78b-e2da00b65212 942 0 2025-09-05 23:54:27 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ip-172-31-18-129 csi-node-driver-xr256 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali547c28ba3fc [] [] }} ContainerID="fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083" Namespace="calico-system" Pod="csi-node-driver-xr256" WorkloadEndpoint="ip--172--31--18--129-k8s-csi--node--driver--xr256-" Sep 5 23:54:53.138608 containerd[2021]: 2025-09-05 23:54:52.390 [INFO][5287] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083" Namespace="calico-system" Pod="csi-node-driver-xr256" WorkloadEndpoint="ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0" Sep 5 23:54:53.138608 containerd[2021]: 2025-09-05 23:54:52.507 [INFO][5308] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083" HandleID="k8s-pod-network.fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083" Workload="ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0" Sep 5 23:54:53.138608 containerd[2021]: 2025-09-05 23:54:52.509 [INFO][5308] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083" HandleID="k8s-pod-network.fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083" Workload="ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003101e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-18-129", "pod":"csi-node-driver-xr256", "timestamp":"2025-09-05 23:54:52.507583904 +0000 UTC"}, Hostname:"ip-172-31-18-129", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 23:54:53.138608 containerd[2021]: 2025-09-05 23:54:52.509 [INFO][5308] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:53.138608 containerd[2021]: 2025-09-05 23:54:52.611 [INFO][5308] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 5 23:54:53.138608 containerd[2021]: 2025-09-05 23:54:52.611 [INFO][5308] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-18-129' Sep 5 23:54:53.138608 containerd[2021]: 2025-09-05 23:54:52.664 [INFO][5308] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083" host="ip-172-31-18-129" Sep 5 23:54:53.138608 containerd[2021]: 2025-09-05 23:54:52.743 [INFO][5308] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-18-129" Sep 5 23:54:53.138608 containerd[2021]: 2025-09-05 23:54:52.771 [INFO][5308] ipam/ipam.go 511: Trying affinity for 192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:53.138608 containerd[2021]: 2025-09-05 23:54:52.778 [INFO][5308] ipam/ipam.go 158: Attempting to load block cidr=192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:53.138608 containerd[2021]: 2025-09-05 23:54:52.788 [INFO][5308] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:53.138608 containerd[2021]: 2025-09-05 23:54:52.790 [INFO][5308] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.105.192/26 handle="k8s-pod-network.fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083" host="ip-172-31-18-129" Sep 5 23:54:53.138608 containerd[2021]: 2025-09-05 23:54:52.802 [INFO][5308] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083 Sep 5 23:54:53.138608 containerd[2021]: 2025-09-05 23:54:52.817 [INFO][5308] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.105.192/26 handle="k8s-pod-network.fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083" host="ip-172-31-18-129" Sep 5 23:54:53.138608 containerd[2021]: 2025-09-05 23:54:52.840 [INFO][5308] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.105.197/26] block=192.168.105.192/26 handle="k8s-pod-network.fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083" host="ip-172-31-18-129" Sep 5 23:54:53.138608 containerd[2021]: 2025-09-05 23:54:52.840 [INFO][5308] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.105.197/26] handle="k8s-pod-network.fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083" host="ip-172-31-18-129" Sep 5 23:54:53.138608 containerd[2021]: 2025-09-05 23:54:52.840 [INFO][5308] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
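With csi-node-driver-xr256 the node has now drawn four consecutive addresses, .194 through .197, from the single affine block 192.168.105.192/26: 64 addresses this host can hand out before it has to claim another block. A self-contained containment check with Go's net/netip, with the prefix and addresses copied from the log:

    package main

    import (
        "fmt"
        "net/netip"
    )

    func main() {
        // The /26 block this node holds affinity for, per the IPAM logs.
        block := netip.MustParsePrefix("192.168.105.192/26")
        for _, s := range []string{
            "192.168.105.194", // calico-apiserver-5f77cc885-km8wb
            "192.168.105.195", // coredns-7c65d6cfc9-6vhqh
            "192.168.105.196", // goldmane-7988f88666-s9vkn
            "192.168.105.197", // csi-node-driver-xr256
        } {
            ip := netip.MustParseAddr(s)
            fmt.Printf("%s in %s: %v\n", ip, block, block.Contains(ip))
        }
        fmt.Println("addresses per block:", 1<<(32-block.Bits())) // 64
    }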
Sep 5 23:54:53.138608 containerd[2021]: 2025-09-05 23:54:52.840 [INFO][5308] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.197/26] IPv6=[] ContainerID="fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083" HandleID="k8s-pod-network.fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083" Workload="ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0" Sep 5 23:54:53.139843 containerd[2021]: 2025-09-05 23:54:52.875 [INFO][5287] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083" Namespace="calico-system" Pod="csi-node-driver-xr256" WorkloadEndpoint="ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"c05df7ab-c8cc-4af6-b78b-e2da00b65212", ResourceVersion:"942", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"", Pod:"csi-node-driver-xr256", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.105.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali547c28ba3fc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:53.139843 containerd[2021]: 2025-09-05 23:54:52.881 [INFO][5287] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.105.197/32] ContainerID="fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083" Namespace="calico-system" Pod="csi-node-driver-xr256" WorkloadEndpoint="ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0" Sep 5 23:54:53.139843 containerd[2021]: 2025-09-05 23:54:52.881 [INFO][5287] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali547c28ba3fc ContainerID="fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083" Namespace="calico-system" Pod="csi-node-driver-xr256" WorkloadEndpoint="ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0" Sep 5 23:54:53.139843 containerd[2021]: 2025-09-05 23:54:52.934 [INFO][5287] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083" Namespace="calico-system" Pod="csi-node-driver-xr256" WorkloadEndpoint="ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0" Sep 5 23:54:53.139843 containerd[2021]: 2025-09-05 23:54:52.993 [INFO][5287] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083" 
Namespace="calico-system" Pod="csi-node-driver-xr256" WorkloadEndpoint="ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"c05df7ab-c8cc-4af6-b78b-e2da00b65212", ResourceVersion:"942", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083", Pod:"csi-node-driver-xr256", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.105.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali547c28ba3fc", MAC:"de:08:d0:fe:25:8f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:53.139843 containerd[2021]: 2025-09-05 23:54:53.114 [INFO][5287] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083" Namespace="calico-system" Pod="csi-node-driver-xr256" WorkloadEndpoint="ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0" Sep 5 23:54:53.352899 containerd[2021]: time="2025-09-05T23:54:53.350741181Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:54:53.352899 containerd[2021]: time="2025-09-05T23:54:53.350843193Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:54:53.352899 containerd[2021]: time="2025-09-05T23:54:53.350869149Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:53.352899 containerd[2021]: time="2025-09-05T23:54:53.351003261Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:53.495681 containerd[2021]: 2025-09-05 23:54:53.174 [INFO][5351] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Sep 5 23:54:53.495681 containerd[2021]: 2025-09-05 23:54:53.174 [INFO][5351] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" iface="eth0" netns="/var/run/netns/cni-6993a044-dad0-ac07-ac31-6a3de2fc93d8" Sep 5 23:54:53.495681 containerd[2021]: 2025-09-05 23:54:53.178 [INFO][5351] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" iface="eth0" netns="/var/run/netns/cni-6993a044-dad0-ac07-ac31-6a3de2fc93d8" Sep 5 23:54:53.495681 containerd[2021]: 2025-09-05 23:54:53.182 [INFO][5351] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" iface="eth0" netns="/var/run/netns/cni-6993a044-dad0-ac07-ac31-6a3de2fc93d8" Sep 5 23:54:53.495681 containerd[2021]: 2025-09-05 23:54:53.182 [INFO][5351] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Sep 5 23:54:53.495681 containerd[2021]: 2025-09-05 23:54:53.182 [INFO][5351] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Sep 5 23:54:53.495681 containerd[2021]: 2025-09-05 23:54:53.419 [INFO][5414] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" HandleID="k8s-pod-network.b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0" Sep 5 23:54:53.495681 containerd[2021]: 2025-09-05 23:54:53.420 [INFO][5414] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:53.495681 containerd[2021]: 2025-09-05 23:54:53.420 [INFO][5414] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:53.495681 containerd[2021]: 2025-09-05 23:54:53.455 [WARNING][5414] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" HandleID="k8s-pod-network.b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0" Sep 5 23:54:53.495681 containerd[2021]: 2025-09-05 23:54:53.455 [INFO][5414] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" HandleID="k8s-pod-network.b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0" Sep 5 23:54:53.495681 containerd[2021]: 2025-09-05 23:54:53.461 [INFO][5414] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:53.495681 containerd[2021]: 2025-09-05 23:54:53.476 [INFO][5351] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Sep 5 23:54:53.509441 systemd[1]: run-netns-cni\x2d6993a044\x2ddad0\x2dac07\x2dac31\x2d6a3de2fc93d8.mount: Deactivated successfully. 
Sep 5 23:54:53.527811 containerd[2021]: time="2025-09-05T23:54:53.527710485Z" level=info msg="TearDown network for sandbox \"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\" successfully" Sep 5 23:54:53.527811 containerd[2021]: time="2025-09-05T23:54:53.527795805Z" level=info msg="StopPodSandbox for \"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\" returns successfully" Sep 5 23:54:53.530290 containerd[2021]: time="2025-09-05T23:54:53.530237025Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f77cc885-9q8qs,Uid:ad610cb3-9cf1-433c-bfe6-870f9da7a8a7,Namespace:calico-apiserver,Attempt:1,}" Sep 5 23:54:53.531128 systemd[1]: Started cri-containerd-fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083.scope - libcontainer container fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083. Sep 5 23:54:53.607243 containerd[2021]: 2025-09-05 23:54:53.191 [INFO][5347] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Sep 5 23:54:53.607243 containerd[2021]: 2025-09-05 23:54:53.191 [INFO][5347] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" iface="eth0" netns="/var/run/netns/cni-a792c662-5675-7529-3c57-c6ebd3aec74b" Sep 5 23:54:53.607243 containerd[2021]: 2025-09-05 23:54:53.192 [INFO][5347] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" iface="eth0" netns="/var/run/netns/cni-a792c662-5675-7529-3c57-c6ebd3aec74b" Sep 5 23:54:53.607243 containerd[2021]: 2025-09-05 23:54:53.196 [INFO][5347] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" iface="eth0" netns="/var/run/netns/cni-a792c662-5675-7529-3c57-c6ebd3aec74b" Sep 5 23:54:53.607243 containerd[2021]: 2025-09-05 23:54:53.196 [INFO][5347] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Sep 5 23:54:53.607243 containerd[2021]: 2025-09-05 23:54:53.196 [INFO][5347] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Sep 5 23:54:53.607243 containerd[2021]: 2025-09-05 23:54:53.488 [INFO][5416] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" HandleID="k8s-pod-network.3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Workload="ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0" Sep 5 23:54:53.607243 containerd[2021]: 2025-09-05 23:54:53.488 [INFO][5416] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:53.607243 containerd[2021]: 2025-09-05 23:54:53.488 [INFO][5416] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:53.607243 containerd[2021]: 2025-09-05 23:54:53.554 [WARNING][5416] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" HandleID="k8s-pod-network.3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Workload="ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0" Sep 5 23:54:53.607243 containerd[2021]: 2025-09-05 23:54:53.554 [INFO][5416] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" HandleID="k8s-pod-network.3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Workload="ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0" Sep 5 23:54:53.607243 containerd[2021]: 2025-09-05 23:54:53.560 [INFO][5416] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:53.607243 containerd[2021]: 2025-09-05 23:54:53.576 [INFO][5347] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Sep 5 23:54:53.614819 containerd[2021]: time="2025-09-05T23:54:53.612511642Z" level=info msg="TearDown network for sandbox \"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\" successfully" Sep 5 23:54:53.614819 containerd[2021]: time="2025-09-05T23:54:53.612606754Z" level=info msg="StopPodSandbox for \"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\" returns successfully" Sep 5 23:54:53.616728 containerd[2021]: time="2025-09-05T23:54:53.616465498Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5f9f4498b8-wxd6q,Uid:a4bb29ad-8766-47fe-9303-a89374119066,Namespace:calico-system,Attempt:1,}" Sep 5 23:54:53.671700 containerd[2021]: time="2025-09-05T23:54:53.670615630Z" level=info msg="StopPodSandbox for \"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\"" Sep 5 23:54:53.672274 systemd[1]: run-netns-cni\x2da792c662\x2d5675\x2d7529\x2d3c57\x2dc6ebd3aec74b.mount: Deactivated successfully. 
Sep 5 23:54:53.695714 systemd-networkd[1930]: calib54d883f6ab: Gained IPv6LL Sep 5 23:54:53.708474 containerd[2021]: time="2025-09-05T23:54:53.707870194Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-s9vkn,Uid:7eabc093-0edd-4719-902f-c28a617adb0c,Namespace:calico-system,Attempt:1,} returns sandbox id \"8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52\"" Sep 5 23:54:53.933342 containerd[2021]: time="2025-09-05T23:54:53.933195443Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xr256,Uid:c05df7ab-c8cc-4af6-b78b-e2da00b65212,Namespace:calico-system,Attempt:1,} returns sandbox id \"fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083\"" Sep 5 23:54:54.407428 systemd-networkd[1930]: caliae625d2b736: Link UP Sep 5 23:54:54.409426 systemd-networkd[1930]: caliae625d2b736: Gained carrier Sep 5 23:54:54.476487 containerd[2021]: 2025-09-05 23:54:53.981 [INFO][5464] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0 calico-apiserver-5f77cc885- calico-apiserver ad610cb3-9cf1-433c-bfe6-870f9da7a8a7 965 0 2025-09-05 23:54:16 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5f77cc885 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-18-129 calico-apiserver-5f77cc885-9q8qs eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] caliae625d2b736 [] [] }} ContainerID="76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4" Namespace="calico-apiserver" Pod="calico-apiserver-5f77cc885-9q8qs" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-" Sep 5 23:54:54.476487 containerd[2021]: 2025-09-05 23:54:53.982 [INFO][5464] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4" Namespace="calico-apiserver" Pod="calico-apiserver-5f77cc885-9q8qs" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0" Sep 5 23:54:54.476487 containerd[2021]: 2025-09-05 23:54:54.179 [INFO][5526] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4" HandleID="k8s-pod-network.76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0" Sep 5 23:54:54.476487 containerd[2021]: 2025-09-05 23:54:54.180 [INFO][5526] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4" HandleID="k8s-pod-network.76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400031a410), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-18-129", "pod":"calico-apiserver-5f77cc885-9q8qs", "timestamp":"2025-09-05 23:54:54.179774481 +0000 UTC"}, Hostname:"ip-172-31-18-129", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 23:54:54.476487 
containerd[2021]: 2025-09-05 23:54:54.180 [INFO][5526] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:54.476487 containerd[2021]: 2025-09-05 23:54:54.180 [INFO][5526] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:54.476487 containerd[2021]: 2025-09-05 23:54:54.180 [INFO][5526] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-18-129' Sep 5 23:54:54.476487 containerd[2021]: 2025-09-05 23:54:54.234 [INFO][5526] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4" host="ip-172-31-18-129" Sep 5 23:54:54.476487 containerd[2021]: 2025-09-05 23:54:54.260 [INFO][5526] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-18-129" Sep 5 23:54:54.476487 containerd[2021]: 2025-09-05 23:54:54.292 [INFO][5526] ipam/ipam.go 511: Trying affinity for 192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:54.476487 containerd[2021]: 2025-09-05 23:54:54.298 [INFO][5526] ipam/ipam.go 158: Attempting to load block cidr=192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:54.476487 containerd[2021]: 2025-09-05 23:54:54.317 [INFO][5526] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:54.476487 containerd[2021]: 2025-09-05 23:54:54.317 [INFO][5526] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.105.192/26 handle="k8s-pod-network.76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4" host="ip-172-31-18-129" Sep 5 23:54:54.476487 containerd[2021]: 2025-09-05 23:54:54.326 [INFO][5526] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4 Sep 5 23:54:54.476487 containerd[2021]: 2025-09-05 23:54:54.348 [INFO][5526] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.105.192/26 handle="k8s-pod-network.76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4" host="ip-172-31-18-129" Sep 5 23:54:54.476487 containerd[2021]: 2025-09-05 23:54:54.371 [INFO][5526] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.105.198/26] block=192.168.105.192/26 handle="k8s-pod-network.76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4" host="ip-172-31-18-129" Sep 5 23:54:54.476487 containerd[2021]: 2025-09-05 23:54:54.371 [INFO][5526] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.105.198/26] handle="k8s-pod-network.76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4" host="ip-172-31-18-129" Sep 5 23:54:54.476487 containerd[2021]: 2025-09-05 23:54:54.372 [INFO][5526] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
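This [5526] sequence is the happy path of block-based IPAM, serialized under the host-wide lock: confirm the node's affinity for the 192.168.105.192/26 block, load the block, take the first free address (192.168.105.198, since .192 through .197 are already bound), then write the block back to persist the claim before releasing the lock. An illustrative sketch of that pattern (not Calico's actual code):

    package main

    import (
        "fmt"
        "net/netip"
    )

    type block struct {
        cidr netip.Prefix        // the host-affine block, e.g. 192.168.105.192/26
        used map[netip.Addr]bool // addresses already bound to a handle
    }

    // assign hands out the first unused address in the block; the caller
    // then persists the block ("Writing block in order to claim IPs").
    func (b *block) assign() (netip.Addr, bool) {
        for a := b.cidr.Addr(); b.cidr.Contains(a); a = a.Next() {
            if !b.used[a] {
                b.used[a] = true
                return a, true
            }
        }
        return netip.Addr{}, false // block exhausted; IPAM would try another block
    }

    func main() {
        b := &block{cidr: netip.MustParsePrefix("192.168.105.192/26"), used: map[netip.Addr]bool{}}
        for a := b.cidr.Addr(); a.Compare(netip.MustParseAddr("192.168.105.198")) < 0; a = a.Next() {
            b.used[a] = true // .192-.197 are already claimed on ip-172-31-18-129
        }
        a, _ := b.assign()
        fmt.Println(a) // 192.168.105.198, the address claimed in the log
    }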
Sep 5 23:54:54.476487 containerd[2021]: 2025-09-05 23:54:54.373 [INFO][5526] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.198/26] IPv6=[] ContainerID="76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4" HandleID="k8s-pod-network.76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0" Sep 5 23:54:54.480453 containerd[2021]: 2025-09-05 23:54:54.385 [INFO][5464] cni-plugin/k8s.go 418: Populated endpoint ContainerID="76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4" Namespace="calico-apiserver" Pod="calico-apiserver-5f77cc885-9q8qs" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0", GenerateName:"calico-apiserver-5f77cc885-", Namespace:"calico-apiserver", SelfLink:"", UID:"ad610cb3-9cf1-433c-bfe6-870f9da7a8a7", ResourceVersion:"965", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f77cc885", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"", Pod:"calico-apiserver-5f77cc885-9q8qs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.105.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliae625d2b736", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:54.480453 containerd[2021]: 2025-09-05 23:54:54.386 [INFO][5464] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.105.198/32] ContainerID="76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4" Namespace="calico-apiserver" Pod="calico-apiserver-5f77cc885-9q8qs" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0" Sep 5 23:54:54.480453 containerd[2021]: 2025-09-05 23:54:54.386 [INFO][5464] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliae625d2b736 ContainerID="76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4" Namespace="calico-apiserver" Pod="calico-apiserver-5f77cc885-9q8qs" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0" Sep 5 23:54:54.480453 containerd[2021]: 2025-09-05 23:54:54.411 [INFO][5464] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4" Namespace="calico-apiserver" Pod="calico-apiserver-5f77cc885-9q8qs" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0" Sep 5 23:54:54.480453 containerd[2021]: 2025-09-05 23:54:54.413 [INFO][5464] cni-plugin/k8s.go 446: Added Mac, interface name, and 
active container ID to endpoint ContainerID="76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4" Namespace="calico-apiserver" Pod="calico-apiserver-5f77cc885-9q8qs" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0", GenerateName:"calico-apiserver-5f77cc885-", Namespace:"calico-apiserver", SelfLink:"", UID:"ad610cb3-9cf1-433c-bfe6-870f9da7a8a7", ResourceVersion:"965", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f77cc885", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4", Pod:"calico-apiserver-5f77cc885-9q8qs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.105.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliae625d2b736", MAC:"3a:fa:23:19:9c:49", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:54.480453 containerd[2021]: 2025-09-05 23:54:54.459 [INFO][5464] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4" Namespace="calico-apiserver" Pod="calico-apiserver-5f77cc885-9q8qs" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0" Sep 5 23:54:54.587925 systemd-networkd[1930]: cali408802bc6df: Link UP Sep 5 23:54:54.592243 systemd-networkd[1930]: cali408802bc6df: Gained carrier Sep 5 23:54:54.636917 containerd[2021]: time="2025-09-05T23:54:54.635955107Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:54:54.636917 containerd[2021]: time="2025-09-05T23:54:54.636066539Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:54:54.636917 containerd[2021]: time="2025-09-05T23:54:54.636103571Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:54.636917 containerd[2021]: time="2025-09-05T23:54:54.636262127Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:54.653015 containerd[2021]: 2025-09-05 23:54:54.083 [INFO][5491] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Sep 5 23:54:54.653015 containerd[2021]: 2025-09-05 23:54:54.084 [INFO][5491] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" iface="eth0" netns="/var/run/netns/cni-13ee6b3e-4b85-02c4-73a1-84382148c3c9" Sep 5 23:54:54.653015 containerd[2021]: 2025-09-05 23:54:54.085 [INFO][5491] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" iface="eth0" netns="/var/run/netns/cni-13ee6b3e-4b85-02c4-73a1-84382148c3c9" Sep 5 23:54:54.653015 containerd[2021]: 2025-09-05 23:54:54.086 [INFO][5491] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" iface="eth0" netns="/var/run/netns/cni-13ee6b3e-4b85-02c4-73a1-84382148c3c9" Sep 5 23:54:54.653015 containerd[2021]: 2025-09-05 23:54:54.087 [INFO][5491] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Sep 5 23:54:54.653015 containerd[2021]: 2025-09-05 23:54:54.087 [INFO][5491] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Sep 5 23:54:54.653015 containerd[2021]: 2025-09-05 23:54:54.327 [INFO][5531] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" HandleID="k8s-pod-network.8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0" Sep 5 23:54:54.653015 containerd[2021]: 2025-09-05 23:54:54.332 [INFO][5531] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:54.653015 containerd[2021]: 2025-09-05 23:54:54.541 [INFO][5531] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:54.653015 containerd[2021]: 2025-09-05 23:54:54.616 [WARNING][5531] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" HandleID="k8s-pod-network.8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0" Sep 5 23:54:54.653015 containerd[2021]: 2025-09-05 23:54:54.616 [INFO][5531] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" HandleID="k8s-pod-network.8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0" Sep 5 23:54:54.653015 containerd[2021]: 2025-09-05 23:54:54.626 [INFO][5531] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:54.653015 containerd[2021]: 2025-09-05 23:54:54.637 [INFO][5491] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Sep 5 23:54:54.671278 systemd[1]: run-netns-cni\x2d13ee6b3e\x2d4b85\x2d02c4\x2d73a1\x2d84382148c3c9.mount: Deactivated successfully. 
Sep 5 23:54:54.687033 containerd[2021]: time="2025-09-05T23:54:54.686920547Z" level=info msg="TearDown network for sandbox \"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\" successfully" Sep 5 23:54:54.688239 containerd[2021]: time="2025-09-05T23:54:54.687019595Z" level=info msg="StopPodSandbox for \"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\" returns successfully" Sep 5 23:54:54.690784 containerd[2021]: time="2025-09-05T23:54:54.689512427Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-kgk25,Uid:0851a585-ee51-4d36-80e5-364195a5c349,Namespace:kube-system,Attempt:1,}" Sep 5 23:54:54.698556 containerd[2021]: 2025-09-05 23:54:54.072 [INFO][5495] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0 calico-kube-controllers-5f9f4498b8- calico-system a4bb29ad-8766-47fe-9303-a89374119066 966 0 2025-09-05 23:54:27 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5f9f4498b8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ip-172-31-18-129 calico-kube-controllers-5f9f4498b8-wxd6q eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali408802bc6df [] [] }} ContainerID="88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56" Namespace="calico-system" Pod="calico-kube-controllers-5f9f4498b8-wxd6q" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-" Sep 5 23:54:54.698556 containerd[2021]: 2025-09-05 23:54:54.072 [INFO][5495] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56" Namespace="calico-system" Pod="calico-kube-controllers-5f9f4498b8-wxd6q" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0" Sep 5 23:54:54.698556 containerd[2021]: 2025-09-05 23:54:54.292 [INFO][5533] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56" HandleID="k8s-pod-network.88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56" Workload="ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0" Sep 5 23:54:54.698556 containerd[2021]: 2025-09-05 23:54:54.296 [INFO][5533] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56" HandleID="k8s-pod-network.88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56" Workload="ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024a470), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-18-129", "pod":"calico-kube-controllers-5f9f4498b8-wxd6q", "timestamp":"2025-09-05 23:54:54.292684977 +0000 UTC"}, Hostname:"ip-172-31-18-129", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 23:54:54.698556 containerd[2021]: 2025-09-05 23:54:54.296 [INFO][5533] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 5 23:54:54.698556 containerd[2021]: 2025-09-05 23:54:54.372 [INFO][5533] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:54.698556 containerd[2021]: 2025-09-05 23:54:54.373 [INFO][5533] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-18-129' Sep 5 23:54:54.698556 containerd[2021]: 2025-09-05 23:54:54.411 [INFO][5533] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56" host="ip-172-31-18-129" Sep 5 23:54:54.698556 containerd[2021]: 2025-09-05 23:54:54.435 [INFO][5533] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-18-129" Sep 5 23:54:54.698556 containerd[2021]: 2025-09-05 23:54:54.470 [INFO][5533] ipam/ipam.go 511: Trying affinity for 192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:54.698556 containerd[2021]: 2025-09-05 23:54:54.487 [INFO][5533] ipam/ipam.go 158: Attempting to load block cidr=192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:54.698556 containerd[2021]: 2025-09-05 23:54:54.498 [INFO][5533] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:54.698556 containerd[2021]: 2025-09-05 23:54:54.498 [INFO][5533] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.105.192/26 handle="k8s-pod-network.88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56" host="ip-172-31-18-129" Sep 5 23:54:54.698556 containerd[2021]: 2025-09-05 23:54:54.504 [INFO][5533] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56 Sep 5 23:54:54.698556 containerd[2021]: 2025-09-05 23:54:54.515 [INFO][5533] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.105.192/26 handle="k8s-pod-network.88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56" host="ip-172-31-18-129" Sep 5 23:54:54.698556 containerd[2021]: 2025-09-05 23:54:54.539 [INFO][5533] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.105.199/26] block=192.168.105.192/26 handle="k8s-pod-network.88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56" host="ip-172-31-18-129" Sep 5 23:54:54.698556 containerd[2021]: 2025-09-05 23:54:54.540 [INFO][5533] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.105.199/26] handle="k8s-pod-network.88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56" host="ip-172-31-18-129" Sep 5 23:54:54.698556 containerd[2021]: 2025-09-05 23:54:54.543 [INFO][5533] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
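Several IPAM operations are interleaved here ([5414], [5416], [5526], [5531], [5533] all log within the same second), and each one brackets its block read-modify-write between "Acquired" and "Released host-wide IPAM lock", so two pods being networked concurrently cannot claim the same free address out of the /26. Since every CNI invocation is a separate process, the real lock has to be cross-process (a lock file, say); an in-process mutex is enough to show the shape:

    package main

    import (
        "fmt"
        "sync"
    )

    var hostWideIPAM sync.Mutex // one per node, shared by every CNI ADD/DEL

    func withHostWideLock(pod string, f func()) {
        hostWideIPAM.Lock()         // "Acquired host-wide IPAM lock."
        defer hostWideIPAM.Unlock() // "Released host-wide IPAM lock."
        f()
        fmt.Println(pod, "finished its IPAM transaction")
    }

    func main() {
        var wg sync.WaitGroup
        for _, pod := range []string{
            "calico-apiserver-5f77cc885-9q8qs",
            "calico-kube-controllers-5f9f4498b8-wxd6q",
        } {
            wg.Add(1)
            go func(p string) {
                defer wg.Done()
                withHostWideLock(p, func() { /* load block, assign IP, write block */ })
            }(pod)
        }
        wg.Wait()
    }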
Sep 5 23:54:54.698556 containerd[2021]: 2025-09-05 23:54:54.543 [INFO][5533] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.199/26] IPv6=[] ContainerID="88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56" HandleID="k8s-pod-network.88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56" Workload="ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0" Sep 5 23:54:54.699798 containerd[2021]: 2025-09-05 23:54:54.564 [INFO][5495] cni-plugin/k8s.go 418: Populated endpoint ContainerID="88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56" Namespace="calico-system" Pod="calico-kube-controllers-5f9f4498b8-wxd6q" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0", GenerateName:"calico-kube-controllers-5f9f4498b8-", Namespace:"calico-system", SelfLink:"", UID:"a4bb29ad-8766-47fe-9303-a89374119066", ResourceVersion:"966", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5f9f4498b8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"", Pod:"calico-kube-controllers-5f9f4498b8-wxd6q", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.105.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali408802bc6df", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:54.699798 containerd[2021]: 2025-09-05 23:54:54.568 [INFO][5495] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.105.199/32] ContainerID="88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56" Namespace="calico-system" Pod="calico-kube-controllers-5f9f4498b8-wxd6q" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0" Sep 5 23:54:54.699798 containerd[2021]: 2025-09-05 23:54:54.568 [INFO][5495] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali408802bc6df ContainerID="88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56" Namespace="calico-system" Pod="calico-kube-controllers-5f9f4498b8-wxd6q" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0" Sep 5 23:54:54.699798 containerd[2021]: 2025-09-05 23:54:54.598 [INFO][5495] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56" Namespace="calico-system" Pod="calico-kube-controllers-5f9f4498b8-wxd6q" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0" Sep 5 23:54:54.699798 containerd[2021]: 
2025-09-05 23:54:54.602 [INFO][5495] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56" Namespace="calico-system" Pod="calico-kube-controllers-5f9f4498b8-wxd6q" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0", GenerateName:"calico-kube-controllers-5f9f4498b8-", Namespace:"calico-system", SelfLink:"", UID:"a4bb29ad-8766-47fe-9303-a89374119066", ResourceVersion:"966", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5f9f4498b8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56", Pod:"calico-kube-controllers-5f9f4498b8-wxd6q", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.105.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali408802bc6df", MAC:"ca:18:ca:e5:c3:c0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:54.699798 containerd[2021]: 2025-09-05 23:54:54.670 [INFO][5495] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56" Namespace="calico-system" Pod="calico-kube-controllers-5f9f4498b8-wxd6q" WorkloadEndpoint="ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0" Sep 5 23:54:54.718702 systemd-networkd[1930]: cali547c28ba3fc: Gained IPv6LL Sep 5 23:54:54.777885 systemd[1]: Started cri-containerd-76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4.scope - libcontainer container 76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4. Sep 5 23:54:54.801230 systemd[1]: run-containerd-runc-k8s.io-76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4-runc.bkSjkH.mount: Deactivated successfully. Sep 5 23:54:54.902203 containerd[2021]: time="2025-09-05T23:54:54.900845004Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:54:54.902203 containerd[2021]: time="2025-09-05T23:54:54.901882992Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:54:54.902203 containerd[2021]: time="2025-09-05T23:54:54.901916856Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:54.906897 containerd[2021]: time="2025-09-05T23:54:54.906198612Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:55.026829 systemd[1]: Started cri-containerd-88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56.scope - libcontainer container 88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56. Sep 5 23:54:55.243166 containerd[2021]: time="2025-09-05T23:54:55.243106858Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f77cc885-9q8qs,Uid:ad610cb3-9cf1-433c-bfe6-870f9da7a8a7,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4\"" Sep 5 23:54:55.282920 containerd[2021]: time="2025-09-05T23:54:55.282459418Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5f9f4498b8-wxd6q,Uid:a4bb29ad-8766-47fe-9303-a89374119066,Namespace:calico-system,Attempt:1,} returns sandbox id \"88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56\"" Sep 5 23:54:55.395754 systemd-networkd[1930]: cali46d1d7c71e0: Link UP Sep 5 23:54:55.398607 systemd-networkd[1930]: cali46d1d7c71e0: Gained carrier Sep 5 23:54:55.438263 containerd[2021]: 2025-09-05 23:54:55.084 [INFO][5597] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0 coredns-7c65d6cfc9- kube-system 0851a585-ee51-4d36-80e5-364195a5c349 977 0 2025-09-05 23:54:02 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-18-129 coredns-7c65d6cfc9-kgk25 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali46d1d7c71e0 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kgk25" WorkloadEndpoint="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-" Sep 5 23:54:55.438263 containerd[2021]: 2025-09-05 23:54:55.086 [INFO][5597] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kgk25" WorkloadEndpoint="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0" Sep 5 23:54:55.438263 containerd[2021]: 2025-09-05 23:54:55.277 [INFO][5647] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c" HandleID="k8s-pod-network.f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0" Sep 5 23:54:55.438263 containerd[2021]: 2025-09-05 23:54:55.279 [INFO][5647] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c" HandleID="k8s-pod-network.f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3b10), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-18-129", "pod":"coredns-7c65d6cfc9-kgk25", "timestamp":"2025-09-05 
23:54:55.277581034 +0000 UTC"}, Hostname:"ip-172-31-18-129", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 23:54:55.438263 containerd[2021]: 2025-09-05 23:54:55.280 [INFO][5647] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:55.438263 containerd[2021]: 2025-09-05 23:54:55.281 [INFO][5647] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:55.438263 containerd[2021]: 2025-09-05 23:54:55.281 [INFO][5647] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-18-129' Sep 5 23:54:55.438263 containerd[2021]: 2025-09-05 23:54:55.304 [INFO][5647] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c" host="ip-172-31-18-129" Sep 5 23:54:55.438263 containerd[2021]: 2025-09-05 23:54:55.316 [INFO][5647] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-18-129" Sep 5 23:54:55.438263 containerd[2021]: 2025-09-05 23:54:55.327 [INFO][5647] ipam/ipam.go 511: Trying affinity for 192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:55.438263 containerd[2021]: 2025-09-05 23:54:55.332 [INFO][5647] ipam/ipam.go 158: Attempting to load block cidr=192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:55.438263 containerd[2021]: 2025-09-05 23:54:55.340 [INFO][5647] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.105.192/26 host="ip-172-31-18-129" Sep 5 23:54:55.438263 containerd[2021]: 2025-09-05 23:54:55.340 [INFO][5647] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.105.192/26 handle="k8s-pod-network.f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c" host="ip-172-31-18-129" Sep 5 23:54:55.438263 containerd[2021]: 2025-09-05 23:54:55.345 [INFO][5647] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c Sep 5 23:54:55.438263 containerd[2021]: 2025-09-05 23:54:55.358 [INFO][5647] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.105.192/26 handle="k8s-pod-network.f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c" host="ip-172-31-18-129" Sep 5 23:54:55.438263 containerd[2021]: 2025-09-05 23:54:55.379 [INFO][5647] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.105.200/26] block=192.168.105.192/26 handle="k8s-pod-network.f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c" host="ip-172-31-18-129" Sep 5 23:54:55.438263 containerd[2021]: 2025-09-05 23:54:55.379 [INFO][5647] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.105.200/26] handle="k8s-pod-network.f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c" host="ip-172-31-18-129" Sep 5 23:54:55.438263 containerd[2021]: 2025-09-05 23:54:55.379 [INFO][5647] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
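The "Gained IPv6LL" messages from systemd-networkd mean the host-side cali* veth has acquired its IPv6 link-local address. Assuming the default EUI-64 derivation (networkd can instead be configured for RFC 7217 stable-privacy addresses, in which case the value differs), the address comes straight from the MAC: flip the universal/local bit of the first octet and splice ff:fe into the middle. For cali547c28ba3fc, whose MAC de:08:d0:fe:25:8f appears in the endpoint dump above, that gives fe80::dc08:d0ff:fefe:258f:

    package main

    import (
        "fmt"
        "net"
    )

    // linkLocalEUI64 builds fe80::/64 + EUI-64 from a 48-bit MAC:
    // flip bit 1 of the first octet and insert ff:fe between octets 3 and 4.
    func linkLocalEUI64(mac net.HardwareAddr) net.IP {
        ip := make(net.IP, net.IPv6len)
        ip[0], ip[1] = 0xfe, 0x80
        copy(ip[8:], []byte{mac[0] ^ 0x02, mac[1], mac[2], 0xff, 0xfe, mac[3], mac[4], mac[5]})
        return ip
    }

    func main() {
        mac, _ := net.ParseMAC("de:08:d0:fe:25:8f") // cali547c28ba3fc, from the dump above
        fmt.Println(linkLocalEUI64(mac))            // fe80::dc08:d0ff:fefe:258f
    }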
Sep 5 23:54:55.438263 containerd[2021]: 2025-09-05 23:54:55.379 [INFO][5647] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.200/26] IPv6=[] ContainerID="f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c" HandleID="k8s-pod-network.f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0" Sep 5 23:54:55.441121 containerd[2021]: 2025-09-05 23:54:55.386 [INFO][5597] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kgk25" WorkloadEndpoint="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"0851a585-ee51-4d36-80e5-364195a5c349", ResourceVersion:"977", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"", Pod:"coredns-7c65d6cfc9-kgk25", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.105.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali46d1d7c71e0", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:55.441121 containerd[2021]: 2025-09-05 23:54:55.387 [INFO][5597] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.105.200/32] ContainerID="f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kgk25" WorkloadEndpoint="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0" Sep 5 23:54:55.441121 containerd[2021]: 2025-09-05 23:54:55.387 [INFO][5597] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali46d1d7c71e0 ContainerID="f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kgk25" WorkloadEndpoint="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0" Sep 5 23:54:55.441121 containerd[2021]: 2025-09-05 23:54:55.397 [INFO][5597] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kgk25" 
WorkloadEndpoint="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0" Sep 5 23:54:55.441121 containerd[2021]: 2025-09-05 23:54:55.400 [INFO][5597] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kgk25" WorkloadEndpoint="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"0851a585-ee51-4d36-80e5-364195a5c349", ResourceVersion:"977", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c", Pod:"coredns-7c65d6cfc9-kgk25", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.105.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali46d1d7c71e0", MAC:"16:2e:40:f5:73:20", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:55.441121 containerd[2021]: 2025-09-05 23:54:55.431 [INFO][5597] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kgk25" WorkloadEndpoint="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0" Sep 5 23:54:55.506410 containerd[2021]: time="2025-09-05T23:54:55.506068583Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:54:55.509860 containerd[2021]: time="2025-09-05T23:54:55.506183783Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:54:55.509860 containerd[2021]: time="2025-09-05T23:54:55.508577267Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:55.511785 containerd[2021]: time="2025-09-05T23:54:55.511271171Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:54:55.557930 systemd[1]: Started cri-containerd-f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c.scope - libcontainer container f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c. Sep 5 23:54:55.695965 containerd[2021]: time="2025-09-05T23:54:55.695901996Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-kgk25,Uid:0851a585-ee51-4d36-80e5-364195a5c349,Namespace:kube-system,Attempt:1,} returns sandbox id \"f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c\"" Sep 5 23:54:55.706062 containerd[2021]: time="2025-09-05T23:54:55.705801396Z" level=info msg="CreateContainer within sandbox \"f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 5 23:54:55.741854 systemd-networkd[1930]: caliae625d2b736: Gained IPv6LL Sep 5 23:54:55.756421 containerd[2021]: time="2025-09-05T23:54:55.756345228Z" level=info msg="CreateContainer within sandbox \"f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"4e4f976648af9923af9023509152d2e1c90735574aeb5028a07cb0758de8e968\"" Sep 5 23:54:55.757862 containerd[2021]: time="2025-09-05T23:54:55.757692816Z" level=info msg="StartContainer for \"4e4f976648af9923af9023509152d2e1c90735574aeb5028a07cb0758de8e968\"" Sep 5 23:54:55.843846 systemd[1]: Started cri-containerd-4e4f976648af9923af9023509152d2e1c90735574aeb5028a07cb0758de8e968.scope - libcontainer container 4e4f976648af9923af9023509152d2e1c90735574aeb5028a07cb0758de8e968. Sep 5 23:54:55.943627 containerd[2021]: time="2025-09-05T23:54:55.943401397Z" level=info msg="StartContainer for \"4e4f976648af9923af9023509152d2e1c90735574aeb5028a07cb0758de8e968\" returns successfully" Sep 5 23:54:55.999879 systemd-networkd[1930]: cali408802bc6df: Gained IPv6LL Sep 5 23:54:56.339631 kubelet[3339]: I0905 23:54:56.339335 3339 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-kgk25" podStartSLOduration=54.339284255 podStartE2EDuration="54.339284255s" podCreationTimestamp="2025-09-05 23:54:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-05 23:54:56.338255927 +0000 UTC m=+59.919318574" watchObservedRunningTime="2025-09-05 23:54:56.339284255 +0000 UTC m=+59.920346890" Sep 5 23:54:56.650131 containerd[2021]: time="2025-09-05T23:54:56.649672885Z" level=info msg="StopPodSandbox for \"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\"" Sep 5 23:54:56.673632 systemd[1]: run-containerd-runc-k8s.io-4e4f976648af9923af9023509152d2e1c90735574aeb5028a07cb0758de8e968-runc.tqS9rK.mount: Deactivated successfully. 
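Two numeric details in the preceding lines are easy to misread. The coredns endpoint dump prints ports in Go hex: Port:0x35 is 53 (DNS) and Port:0x23c1 is 9153 (coredns' Prometheus metrics port). And kubelet's pod_startup_latency_tracker line is plain subtraction: podStartSLOduration = observedRunningTime - podCreationTimestamp, with the zeroed firstStartedPulling/lastFinishedPulling timestamps indicating no image pull was needed for this container. Checking both:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        fmt.Println(0x35, 0x23c1) // 53 9153

        created, _ := time.Parse(time.RFC3339, "2025-09-05T23:54:02Z")
        running, _ := time.Parse(time.RFC3339Nano, "2025-09-05T23:54:56.339284255Z")
        fmt.Println(running.Sub(created).Seconds()) // 54.339284255 -- matches podStartSLOduration
    }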
Sep 5 23:54:56.794049 containerd[2021]: time="2025-09-05T23:54:56.793976306Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:54:56.799073 containerd[2021]: time="2025-09-05T23:54:56.798589310Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 5 23:54:56.806072 containerd[2021]: time="2025-09-05T23:54:56.805553918Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:54:56.816092 containerd[2021]: time="2025-09-05T23:54:56.816012950Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:54:56.820314 containerd[2021]: time="2025-09-05T23:54:56.819979346Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 5.247983858s" Sep 5 23:54:56.820314 containerd[2021]: time="2025-09-05T23:54:56.820053110Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 5 23:54:56.830745 systemd-networkd[1930]: cali46d1d7c71e0: Gained IPv6LL Sep 5 23:54:56.833645 containerd[2021]: time="2025-09-05T23:54:56.833594294Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 5 23:54:56.834919 containerd[2021]: time="2025-09-05T23:54:56.834421358Z" level=info msg="CreateContainer within sandbox \"0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 5 23:54:56.870039 containerd[2021]: 2025-09-05 23:54:56.771 [WARNING][5769] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" WorkloadEndpoint="ip--172--31--18--129-k8s-whisker--5bcc458d6d--445tw-eth0" Sep 5 23:54:56.870039 containerd[2021]: 2025-09-05 23:54:56.771 [INFO][5769] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Sep 5 23:54:56.870039 containerd[2021]: 2025-09-05 23:54:56.771 [INFO][5769] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" iface="eth0" netns="" Sep 5 23:54:56.870039 containerd[2021]: 2025-09-05 23:54:56.772 [INFO][5769] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Sep 5 23:54:56.870039 containerd[2021]: 2025-09-05 23:54:56.772 [INFO][5769] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Sep 5 23:54:56.870039 containerd[2021]: 2025-09-05 23:54:56.835 [INFO][5782] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" HandleID="k8s-pod-network.e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Workload="ip--172--31--18--129-k8s-whisker--5bcc458d6d--445tw-eth0" Sep 5 23:54:56.870039 containerd[2021]: 2025-09-05 23:54:56.835 [INFO][5782] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:56.870039 containerd[2021]: 2025-09-05 23:54:56.835 [INFO][5782] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:56.870039 containerd[2021]: 2025-09-05 23:54:56.858 [WARNING][5782] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" HandleID="k8s-pod-network.e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Workload="ip--172--31--18--129-k8s-whisker--5bcc458d6d--445tw-eth0" Sep 5 23:54:56.870039 containerd[2021]: 2025-09-05 23:54:56.859 [INFO][5782] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" HandleID="k8s-pod-network.e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Workload="ip--172--31--18--129-k8s-whisker--5bcc458d6d--445tw-eth0" Sep 5 23:54:56.870039 containerd[2021]: 2025-09-05 23:54:56.862 [INFO][5782] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:56.870039 containerd[2021]: 2025-09-05 23:54:56.866 [INFO][5769] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Sep 5 23:54:56.871246 containerd[2021]: time="2025-09-05T23:54:56.870325910Z" level=info msg="TearDown network for sandbox \"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\" successfully" Sep 5 23:54:56.871246 containerd[2021]: time="2025-09-05T23:54:56.870416858Z" level=info msg="StopPodSandbox for \"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\" returns successfully" Sep 5 23:54:56.872263 containerd[2021]: time="2025-09-05T23:54:56.871705274Z" level=info msg="RemovePodSandbox for \"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\"" Sep 5 23:54:56.872263 containerd[2021]: time="2025-09-05T23:54:56.871762898Z" level=info msg="Forcibly stopping sandbox \"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\"" Sep 5 23:54:56.896166 containerd[2021]: time="2025-09-05T23:54:56.895942106Z" level=info msg="CreateContainer within sandbox \"0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"46bf39712edec5656ea025a1467d3a826a8753dd2dbd6638441d8602fc609d05\"" Sep 5 23:54:56.908728 containerd[2021]: time="2025-09-05T23:54:56.903008870Z" level=info msg="StartContainer for \"46bf39712edec5656ea025a1467d3a826a8753dd2dbd6638441d8602fc609d05\"" Sep 5 23:54:56.907870 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2824900211.mount: Deactivated successfully. Sep 5 23:54:57.080246 systemd[1]: Started cri-containerd-46bf39712edec5656ea025a1467d3a826a8753dd2dbd6638441d8602fc609d05.scope - libcontainer container 46bf39712edec5656ea025a1467d3a826a8753dd2dbd6638441d8602fc609d05. Sep 5 23:54:57.118114 containerd[2021]: 2025-09-05 23:54:56.962 [WARNING][5796] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" WorkloadEndpoint="ip--172--31--18--129-k8s-whisker--5bcc458d6d--445tw-eth0" Sep 5 23:54:57.118114 containerd[2021]: 2025-09-05 23:54:56.963 [INFO][5796] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Sep 5 23:54:57.118114 containerd[2021]: 2025-09-05 23:54:56.963 [INFO][5796] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" iface="eth0" netns="" Sep 5 23:54:57.118114 containerd[2021]: 2025-09-05 23:54:56.963 [INFO][5796] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Sep 5 23:54:57.118114 containerd[2021]: 2025-09-05 23:54:56.963 [INFO][5796] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Sep 5 23:54:57.118114 containerd[2021]: 2025-09-05 23:54:57.079 [INFO][5806] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" HandleID="k8s-pod-network.e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Workload="ip--172--31--18--129-k8s-whisker--5bcc458d6d--445tw-eth0" Sep 5 23:54:57.118114 containerd[2021]: 2025-09-05 23:54:57.084 [INFO][5806] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 5 23:54:57.118114 containerd[2021]: 2025-09-05 23:54:57.085 [INFO][5806] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:57.118114 containerd[2021]: 2025-09-05 23:54:57.108 [WARNING][5806] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" HandleID="k8s-pod-network.e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Workload="ip--172--31--18--129-k8s-whisker--5bcc458d6d--445tw-eth0" Sep 5 23:54:57.118114 containerd[2021]: 2025-09-05 23:54:57.108 [INFO][5806] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" HandleID="k8s-pod-network.e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Workload="ip--172--31--18--129-k8s-whisker--5bcc458d6d--445tw-eth0" Sep 5 23:54:57.118114 containerd[2021]: 2025-09-05 23:54:57.111 [INFO][5806] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:57.118114 containerd[2021]: 2025-09-05 23:54:57.115 [INFO][5796] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8" Sep 5 23:54:57.119307 containerd[2021]: time="2025-09-05T23:54:57.119073167Z" level=info msg="TearDown network for sandbox \"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\" successfully" Sep 5 23:54:57.130654 containerd[2021]: time="2025-09-05T23:54:57.130280879Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 5 23:54:57.130654 containerd[2021]: time="2025-09-05T23:54:57.130384799Z" level=info msg="RemovePodSandbox \"e74264ddf81c8ab82927c31ec3e8029169d584e385296c4019594d5fc62fabe8\" returns successfully" Sep 5 23:54:57.131874 containerd[2021]: time="2025-09-05T23:54:57.131814875Z" level=info msg="StopPodSandbox for \"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\"" Sep 5 23:54:57.310559 containerd[2021]: time="2025-09-05T23:54:57.309952776Z" level=info msg="StartContainer for \"46bf39712edec5656ea025a1467d3a826a8753dd2dbd6638441d8602fc609d05\" returns successfully" Sep 5 23:54:57.463108 containerd[2021]: 2025-09-05 23:54:57.249 [WARNING][5842] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"7eabc093-0edd-4719-902f-c28a617adb0c", ResourceVersion:"961", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52", Pod:"goldmane-7988f88666-s9vkn", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.105.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib54d883f6ab", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:57.463108 containerd[2021]: 2025-09-05 23:54:57.249 [INFO][5842] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Sep 5 23:54:57.463108 containerd[2021]: 2025-09-05 23:54:57.249 [INFO][5842] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" iface="eth0" netns="" Sep 5 23:54:57.463108 containerd[2021]: 2025-09-05 23:54:57.249 [INFO][5842] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Sep 5 23:54:57.463108 containerd[2021]: 2025-09-05 23:54:57.250 [INFO][5842] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Sep 5 23:54:57.463108 containerd[2021]: 2025-09-05 23:54:57.370 [INFO][5850] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" HandleID="k8s-pod-network.f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Workload="ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0" Sep 5 23:54:57.463108 containerd[2021]: 2025-09-05 23:54:57.372 [INFO][5850] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:57.463108 containerd[2021]: 2025-09-05 23:54:57.372 [INFO][5850] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:57.463108 containerd[2021]: 2025-09-05 23:54:57.431 [WARNING][5850] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" HandleID="k8s-pod-network.f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Workload="ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0" Sep 5 23:54:57.463108 containerd[2021]: 2025-09-05 23:54:57.433 [INFO][5850] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" HandleID="k8s-pod-network.f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Workload="ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0" Sep 5 23:54:57.463108 containerd[2021]: 2025-09-05 23:54:57.444 [INFO][5850] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:57.463108 containerd[2021]: 2025-09-05 23:54:57.458 [INFO][5842] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Sep 5 23:54:57.464614 containerd[2021]: time="2025-09-05T23:54:57.463692697Z" level=info msg="TearDown network for sandbox \"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\" successfully" Sep 5 23:54:57.464614 containerd[2021]: time="2025-09-05T23:54:57.463733737Z" level=info msg="StopPodSandbox for \"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\" returns successfully" Sep 5 23:54:57.464614 containerd[2021]: time="2025-09-05T23:54:57.464469961Z" level=info msg="RemovePodSandbox for \"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\"" Sep 5 23:54:57.464614 containerd[2021]: time="2025-09-05T23:54:57.464604985Z" level=info msg="Forcibly stopping sandbox \"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\"" Sep 5 23:54:57.727489 containerd[2021]: 2025-09-05 23:54:57.611 [WARNING][5888] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"7eabc093-0edd-4719-902f-c28a617adb0c", ResourceVersion:"961", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52", Pod:"goldmane-7988f88666-s9vkn", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.105.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib54d883f6ab", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:57.727489 containerd[2021]: 2025-09-05 23:54:57.612 [INFO][5888] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Sep 5 23:54:57.727489 containerd[2021]: 2025-09-05 23:54:57.612 [INFO][5888] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" iface="eth0" netns="" Sep 5 23:54:57.727489 containerd[2021]: 2025-09-05 23:54:57.612 [INFO][5888] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Sep 5 23:54:57.727489 containerd[2021]: 2025-09-05 23:54:57.612 [INFO][5888] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Sep 5 23:54:57.727489 containerd[2021]: 2025-09-05 23:54:57.697 [INFO][5908] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" HandleID="k8s-pod-network.f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Workload="ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0" Sep 5 23:54:57.727489 containerd[2021]: 2025-09-05 23:54:57.697 [INFO][5908] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:57.727489 containerd[2021]: 2025-09-05 23:54:57.697 [INFO][5908] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:57.727489 containerd[2021]: 2025-09-05 23:54:57.717 [WARNING][5908] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" HandleID="k8s-pod-network.f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Workload="ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0" Sep 5 23:54:57.727489 containerd[2021]: 2025-09-05 23:54:57.718 [INFO][5908] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" HandleID="k8s-pod-network.f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Workload="ip--172--31--18--129-k8s-goldmane--7988f88666--s9vkn-eth0" Sep 5 23:54:57.727489 containerd[2021]: 2025-09-05 23:54:57.720 [INFO][5908] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:57.727489 containerd[2021]: 2025-09-05 23:54:57.724 [INFO][5888] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f" Sep 5 23:54:57.728958 containerd[2021]: time="2025-09-05T23:54:57.727462178Z" level=info msg="TearDown network for sandbox \"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\" successfully" Sep 5 23:54:57.738013 containerd[2021]: time="2025-09-05T23:54:57.737877242Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 5 23:54:57.738307 containerd[2021]: time="2025-09-05T23:54:57.738042638Z" level=info msg="RemovePodSandbox \"f57327d6b8b64da69c61801763ca4e4fa623edf6a17bad728ac90e6c7bdfbd6f\" returns successfully" Sep 5 23:54:57.739115 containerd[2021]: time="2025-09-05T23:54:57.739061222Z" level=info msg="StopPodSandbox for \"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\"" Sep 5 23:54:57.951658 containerd[2021]: 2025-09-05 23:54:57.839 [WARNING][5930] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0", GenerateName:"calico-apiserver-5f77cc885-", Namespace:"calico-apiserver", SelfLink:"", UID:"ad610cb3-9cf1-433c-bfe6-870f9da7a8a7", ResourceVersion:"980", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f77cc885", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4", Pod:"calico-apiserver-5f77cc885-9q8qs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.105.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliae625d2b736", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:57.951658 containerd[2021]: 2025-09-05 23:54:57.839 [INFO][5930] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Sep 5 23:54:57.951658 containerd[2021]: 2025-09-05 23:54:57.839 [INFO][5930] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" iface="eth0" netns="" Sep 5 23:54:57.951658 containerd[2021]: 2025-09-05 23:54:57.839 [INFO][5930] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Sep 5 23:54:57.951658 containerd[2021]: 2025-09-05 23:54:57.839 [INFO][5930] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Sep 5 23:54:57.951658 containerd[2021]: 2025-09-05 23:54:57.897 [INFO][5942] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" HandleID="k8s-pod-network.b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0" Sep 5 23:54:57.951658 containerd[2021]: 2025-09-05 23:54:57.898 [INFO][5942] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:57.951658 containerd[2021]: 2025-09-05 23:54:57.898 [INFO][5942] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:57.951658 containerd[2021]: 2025-09-05 23:54:57.935 [WARNING][5942] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" HandleID="k8s-pod-network.b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0" Sep 5 23:54:57.951658 containerd[2021]: 2025-09-05 23:54:57.935 [INFO][5942] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" HandleID="k8s-pod-network.b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0" Sep 5 23:54:57.951658 containerd[2021]: 2025-09-05 23:54:57.940 [INFO][5942] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:57.951658 containerd[2021]: 2025-09-05 23:54:57.944 [INFO][5930] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Sep 5 23:54:57.954052 containerd[2021]: time="2025-09-05T23:54:57.953511075Z" level=info msg="TearDown network for sandbox \"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\" successfully" Sep 5 23:54:57.954052 containerd[2021]: time="2025-09-05T23:54:57.953785623Z" level=info msg="StopPodSandbox for \"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\" returns successfully" Sep 5 23:54:57.955330 containerd[2021]: time="2025-09-05T23:54:57.955242915Z" level=info msg="RemovePodSandbox for \"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\"" Sep 5 23:54:57.955330 containerd[2021]: time="2025-09-05T23:54:57.955304295Z" level=info msg="Forcibly stopping sandbox \"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\"" Sep 5 23:54:58.071185 kubelet[3339]: I0905 23:54:58.069673 3339 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5f77cc885-km8wb" podStartSLOduration=36.05456457 podStartE2EDuration="42.069651444s" podCreationTimestamp="2025-09-05 23:54:16 +0000 UTC" firstStartedPulling="2025-09-05 23:54:50.807978128 +0000 UTC m=+54.389040751" lastFinishedPulling="2025-09-05 23:54:56.823065014 +0000 UTC m=+60.404127625" observedRunningTime="2025-09-05 23:54:57.385499605 +0000 UTC m=+60.966562240" watchObservedRunningTime="2025-09-05 23:54:58.069651444 +0000 UTC m=+61.650714067" Sep 5 23:54:58.208673 containerd[2021]: 2025-09-05 23:54:58.122 [WARNING][5957] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0", GenerateName:"calico-apiserver-5f77cc885-", Namespace:"calico-apiserver", SelfLink:"", UID:"ad610cb3-9cf1-433c-bfe6-870f9da7a8a7", ResourceVersion:"980", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f77cc885", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4", Pod:"calico-apiserver-5f77cc885-9q8qs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.105.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliae625d2b736", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:58.208673 containerd[2021]: 2025-09-05 23:54:58.122 [INFO][5957] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Sep 5 23:54:58.208673 containerd[2021]: 2025-09-05 23:54:58.122 [INFO][5957] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" iface="eth0" netns="" Sep 5 23:54:58.208673 containerd[2021]: 2025-09-05 23:54:58.123 [INFO][5957] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Sep 5 23:54:58.208673 containerd[2021]: 2025-09-05 23:54:58.123 [INFO][5957] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Sep 5 23:54:58.208673 containerd[2021]: 2025-09-05 23:54:58.174 [INFO][5965] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" HandleID="k8s-pod-network.b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0" Sep 5 23:54:58.208673 containerd[2021]: 2025-09-05 23:54:58.177 [INFO][5965] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:58.208673 containerd[2021]: 2025-09-05 23:54:58.178 [INFO][5965] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:58.208673 containerd[2021]: 2025-09-05 23:54:58.196 [WARNING][5965] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" HandleID="k8s-pod-network.b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0" Sep 5 23:54:58.208673 containerd[2021]: 2025-09-05 23:54:58.196 [INFO][5965] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" HandleID="k8s-pod-network.b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--9q8qs-eth0" Sep 5 23:54:58.208673 containerd[2021]: 2025-09-05 23:54:58.199 [INFO][5965] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:58.208673 containerd[2021]: 2025-09-05 23:54:58.203 [INFO][5957] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233" Sep 5 23:54:58.208673 containerd[2021]: time="2025-09-05T23:54:58.207830761Z" level=info msg="TearDown network for sandbox \"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\" successfully" Sep 5 23:54:58.216470 containerd[2021]: time="2025-09-05T23:54:58.216231445Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 5 23:54:58.216470 containerd[2021]: time="2025-09-05T23:54:58.216323053Z" level=info msg="RemovePodSandbox \"b38fe67c80dd750fc10ffdbb79bbe6483c7961bb1f61431b28a6cdd0eda88233\" returns successfully" Sep 5 23:54:58.217563 containerd[2021]: time="2025-09-05T23:54:58.217084525Z" level=info msg="StopPodSandbox for \"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\"" Sep 5 23:54:58.345610 kubelet[3339]: I0905 23:54:58.345175 3339 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 5 23:54:58.400597 containerd[2021]: 2025-09-05 23:54:58.301 [WARNING][5980] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0", GenerateName:"calico-kube-controllers-5f9f4498b8-", Namespace:"calico-system", SelfLink:"", UID:"a4bb29ad-8766-47fe-9303-a89374119066", ResourceVersion:"983", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5f9f4498b8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56", Pod:"calico-kube-controllers-5f9f4498b8-wxd6q", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.105.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali408802bc6df", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:58.400597 containerd[2021]: 2025-09-05 23:54:58.302 [INFO][5980] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Sep 5 23:54:58.400597 containerd[2021]: 2025-09-05 23:54:58.302 [INFO][5980] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" iface="eth0" netns="" Sep 5 23:54:58.400597 containerd[2021]: 2025-09-05 23:54:58.302 [INFO][5980] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Sep 5 23:54:58.400597 containerd[2021]: 2025-09-05 23:54:58.302 [INFO][5980] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Sep 5 23:54:58.400597 containerd[2021]: 2025-09-05 23:54:58.372 [INFO][5988] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" HandleID="k8s-pod-network.3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Workload="ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0" Sep 5 23:54:58.400597 containerd[2021]: 2025-09-05 23:54:58.373 [INFO][5988] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:58.400597 containerd[2021]: 2025-09-05 23:54:58.373 [INFO][5988] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:58.400597 containerd[2021]: 2025-09-05 23:54:58.390 [WARNING][5988] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" HandleID="k8s-pod-network.3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Workload="ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0" Sep 5 23:54:58.400597 containerd[2021]: 2025-09-05 23:54:58.390 [INFO][5988] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" HandleID="k8s-pod-network.3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Workload="ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0" Sep 5 23:54:58.400597 containerd[2021]: 2025-09-05 23:54:58.393 [INFO][5988] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:58.400597 containerd[2021]: 2025-09-05 23:54:58.397 [INFO][5980] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Sep 5 23:54:58.400597 containerd[2021]: time="2025-09-05T23:54:58.400483250Z" level=info msg="TearDown network for sandbox \"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\" successfully" Sep 5 23:54:58.400597 containerd[2021]: time="2025-09-05T23:54:58.400537910Z" level=info msg="StopPodSandbox for \"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\" returns successfully" Sep 5 23:54:58.402855 containerd[2021]: time="2025-09-05T23:54:58.402800642Z" level=info msg="RemovePodSandbox for \"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\"" Sep 5 23:54:58.403013 containerd[2021]: time="2025-09-05T23:54:58.402908030Z" level=info msg="Forcibly stopping sandbox \"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\"" Sep 5 23:54:58.594599 containerd[2021]: 2025-09-05 23:54:58.485 [WARNING][6003] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0", GenerateName:"calico-kube-controllers-5f9f4498b8-", Namespace:"calico-system", SelfLink:"", UID:"a4bb29ad-8766-47fe-9303-a89374119066", ResourceVersion:"983", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5f9f4498b8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56", Pod:"calico-kube-controllers-5f9f4498b8-wxd6q", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.105.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali408802bc6df", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:58.594599 containerd[2021]: 2025-09-05 23:54:58.485 [INFO][6003] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Sep 5 23:54:58.594599 containerd[2021]: 2025-09-05 23:54:58.485 [INFO][6003] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" iface="eth0" netns="" Sep 5 23:54:58.594599 containerd[2021]: 2025-09-05 23:54:58.485 [INFO][6003] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Sep 5 23:54:58.594599 containerd[2021]: 2025-09-05 23:54:58.485 [INFO][6003] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Sep 5 23:54:58.594599 containerd[2021]: 2025-09-05 23:54:58.548 [INFO][6010] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" HandleID="k8s-pod-network.3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Workload="ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0" Sep 5 23:54:58.594599 containerd[2021]: 2025-09-05 23:54:58.548 [INFO][6010] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:58.594599 containerd[2021]: 2025-09-05 23:54:58.549 [INFO][6010] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:58.594599 containerd[2021]: 2025-09-05 23:54:58.574 [WARNING][6010] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" HandleID="k8s-pod-network.3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Workload="ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0" Sep 5 23:54:58.594599 containerd[2021]: 2025-09-05 23:54:58.574 [INFO][6010] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" HandleID="k8s-pod-network.3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Workload="ip--172--31--18--129-k8s-calico--kube--controllers--5f9f4498b8--wxd6q-eth0" Sep 5 23:54:58.594599 containerd[2021]: 2025-09-05 23:54:58.581 [INFO][6010] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:58.594599 containerd[2021]: 2025-09-05 23:54:58.587 [INFO][6003] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586" Sep 5 23:54:58.595466 containerd[2021]: time="2025-09-05T23:54:58.594726147Z" level=info msg="TearDown network for sandbox \"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\" successfully" Sep 5 23:54:58.607248 containerd[2021]: time="2025-09-05T23:54:58.606902643Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 5 23:54:58.607248 containerd[2021]: time="2025-09-05T23:54:58.607046079Z" level=info msg="RemovePodSandbox \"3741559100d864e11050ce21d8fec747aa3ef6cf28222dcdbb461472ad73d586\" returns successfully" Sep 5 23:54:58.610427 containerd[2021]: time="2025-09-05T23:54:58.610354995Z" level=info msg="StopPodSandbox for \"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\"" Sep 5 23:54:58.795425 containerd[2021]: 2025-09-05 23:54:58.703 [WARNING][6024] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"0851a585-ee51-4d36-80e5-364195a5c349", ResourceVersion:"1000", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c", Pod:"coredns-7c65d6cfc9-kgk25", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.105.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali46d1d7c71e0", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:58.795425 containerd[2021]: 2025-09-05 23:54:58.703 [INFO][6024] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Sep 5 23:54:58.795425 containerd[2021]: 2025-09-05 23:54:58.703 [INFO][6024] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" iface="eth0" netns="" Sep 5 23:54:58.795425 containerd[2021]: 2025-09-05 23:54:58.703 [INFO][6024] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Sep 5 23:54:58.795425 containerd[2021]: 2025-09-05 23:54:58.703 [INFO][6024] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Sep 5 23:54:58.795425 containerd[2021]: 2025-09-05 23:54:58.762 [INFO][6031] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" HandleID="k8s-pod-network.8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0" Sep 5 23:54:58.795425 containerd[2021]: 2025-09-05 23:54:58.763 [INFO][6031] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:58.795425 containerd[2021]: 2025-09-05 23:54:58.763 [INFO][6031] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 5 23:54:58.795425 containerd[2021]: 2025-09-05 23:54:58.783 [WARNING][6031] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" HandleID="k8s-pod-network.8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0" Sep 5 23:54:58.795425 containerd[2021]: 2025-09-05 23:54:58.783 [INFO][6031] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" HandleID="k8s-pod-network.8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0" Sep 5 23:54:58.795425 containerd[2021]: 2025-09-05 23:54:58.788 [INFO][6031] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:58.795425 containerd[2021]: 2025-09-05 23:54:58.791 [INFO][6024] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Sep 5 23:54:58.795425 containerd[2021]: time="2025-09-05T23:54:58.795362644Z" level=info msg="TearDown network for sandbox \"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\" successfully" Sep 5 23:54:58.795425 containerd[2021]: time="2025-09-05T23:54:58.795402736Z" level=info msg="StopPodSandbox for \"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\" returns successfully" Sep 5 23:54:58.799291 containerd[2021]: time="2025-09-05T23:54:58.796933972Z" level=info msg="RemovePodSandbox for \"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\"" Sep 5 23:54:58.799291 containerd[2021]: time="2025-09-05T23:54:58.796986148Z" level=info msg="Forcibly stopping sandbox \"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\"" Sep 5 23:54:59.170756 containerd[2021]: 2025-09-05 23:54:58.919 [WARNING][6045] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"0851a585-ee51-4d36-80e5-364195a5c349", ResourceVersion:"1000", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"f8c01c740469271b955a2b64b74216745e7435cea7dd713cb79f504a2cd0758c", Pod:"coredns-7c65d6cfc9-kgk25", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.105.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali46d1d7c71e0", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:59.170756 containerd[2021]: 2025-09-05 23:54:58.919 [INFO][6045] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Sep 5 23:54:59.170756 containerd[2021]: 2025-09-05 23:54:58.919 [INFO][6045] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" iface="eth0" netns="" Sep 5 23:54:59.170756 containerd[2021]: 2025-09-05 23:54:58.919 [INFO][6045] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Sep 5 23:54:59.170756 containerd[2021]: 2025-09-05 23:54:58.919 [INFO][6045] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Sep 5 23:54:59.170756 containerd[2021]: 2025-09-05 23:54:59.102 [INFO][6053] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" HandleID="k8s-pod-network.8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0" Sep 5 23:54:59.170756 containerd[2021]: 2025-09-05 23:54:59.102 [INFO][6053] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:59.170756 containerd[2021]: 2025-09-05 23:54:59.102 [INFO][6053] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 5 23:54:59.170756 containerd[2021]: 2025-09-05 23:54:59.157 [WARNING][6053] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" HandleID="k8s-pod-network.8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0" Sep 5 23:54:59.170756 containerd[2021]: 2025-09-05 23:54:59.157 [INFO][6053] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" HandleID="k8s-pod-network.8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--kgk25-eth0" Sep 5 23:54:59.170756 containerd[2021]: 2025-09-05 23:54:59.161 [INFO][6053] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:59.170756 containerd[2021]: 2025-09-05 23:54:59.166 [INFO][6045] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5" Sep 5 23:54:59.171643 containerd[2021]: time="2025-09-05T23:54:59.170949253Z" level=info msg="TearDown network for sandbox \"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\" successfully" Sep 5 23:54:59.180458 containerd[2021]: time="2025-09-05T23:54:59.180381025Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 5 23:54:59.180639 containerd[2021]: time="2025-09-05T23:54:59.180479929Z" level=info msg="RemovePodSandbox \"8f3b317b5267341d5d8ec1ff4aec8512cbb3909cfb1594287f99bd429e9dcbb5\" returns successfully" Sep 5 23:54:59.181785 containerd[2021]: time="2025-09-05T23:54:59.181267453Z" level=info msg="StopPodSandbox for \"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\"" Sep 5 23:54:59.389774 containerd[2021]: 2025-09-05 23:54:59.261 [WARNING][6067] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"c05df7ab-c8cc-4af6-b78b-e2da00b65212", ResourceVersion:"964", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083", Pod:"csi-node-driver-xr256", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.105.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali547c28ba3fc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:59.389774 containerd[2021]: 2025-09-05 23:54:59.265 [INFO][6067] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Sep 5 23:54:59.389774 containerd[2021]: 2025-09-05 23:54:59.265 [INFO][6067] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" iface="eth0" netns="" Sep 5 23:54:59.389774 containerd[2021]: 2025-09-05 23:54:59.266 [INFO][6067] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Sep 5 23:54:59.389774 containerd[2021]: 2025-09-05 23:54:59.266 [INFO][6067] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Sep 5 23:54:59.389774 containerd[2021]: 2025-09-05 23:54:59.319 [INFO][6074] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" HandleID="k8s-pod-network.95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Workload="ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0" Sep 5 23:54:59.389774 containerd[2021]: 2025-09-05 23:54:59.320 [INFO][6074] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:59.389774 containerd[2021]: 2025-09-05 23:54:59.320 [INFO][6074] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:59.389774 containerd[2021]: 2025-09-05 23:54:59.364 [WARNING][6074] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" HandleID="k8s-pod-network.95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Workload="ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0" Sep 5 23:54:59.389774 containerd[2021]: 2025-09-05 23:54:59.365 [INFO][6074] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" HandleID="k8s-pod-network.95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Workload="ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0" Sep 5 23:54:59.389774 containerd[2021]: 2025-09-05 23:54:59.381 [INFO][6074] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:59.389774 containerd[2021]: 2025-09-05 23:54:59.383 [INFO][6067] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Sep 5 23:54:59.392026 containerd[2021]: time="2025-09-05T23:54:59.391665687Z" level=info msg="TearDown network for sandbox \"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\" successfully" Sep 5 23:54:59.392026 containerd[2021]: time="2025-09-05T23:54:59.391715439Z" level=info msg="StopPodSandbox for \"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\" returns successfully" Sep 5 23:54:59.393157 containerd[2021]: time="2025-09-05T23:54:59.392959419Z" level=info msg="RemovePodSandbox for \"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\"" Sep 5 23:54:59.393276 containerd[2021]: time="2025-09-05T23:54:59.393172923Z" level=info msg="Forcibly stopping sandbox \"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\"" Sep 5 23:54:59.675645 ntpd[1992]: Listen normally on 7 vxlan.calico 192.168.105.192:123 Sep 5 23:54:59.675787 ntpd[1992]: Listen normally on 8 cali276b13c3c74 [fe80::ecee:eeff:feee:eeee%4]:123 Sep 5 23:54:59.675871 ntpd[1992]: Listen normally on 9 vxlan.calico [fe80::643f:fcff:fe83:35%5]:123 Sep 5 23:54:59.675941 ntpd[1992]: Listen normally on 10 cali869289ca11c [fe80::ecee:eeff:feee:eeee%8]:123 Sep 5 23:54:59.676013 ntpd[1992]: Listen normally on 11 cali0a6a1f850df
[fe80::ecee:eeff:feee:eeee%9]:123 Sep 5 23:54:59.676080 ntpd[1992]: Listen normally on 12 calib54d883f6ab [fe80::ecee:eeff:feee:eeee%10]:123 Sep 5 23:54:59.676168 ntpd[1992]: Listen normally on 13 cali547c28ba3fc [fe80::ecee:eeff:feee:eeee%11]:123 Sep 5 23:54:59.676247 ntpd[1992]: Listen normally on 14 caliae625d2b736 [fe80::ecee:eeff:feee:eeee%12]:123 Sep 5 23:54:59.676316 ntpd[1992]: Listen normally on 15 cali408802bc6df [fe80::ecee:eeff:feee:eeee%13]:123 Sep 5 23:54:59.676395 ntpd[1992]: Listen normally on 16 cali46d1d7c71e0 [fe80::ecee:eeff:feee:eeee%14]:123 Sep 5 23:54:59.747917 containerd[2021]: 2025-09-05 23:54:59.630 [WARNING][6088] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"c05df7ab-c8cc-4af6-b78b-e2da00b65212", ResourceVersion:"964", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083", Pod:"csi-node-driver-xr256", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.105.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali547c28ba3fc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:59.747917 containerd[2021]: 2025-09-05 23:54:59.631 [INFO][6088] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Sep 5 23:54:59.747917 containerd[2021]: 2025-09-05 23:54:59.631 [INFO][6088] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" iface="eth0" netns="" Sep 5 23:54:59.747917 containerd[2021]: 2025-09-05 23:54:59.631 [INFO][6088] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Sep 5 23:54:59.747917 containerd[2021]: 2025-09-05 23:54:59.631 [INFO][6088] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Sep 5 23:54:59.747917 containerd[2021]: 2025-09-05 23:54:59.707 [INFO][6098] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" HandleID="k8s-pod-network.95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Workload="ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0" Sep 5 23:54:59.747917 containerd[2021]: 2025-09-05 23:54:59.710 [INFO][6098] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:59.747917 containerd[2021]: 2025-09-05 23:54:59.710 [INFO][6098] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:59.747917 containerd[2021]: 2025-09-05 23:54:59.735 [WARNING][6098] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" HandleID="k8s-pod-network.95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Workload="ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0" Sep 5 23:54:59.747917 containerd[2021]: 2025-09-05 23:54:59.735 [INFO][6098] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" HandleID="k8s-pod-network.95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Workload="ip--172--31--18--129-k8s-csi--node--driver--xr256-eth0" Sep 5 23:54:59.747917 containerd[2021]: 2025-09-05 23:54:59.740 [INFO][6098] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:59.747917 containerd[2021]: 2025-09-05 23:54:59.744 [INFO][6088] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6" Sep 5 23:54:59.748801 containerd[2021]: time="2025-09-05T23:54:59.747964492Z" level=info msg="TearDown network for sandbox \"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\" successfully" Sep 5 23:54:59.758299 containerd[2021]: time="2025-09-05T23:54:59.758205220Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 5 23:54:59.758439 containerd[2021]: time="2025-09-05T23:54:59.758321416Z" level=info msg="RemovePodSandbox \"95a0f04c0f26bda4719b35d5fb8633a86f2c651d9a443f8ef81207961f5fa5d6\" returns successfully" Sep 5 23:54:59.759844 containerd[2021]: time="2025-09-05T23:54:59.759780472Z" level=info msg="StopPodSandbox for \"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\"" Sep 5 23:54:59.851023 systemd[1]: Started sshd@9-172.31.18.129:22-139.178.68.195:36240.service - OpenSSH per-connection server daemon (139.178.68.195:36240). Sep 5 23:54:59.992650 containerd[2021]: 2025-09-05 23:54:59.901 [WARNING][6113] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0", GenerateName:"calico-apiserver-5f77cc885-", Namespace:"calico-apiserver", SelfLink:"", UID:"dd5f7f5e-b0cb-44ef-aa90-0246eeadd9ea", ResourceVersion:"1013", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f77cc885", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd", Pod:"calico-apiserver-5f77cc885-km8wb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.105.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali869289ca11c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:54:59.992650 containerd[2021]: 2025-09-05 23:54:59.904 [INFO][6113] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Sep 5 23:54:59.992650 containerd[2021]: 2025-09-05 23:54:59.904 [INFO][6113] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" iface="eth0" netns="" Sep 5 23:54:59.992650 containerd[2021]: 2025-09-05 23:54:59.904 [INFO][6113] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Sep 5 23:54:59.992650 containerd[2021]: 2025-09-05 23:54:59.904 [INFO][6113] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Sep 5 23:54:59.992650 containerd[2021]: 2025-09-05 23:54:59.968 [INFO][6122] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" HandleID="k8s-pod-network.e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0" Sep 5 23:54:59.992650 containerd[2021]: 2025-09-05 23:54:59.969 [INFO][6122] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:54:59.992650 containerd[2021]: 2025-09-05 23:54:59.969 [INFO][6122] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:54:59.992650 containerd[2021]: 2025-09-05 23:54:59.984 [WARNING][6122] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" HandleID="k8s-pod-network.e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0" Sep 5 23:54:59.992650 containerd[2021]: 2025-09-05 23:54:59.984 [INFO][6122] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" HandleID="k8s-pod-network.e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0" Sep 5 23:54:59.992650 containerd[2021]: 2025-09-05 23:54:59.987 [INFO][6122] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:54:59.992650 containerd[2021]: 2025-09-05 23:54:59.989 [INFO][6113] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Sep 5 23:54:59.993808 containerd[2021]: time="2025-09-05T23:54:59.992749602Z" level=info msg="TearDown network for sandbox \"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\" successfully" Sep 5 23:54:59.993808 containerd[2021]: time="2025-09-05T23:54:59.992798262Z" level=info msg="StopPodSandbox for \"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\" returns successfully" Sep 5 23:54:59.995138 containerd[2021]: time="2025-09-05T23:54:59.995063034Z" level=info msg="RemovePodSandbox for \"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\"" Sep 5 23:54:59.995138 containerd[2021]: time="2025-09-05T23:54:59.995128986Z" level=info msg="Forcibly stopping sandbox \"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\"" Sep 5 23:55:00.082316 sshd[6119]: Accepted publickey for core from 139.178.68.195 port 36240 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM Sep 5 23:55:00.085652 sshd[6119]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 23:55:00.107267 systemd-logind[1997]: New session 10 of user core. Sep 5 23:55:00.111843 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 5 23:55:00.293223 containerd[2021]: 2025-09-05 23:55:00.100 [WARNING][6137] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0", GenerateName:"calico-apiserver-5f77cc885-", Namespace:"calico-apiserver", SelfLink:"", UID:"dd5f7f5e-b0cb-44ef-aa90-0246eeadd9ea", ResourceVersion:"1013", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f77cc885", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"0457f5fcb97bb0f5a9ae26341666660ee644d9b35453f54e5876489b0705f3bd", Pod:"calico-apiserver-5f77cc885-km8wb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.105.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali869289ca11c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:55:00.293223 containerd[2021]: 2025-09-05 23:55:00.103 [INFO][6137] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Sep 5 23:55:00.293223 containerd[2021]: 2025-09-05 23:55:00.103 [INFO][6137] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" iface="eth0" netns="" Sep 5 23:55:00.293223 containerd[2021]: 2025-09-05 23:55:00.103 [INFO][6137] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Sep 5 23:55:00.293223 containerd[2021]: 2025-09-05 23:55:00.110 [INFO][6137] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Sep 5 23:55:00.293223 containerd[2021]: 2025-09-05 23:55:00.198 [INFO][6145] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" HandleID="k8s-pod-network.e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0" Sep 5 23:55:00.293223 containerd[2021]: 2025-09-05 23:55:00.204 [INFO][6145] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:55:00.293223 containerd[2021]: 2025-09-05 23:55:00.204 [INFO][6145] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:55:00.293223 containerd[2021]: 2025-09-05 23:55:00.240 [WARNING][6145] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" HandleID="k8s-pod-network.e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0" Sep 5 23:55:00.293223 containerd[2021]: 2025-09-05 23:55:00.240 [INFO][6145] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" HandleID="k8s-pod-network.e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Workload="ip--172--31--18--129-k8s-calico--apiserver--5f77cc885--km8wb-eth0" Sep 5 23:55:00.293223 containerd[2021]: 2025-09-05 23:55:00.254 [INFO][6145] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:55:00.293223 containerd[2021]: 2025-09-05 23:55:00.288 [INFO][6137] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd" Sep 5 23:55:00.295780 containerd[2021]: time="2025-09-05T23:55:00.293275239Z" level=info msg="TearDown network for sandbox \"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\" successfully" Sep 5 23:55:00.428004 containerd[2021]: time="2025-09-05T23:55:00.427928608Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 5 23:55:00.429588 containerd[2021]: time="2025-09-05T23:55:00.428391388Z" level=info msg="RemovePodSandbox \"e4c0b1d3a034436f1346cae01a732d5eb9572387bb7a2b735196e320ebee04bd\" returns successfully" Sep 5 23:55:00.430306 containerd[2021]: time="2025-09-05T23:55:00.430247248Z" level=info msg="StopPodSandbox for \"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\"" Sep 5 23:55:00.541879 sshd[6119]: pam_unix(sshd:session): session closed for user core Sep 5 23:55:00.555995 systemd[1]: sshd@9-172.31.18.129:22-139.178.68.195:36240.service: Deactivated successfully. Sep 5 23:55:00.566439 systemd[1]: session-10.scope: Deactivated successfully. Sep 5 23:55:00.576622 systemd-logind[1997]: Session 10 logged out. Waiting for processes to exit. Sep 5 23:55:00.581051 systemd-logind[1997]: Removed session 10. Sep 5 23:55:00.810193 containerd[2021]: 2025-09-05 23:55:00.623 [WARNING][6168] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"08545de3-6592-4965-ae61-4807250e2870", ResourceVersion:"967", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae", Pod:"coredns-7c65d6cfc9-6vhqh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.105.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0a6a1f850df", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:55:00.810193 containerd[2021]: 2025-09-05 23:55:00.624 [INFO][6168] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Sep 5 23:55:00.810193 containerd[2021]: 2025-09-05 23:55:00.624 [INFO][6168] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" iface="eth0" netns="" Sep 5 23:55:00.810193 containerd[2021]: 2025-09-05 23:55:00.624 [INFO][6168] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Sep 5 23:55:00.810193 containerd[2021]: 2025-09-05 23:55:00.624 [INFO][6168] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Sep 5 23:55:00.810193 containerd[2021]: 2025-09-05 23:55:00.756 [INFO][6183] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" HandleID="k8s-pod-network.b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0" Sep 5 23:55:00.810193 containerd[2021]: 2025-09-05 23:55:00.757 [INFO][6183] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:55:00.810193 containerd[2021]: 2025-09-05 23:55:00.758 [INFO][6183] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 5 23:55:00.810193 containerd[2021]: 2025-09-05 23:55:00.788 [WARNING][6183] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" HandleID="k8s-pod-network.b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0" Sep 5 23:55:00.810193 containerd[2021]: 2025-09-05 23:55:00.788 [INFO][6183] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" HandleID="k8s-pod-network.b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0" Sep 5 23:55:00.810193 containerd[2021]: 2025-09-05 23:55:00.791 [INFO][6183] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:55:00.810193 containerd[2021]: 2025-09-05 23:55:00.800 [INFO][6168] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Sep 5 23:55:00.810193 containerd[2021]: time="2025-09-05T23:55:00.810149526Z" level=info msg="TearDown network for sandbox \"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\" successfully" Sep 5 23:55:00.810193 containerd[2021]: time="2025-09-05T23:55:00.810188586Z" level=info msg="StopPodSandbox for \"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\" returns successfully" Sep 5 23:55:00.814579 containerd[2021]: time="2025-09-05T23:55:00.812894922Z" level=info msg="RemovePodSandbox for \"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\"" Sep 5 23:55:00.814579 containerd[2021]: time="2025-09-05T23:55:00.813475542Z" level=info msg="Forcibly stopping sandbox \"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\"" Sep 5 23:55:01.232995 containerd[2021]: 2025-09-05 23:55:01.006 [WARNING][6197] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"08545de3-6592-4965-ae61-4807250e2870", ResourceVersion:"967", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 54, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-18-129", ContainerID:"307965b3e482671ca42f34e009b639c5ecaaefb2a9ad36ababaa00a97c0c4fae", Pod:"coredns-7c65d6cfc9-6vhqh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.105.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0a6a1f850df", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:55:01.232995 containerd[2021]: 2025-09-05 23:55:01.008 [INFO][6197] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Sep 5 23:55:01.232995 containerd[2021]: 2025-09-05 23:55:01.008 [INFO][6197] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" iface="eth0" netns="" Sep 5 23:55:01.232995 containerd[2021]: 2025-09-05 23:55:01.008 [INFO][6197] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Sep 5 23:55:01.232995 containerd[2021]: 2025-09-05 23:55:01.008 [INFO][6197] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Sep 5 23:55:01.232995 containerd[2021]: 2025-09-05 23:55:01.146 [INFO][6204] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" HandleID="k8s-pod-network.b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0" Sep 5 23:55:01.232995 containerd[2021]: 2025-09-05 23:55:01.146 [INFO][6204] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:55:01.232995 containerd[2021]: 2025-09-05 23:55:01.147 [INFO][6204] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 5 23:55:01.232995 containerd[2021]: 2025-09-05 23:55:01.192 [WARNING][6204] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" HandleID="k8s-pod-network.b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0" Sep 5 23:55:01.232995 containerd[2021]: 2025-09-05 23:55:01.192 [INFO][6204] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" HandleID="k8s-pod-network.b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Workload="ip--172--31--18--129-k8s-coredns--7c65d6cfc9--6vhqh-eth0" Sep 5 23:55:01.232995 containerd[2021]: 2025-09-05 23:55:01.214 [INFO][6204] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:55:01.232995 containerd[2021]: 2025-09-05 23:55:01.225 [INFO][6197] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9" Sep 5 23:55:01.232995 containerd[2021]: time="2025-09-05T23:55:01.232944184Z" level=info msg="TearDown network for sandbox \"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\" successfully" Sep 5 23:55:01.263872 containerd[2021]: time="2025-09-05T23:55:01.262825696Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 5 23:55:01.263872 containerd[2021]: time="2025-09-05T23:55:01.263079736Z" level=info msg="RemovePodSandbox \"b403d3b6284cea852e84d3ddb284ad6781b42bb92062a6d96a0359b8268f5fa9\" returns successfully" Sep 5 23:55:01.989365 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3550252962.mount: Deactivated successfully. 
Sep 5 23:55:02.979476 containerd[2021]: time="2025-09-05T23:55:02.979396760Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:55:02.981566 containerd[2021]: time="2025-09-05T23:55:02.981282620Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332" Sep 5 23:55:02.983832 containerd[2021]: time="2025-09-05T23:55:02.983759540Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:55:02.991635 containerd[2021]: time="2025-09-05T23:55:02.990676796Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:55:02.993615 containerd[2021]: time="2025-09-05T23:55:02.993562136Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 6.15889881s" Sep 5 23:55:02.993787 containerd[2021]: time="2025-09-05T23:55:02.993757748Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\"" Sep 5 23:55:02.996022 containerd[2021]: time="2025-09-05T23:55:02.995963180Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 5 23:55:03.000475 containerd[2021]: time="2025-09-05T23:55:03.000279868Z" level=info msg="CreateContainer within sandbox \"8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 5 23:55:03.029896 containerd[2021]: time="2025-09-05T23:55:03.029840705Z" level=info msg="CreateContainer within sandbox \"8590114b1e6ebace1f056b394b9f54005b4c33f95ac01aab2c6ec88a0a3a8f52\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"226d93cbd3edfd886e7d142cf687f184f82a09e3b9813dd1be0e993b378a5adf\"" Sep 5 23:55:03.033376 containerd[2021]: time="2025-09-05T23:55:03.031844993Z" level=info msg="StartContainer for \"226d93cbd3edfd886e7d142cf687f184f82a09e3b9813dd1be0e993b378a5adf\"" Sep 5 23:55:03.113091 systemd[1]: run-containerd-runc-k8s.io-226d93cbd3edfd886e7d142cf687f184f82a09e3b9813dd1be0e993b378a5adf-runc.yobQh6.mount: Deactivated successfully. Sep 5 23:55:03.127893 systemd[1]: Started cri-containerd-226d93cbd3edfd886e7d142cf687f184f82a09e3b9813dd1be0e993b378a5adf.scope - libcontainer container 226d93cbd3edfd886e7d142cf687f184f82a09e3b9813dd1be0e993b378a5adf. 
Sep 5 23:55:03.243641 containerd[2021]: time="2025-09-05T23:55:03.242880846Z" level=info msg="StartContainer for \"226d93cbd3edfd886e7d142cf687f184f82a09e3b9813dd1be0e993b378a5adf\" returns successfully" Sep 5 23:55:03.442843 kubelet[3339]: I0905 23:55:03.441692 3339 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-s9vkn" podStartSLOduration=26.160716573 podStartE2EDuration="35.441663475s" podCreationTimestamp="2025-09-05 23:54:28 +0000 UTC" firstStartedPulling="2025-09-05 23:54:53.71477347 +0000 UTC m=+57.295836093" lastFinishedPulling="2025-09-05 23:55:02.9957203 +0000 UTC m=+66.576782995" observedRunningTime="2025-09-05 23:55:03.439239511 +0000 UTC m=+67.020302158" watchObservedRunningTime="2025-09-05 23:55:03.441663475 +0000 UTC m=+67.022726098" Sep 5 23:55:04.546382 containerd[2021]: time="2025-09-05T23:55:04.545943080Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:55:04.548668 containerd[2021]: time="2025-09-05T23:55:04.548562176Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 5 23:55:04.553019 containerd[2021]: time="2025-09-05T23:55:04.552815048Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:55:04.562218 containerd[2021]: time="2025-09-05T23:55:04.562122740Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:55:04.565436 containerd[2021]: time="2025-09-05T23:55:04.565350824Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.569323456s" Sep 5 23:55:04.565436 containerd[2021]: time="2025-09-05T23:55:04.565431692Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 5 23:55:04.573558 containerd[2021]: time="2025-09-05T23:55:04.571967624Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 5 23:55:04.573558 containerd[2021]: time="2025-09-05T23:55:04.573079664Z" level=info msg="CreateContainer within sandbox \"fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 5 23:55:04.614245 containerd[2021]: time="2025-09-05T23:55:04.614166476Z" level=info msg="CreateContainer within sandbox \"fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"658f314d93e4fc19920b64d827aec8d40ac6fa2c87294cddfdf5b830902e4c01\"" Sep 5 23:55:04.617997 containerd[2021]: time="2025-09-05T23:55:04.617919032Z" level=info msg="StartContainer for \"658f314d93e4fc19920b64d827aec8d40ac6fa2c87294cddfdf5b830902e4c01\"" Sep 5 23:55:04.761876 systemd[1]: Started cri-containerd-658f314d93e4fc19920b64d827aec8d40ac6fa2c87294cddfdf5b830902e4c01.scope - libcontainer container 
658f314d93e4fc19920b64d827aec8d40ac6fa2c87294cddfdf5b830902e4c01. Sep 5 23:55:04.856454 containerd[2021]: time="2025-09-05T23:55:04.852918634Z" level=info msg="StartContainer for \"658f314d93e4fc19920b64d827aec8d40ac6fa2c87294cddfdf5b830902e4c01\" returns successfully" Sep 5 23:55:04.930408 containerd[2021]: time="2025-09-05T23:55:04.930280762Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:55:04.933585 containerd[2021]: time="2025-09-05T23:55:04.932743582Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 5 23:55:04.938789 containerd[2021]: time="2025-09-05T23:55:04.938678650Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 366.63995ms" Sep 5 23:55:04.939028 containerd[2021]: time="2025-09-05T23:55:04.938990842Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 5 23:55:04.942166 containerd[2021]: time="2025-09-05T23:55:04.941819242Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 5 23:55:04.947007 containerd[2021]: time="2025-09-05T23:55:04.945247438Z" level=info msg="CreateContainer within sandbox \"76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 5 23:55:04.972088 containerd[2021]: time="2025-09-05T23:55:04.971950510Z" level=info msg="CreateContainer within sandbox \"76ba762feb8beb164b99534cc4a5516fa77e0de4d94a0f8baf85f5ec52cf0ae4\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"102c317730d08f8d8fb32550814242a7cf5f715ca1266e704cd9809e6b5597ff\"" Sep 5 23:55:04.973417 containerd[2021]: time="2025-09-05T23:55:04.973214134Z" level=info msg="StartContainer for \"102c317730d08f8d8fb32550814242a7cf5f715ca1266e704cd9809e6b5597ff\"" Sep 5 23:55:05.035843 systemd[1]: Started cri-containerd-102c317730d08f8d8fb32550814242a7cf5f715ca1266e704cd9809e6b5597ff.scope - libcontainer container 102c317730d08f8d8fb32550814242a7cf5f715ca1266e704cd9809e6b5597ff. Sep 5 23:55:05.109243 containerd[2021]: time="2025-09-05T23:55:05.108979543Z" level=info msg="StartContainer for \"102c317730d08f8d8fb32550814242a7cf5f715ca1266e704cd9809e6b5597ff\" returns successfully" Sep 5 23:55:05.440416 kubelet[3339]: I0905 23:55:05.440236 3339 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5f77cc885-9q8qs" podStartSLOduration=39.753910705 podStartE2EDuration="49.440214681s" podCreationTimestamp="2025-09-05 23:54:16 +0000 UTC" firstStartedPulling="2025-09-05 23:54:55.254017714 +0000 UTC m=+58.835080337" lastFinishedPulling="2025-09-05 23:55:04.940321702 +0000 UTC m=+68.521384313" observedRunningTime="2025-09-05 23:55:05.440009517 +0000 UTC m=+69.021072164" watchObservedRunningTime="2025-09-05 23:55:05.440214681 +0000 UTC m=+69.021277304" Sep 5 23:55:05.578862 systemd[1]: Started sshd@10-172.31.18.129:22-139.178.68.195:38878.service - OpenSSH per-connection server daemon (139.178.68.195:38878). 
Sep 5 23:55:05.771021 sshd[6384]: Accepted publickey for core from 139.178.68.195 port 38878 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM Sep 5 23:55:05.774799 sshd[6384]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 23:55:05.783756 systemd-logind[1997]: New session 11 of user core. Sep 5 23:55:05.792829 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 5 23:55:06.078889 sshd[6384]: pam_unix(sshd:session): session closed for user core Sep 5 23:55:06.087213 systemd[1]: sshd@10-172.31.18.129:22-139.178.68.195:38878.service: Deactivated successfully. Sep 5 23:55:06.092207 systemd[1]: session-11.scope: Deactivated successfully. Sep 5 23:55:06.097963 systemd-logind[1997]: Session 11 logged out. Waiting for processes to exit. Sep 5 23:55:06.102044 systemd-logind[1997]: Removed session 11. Sep 5 23:55:06.426649 kubelet[3339]: I0905 23:55:06.426472 3339 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 5 23:55:07.433447 kubelet[3339]: I0905 23:55:07.433308 3339 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 5 23:55:08.685701 containerd[2021]: time="2025-09-05T23:55:08.685567213Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:55:08.688325 containerd[2021]: time="2025-09-05T23:55:08.688272781Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 5 23:55:08.691561 containerd[2021]: time="2025-09-05T23:55:08.690501853Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:55:08.698475 containerd[2021]: time="2025-09-05T23:55:08.698418865Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 3.756534535s" Sep 5 23:55:08.698791 containerd[2021]: time="2025-09-05T23:55:08.698760253Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 5 23:55:08.699015 containerd[2021]: time="2025-09-05T23:55:08.698717029Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:55:08.702571 containerd[2021]: time="2025-09-05T23:55:08.702496417Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 5 23:55:08.743535 containerd[2021]: time="2025-09-05T23:55:08.743464501Z" level=info msg="CreateContainer within sandbox \"88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 5 23:55:08.790952 containerd[2021]: time="2025-09-05T23:55:08.790880209Z" level=info msg="CreateContainer within sandbox \"88181d94e492e6c478e061116eb21ad0ad2e3a63e03de900aa5fc9eb0c0c3a56\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns 
container id \"13e7d52befaccceb5f5e76fcea775fb8d1392877946b64cb3c12f57c8a65319f\"" Sep 5 23:55:08.792571 containerd[2021]: time="2025-09-05T23:55:08.792165901Z" level=info msg="StartContainer for \"13e7d52befaccceb5f5e76fcea775fb8d1392877946b64cb3c12f57c8a65319f\"" Sep 5 23:55:08.881045 systemd[1]: Started cri-containerd-13e7d52befaccceb5f5e76fcea775fb8d1392877946b64cb3c12f57c8a65319f.scope - libcontainer container 13e7d52befaccceb5f5e76fcea775fb8d1392877946b64cb3c12f57c8a65319f. Sep 5 23:55:08.999070 containerd[2021]: time="2025-09-05T23:55:08.998941346Z" level=info msg="StartContainer for \"13e7d52befaccceb5f5e76fcea775fb8d1392877946b64cb3c12f57c8a65319f\" returns successfully" Sep 5 23:55:09.631020 kubelet[3339]: I0905 23:55:09.629329 3339 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5f9f4498b8-wxd6q" podStartSLOduration=29.213623278 podStartE2EDuration="42.629308129s" podCreationTimestamp="2025-09-05 23:54:27 +0000 UTC" firstStartedPulling="2025-09-05 23:54:55.286613998 +0000 UTC m=+58.867676621" lastFinishedPulling="2025-09-05 23:55:08.702298849 +0000 UTC m=+72.283361472" observedRunningTime="2025-09-05 23:55:09.489932893 +0000 UTC m=+73.070995528" watchObservedRunningTime="2025-09-05 23:55:09.629308129 +0000 UTC m=+73.210370740" Sep 5 23:55:10.417948 containerd[2021]: time="2025-09-05T23:55:10.417884617Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:55:10.419504 containerd[2021]: time="2025-09-05T23:55:10.419448829Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208" Sep 5 23:55:10.421593 containerd[2021]: time="2025-09-05T23:55:10.420312529Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:55:10.425554 containerd[2021]: time="2025-09-05T23:55:10.425470177Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:55:10.426651 containerd[2021]: time="2025-09-05T23:55:10.426584377Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 1.722999968s" Sep 5 23:55:10.426761 containerd[2021]: time="2025-09-05T23:55:10.426646153Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\"" Sep 5 23:55:10.433209 containerd[2021]: time="2025-09-05T23:55:10.433130413Z" level=info msg="CreateContainer within sandbox \"fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 5 23:55:10.468705 containerd[2021]: time="2025-09-05T23:55:10.468445514Z" level=info msg="CreateContainer within sandbox \"fb0fd0fa1669a71628466beb3009491f31b77d982434f418dd5a5f8281028083\" for 
&ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"1577faebc392aaf64c4b6b96019ed1e128526facb47657d7b52fafe79c004a8c\"" Sep 5 23:55:10.470718 containerd[2021]: time="2025-09-05T23:55:10.469839866Z" level=info msg="StartContainer for \"1577faebc392aaf64c4b6b96019ed1e128526facb47657d7b52fafe79c004a8c\"" Sep 5 23:55:10.569892 systemd[1]: Started cri-containerd-1577faebc392aaf64c4b6b96019ed1e128526facb47657d7b52fafe79c004a8c.scope - libcontainer container 1577faebc392aaf64c4b6b96019ed1e128526facb47657d7b52fafe79c004a8c. Sep 5 23:55:10.649339 containerd[2021]: time="2025-09-05T23:55:10.649132358Z" level=info msg="StartContainer for \"1577faebc392aaf64c4b6b96019ed1e128526facb47657d7b52fafe79c004a8c\" returns successfully" Sep 5 23:55:10.877451 kubelet[3339]: I0905 23:55:10.877405 3339 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 5 23:55:10.878499 kubelet[3339]: I0905 23:55:10.877466 3339 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 5 23:55:11.119066 systemd[1]: Started sshd@11-172.31.18.129:22-139.178.68.195:58994.service - OpenSSH per-connection server daemon (139.178.68.195:58994). Sep 5 23:55:11.303586 sshd[6547]: Accepted publickey for core from 139.178.68.195 port 58994 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM Sep 5 23:55:11.307072 sshd[6547]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 23:55:11.316974 systemd-logind[1997]: New session 12 of user core. Sep 5 23:55:11.322821 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 5 23:55:11.623874 sshd[6547]: pam_unix(sshd:session): session closed for user core Sep 5 23:55:11.631308 systemd[1]: sshd@11-172.31.18.129:22-139.178.68.195:58994.service: Deactivated successfully. Sep 5 23:55:11.636987 systemd[1]: session-12.scope: Deactivated successfully. Sep 5 23:55:11.639253 systemd-logind[1997]: Session 12 logged out. Waiting for processes to exit. Sep 5 23:55:11.644177 systemd-logind[1997]: Removed session 12. Sep 5 23:55:16.668118 systemd[1]: Started sshd@12-172.31.18.129:22-139.178.68.195:59006.service - OpenSSH per-connection server daemon (139.178.68.195:59006). Sep 5 23:55:16.848653 sshd[6563]: Accepted publickey for core from 139.178.68.195 port 59006 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM Sep 5 23:55:16.856943 sshd[6563]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 23:55:16.878632 systemd-logind[1997]: New session 13 of user core. Sep 5 23:55:16.883857 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 5 23:55:17.149704 sshd[6563]: pam_unix(sshd:session): session closed for user core Sep 5 23:55:17.158053 systemd[1]: sshd@12-172.31.18.129:22-139.178.68.195:59006.service: Deactivated successfully. Sep 5 23:55:17.164456 systemd[1]: session-13.scope: Deactivated successfully. Sep 5 23:55:17.166657 systemd-logind[1997]: Session 13 logged out. Waiting for processes to exit. Sep 5 23:55:17.168509 systemd-logind[1997]: Removed session 13. Sep 5 23:55:17.186454 systemd[1]: Started sshd@13-172.31.18.129:22-139.178.68.195:59008.service - OpenSSH per-connection server daemon (139.178.68.195:59008). 
Sep 5 23:55:17.377952 sshd[6576]: Accepted publickey for core from 139.178.68.195 port 59008 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM Sep 5 23:55:17.380802 sshd[6576]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 23:55:17.391723 systemd-logind[1997]: New session 14 of user core. Sep 5 23:55:17.396796 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 5 23:55:17.753739 sshd[6576]: pam_unix(sshd:session): session closed for user core Sep 5 23:55:17.762990 systemd[1]: sshd@13-172.31.18.129:22-139.178.68.195:59008.service: Deactivated successfully. Sep 5 23:55:17.768405 systemd[1]: session-14.scope: Deactivated successfully. Sep 5 23:55:17.773193 systemd-logind[1997]: Session 14 logged out. Waiting for processes to exit. Sep 5 23:55:17.802738 systemd[1]: Started sshd@14-172.31.18.129:22-139.178.68.195:59024.service - OpenSSH per-connection server daemon (139.178.68.195:59024). Sep 5 23:55:17.804661 systemd-logind[1997]: Removed session 14. Sep 5 23:55:17.984866 sshd[6588]: Accepted publickey for core from 139.178.68.195 port 59024 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM Sep 5 23:55:17.987735 sshd[6588]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 23:55:17.996676 systemd-logind[1997]: New session 15 of user core. Sep 5 23:55:18.004856 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 5 23:55:18.313387 sshd[6588]: pam_unix(sshd:session): session closed for user core Sep 5 23:55:18.322736 systemd-logind[1997]: Session 15 logged out. Waiting for processes to exit. Sep 5 23:55:18.323771 systemd[1]: sshd@14-172.31.18.129:22-139.178.68.195:59024.service: Deactivated successfully. Sep 5 23:55:18.332621 systemd[1]: session-15.scope: Deactivated successfully. Sep 5 23:55:18.337564 systemd-logind[1997]: Removed session 15. Sep 5 23:55:23.362236 systemd[1]: Started sshd@15-172.31.18.129:22-139.178.68.195:57988.service - OpenSSH per-connection server daemon (139.178.68.195:57988). Sep 5 23:55:23.547367 sshd[6605]: Accepted publickey for core from 139.178.68.195 port 57988 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM Sep 5 23:55:23.550378 sshd[6605]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 23:55:23.561546 systemd-logind[1997]: New session 16 of user core. Sep 5 23:55:23.567848 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 5 23:55:23.841922 sshd[6605]: pam_unix(sshd:session): session closed for user core Sep 5 23:55:23.850438 systemd-logind[1997]: Session 16 logged out. Waiting for processes to exit. Sep 5 23:55:23.851384 systemd[1]: sshd@15-172.31.18.129:22-139.178.68.195:57988.service: Deactivated successfully. Sep 5 23:55:23.855478 systemd[1]: session-16.scope: Deactivated successfully. Sep 5 23:55:23.863043 systemd-logind[1997]: Removed session 16. 
Sep 5 23:55:24.593976 kubelet[3339]: I0905 23:55:24.593766 3339 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 5 23:55:24.634106 kubelet[3339]: I0905 23:55:24.633656 3339 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-xr256" podStartSLOduration=41.143527138 podStartE2EDuration="57.63362914s" podCreationTimestamp="2025-09-05 23:54:27 +0000 UTC" firstStartedPulling="2025-09-05 23:54:53.938730983 +0000 UTC m=+57.519793606" lastFinishedPulling="2025-09-05 23:55:10.428832985 +0000 UTC m=+74.009895608" observedRunningTime="2025-09-05 23:55:11.489985563 +0000 UTC m=+75.071048210" watchObservedRunningTime="2025-09-05 23:55:24.63362914 +0000 UTC m=+88.214691775" Sep 5 23:55:27.480595 systemd[1]: run-containerd-runc-k8s.io-9e16f9bf37a4d1c84d54611020423880706f5a22f6780ea32832c9058820b91b-runc.lpKWto.mount: Deactivated successfully. Sep 5 23:55:28.879208 systemd[1]: Started sshd@16-172.31.18.129:22-139.178.68.195:57996.service - OpenSSH per-connection server daemon (139.178.68.195:57996). Sep 5 23:55:29.087412 sshd[6648]: Accepted publickey for core from 139.178.68.195 port 57996 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM Sep 5 23:55:29.091801 sshd[6648]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 23:55:29.108721 systemd-logind[1997]: New session 17 of user core. Sep 5 23:55:29.117849 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 5 23:55:29.510662 sshd[6648]: pam_unix(sshd:session): session closed for user core Sep 5 23:55:29.525310 systemd[1]: sshd@16-172.31.18.129:22-139.178.68.195:57996.service: Deactivated successfully. Sep 5 23:55:29.536686 systemd[1]: session-17.scope: Deactivated successfully. Sep 5 23:55:29.539653 systemd-logind[1997]: Session 17 logged out. Waiting for processes to exit. Sep 5 23:55:29.544677 systemd-logind[1997]: Removed session 17. Sep 5 23:55:34.556768 systemd[1]: Started sshd@17-172.31.18.129:22-139.178.68.195:37404.service - OpenSSH per-connection server daemon (139.178.68.195:37404). Sep 5 23:55:34.731728 sshd[6662]: Accepted publickey for core from 139.178.68.195 port 37404 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM Sep 5 23:55:34.734379 sshd[6662]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 23:55:34.743337 systemd-logind[1997]: New session 18 of user core. Sep 5 23:55:34.750819 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 5 23:55:35.010887 sshd[6662]: pam_unix(sshd:session): session closed for user core Sep 5 23:55:35.017272 systemd[1]: sshd@17-172.31.18.129:22-139.178.68.195:37404.service: Deactivated successfully. Sep 5 23:55:35.022581 systemd[1]: session-18.scope: Deactivated successfully. Sep 5 23:55:35.024148 systemd-logind[1997]: Session 18 logged out. Waiting for processes to exit. Sep 5 23:55:35.028308 systemd-logind[1997]: Removed session 18. Sep 5 23:55:40.056093 systemd[1]: Started sshd@18-172.31.18.129:22-139.178.68.195:46224.service - OpenSSH per-connection server daemon (139.178.68.195:46224). Sep 5 23:55:40.240640 sshd[6739]: Accepted publickey for core from 139.178.68.195 port 46224 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM Sep 5 23:55:40.243259 sshd[6739]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 23:55:40.251678 systemd-logind[1997]: New session 19 of user core. Sep 5 23:55:40.257823 systemd[1]: Started session-19.scope - Session 19 of User core. 
Sep 5 23:55:40.619205 sshd[6739]: pam_unix(sshd:session): session closed for user core Sep 5 23:55:40.627160 systemd[1]: sshd@18-172.31.18.129:22-139.178.68.195:46224.service: Deactivated successfully. Sep 5 23:55:40.632218 systemd[1]: session-19.scope: Deactivated successfully. Sep 5 23:55:40.640024 systemd-logind[1997]: Session 19 logged out. Waiting for processes to exit. Sep 5 23:55:40.660254 systemd-logind[1997]: Removed session 19. Sep 5 23:55:40.670702 systemd[1]: Started sshd@19-172.31.18.129:22-139.178.68.195:46238.service - OpenSSH per-connection server daemon (139.178.68.195:46238). Sep 5 23:55:40.881986 sshd[6752]: Accepted publickey for core from 139.178.68.195 port 46238 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM Sep 5 23:55:40.885685 sshd[6752]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 23:55:40.895511 systemd-logind[1997]: New session 20 of user core. Sep 5 23:55:40.903883 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 5 23:55:41.599901 sshd[6752]: pam_unix(sshd:session): session closed for user core Sep 5 23:55:41.607656 systemd[1]: sshd@19-172.31.18.129:22-139.178.68.195:46238.service: Deactivated successfully. Sep 5 23:55:41.613624 systemd[1]: session-20.scope: Deactivated successfully. Sep 5 23:55:41.616658 systemd-logind[1997]: Session 20 logged out. Waiting for processes to exit. Sep 5 23:55:41.636660 systemd-logind[1997]: Removed session 20. Sep 5 23:55:41.649713 systemd[1]: Started sshd@20-172.31.18.129:22-139.178.68.195:46242.service - OpenSSH per-connection server daemon (139.178.68.195:46242). Sep 5 23:55:41.836413 sshd[6763]: Accepted publickey for core from 139.178.68.195 port 46242 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM Sep 5 23:55:41.840623 sshd[6763]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 23:55:41.852017 systemd-logind[1997]: New session 21 of user core. Sep 5 23:55:41.863851 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 5 23:55:45.932239 update_engine[1998]: I20250905 23:55:45.930689 1998 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Sep 5 23:55:45.932239 update_engine[1998]: I20250905 23:55:45.930771 1998 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Sep 5 23:55:45.932239 update_engine[1998]: I20250905 23:55:45.931252 1998 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Sep 5 23:55:45.932239 update_engine[1998]: I20250905 23:55:45.932133 1998 omaha_request_params.cc:62] Current group set to lts Sep 5 23:55:45.935027 update_engine[1998]: I20250905 23:55:45.934964 1998 update_attempter.cc:499] Already updated boot flags. Skipping. Sep 5 23:55:45.935225 update_engine[1998]: I20250905 23:55:45.935191 1998 update_attempter.cc:643] Scheduling an action processor start. 
Sep 5 23:55:45.935427 locksmithd[2041]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Sep 5 23:55:45.938187 update_engine[1998]: I20250905 23:55:45.937588 1998 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 5 23:55:45.938187 update_engine[1998]: I20250905 23:55:45.937704 1998 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Sep 5 23:55:45.938187 update_engine[1998]: I20250905 23:55:45.937829 1998 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 5 23:55:45.938187 update_engine[1998]: I20250905 23:55:45.937848 1998 omaha_request_action.cc:272] Request: Sep 5 23:55:45.938187 update_engine[1998]: [Omaha request XML body not preserved in this capture] Sep 5 23:55:45.938187 update_engine[1998]: I20250905 23:55:45.937866 1998 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 5 23:55:45.953556 update_engine[1998]: I20250905 23:55:45.951104 1998 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 5 23:55:45.953556 update_engine[1998]: I20250905 23:55:45.951711 1998 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 5 23:55:45.995917 update_engine[1998]: E20250905 23:55:45.995848 1998 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 5 23:55:45.996550 update_engine[1998]: I20250905 23:55:45.996168 1998 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Sep 5 23:55:46.417842 sshd[6763]: pam_unix(sshd:session): session closed for user core Sep 5 23:55:46.425883 systemd-logind[1997]: Session 21 logged out. Waiting for processes to exit. Sep 5 23:55:46.427490 systemd[1]: sshd@20-172.31.18.129:22-139.178.68.195:46242.service: Deactivated successfully. Sep 5 23:55:46.435624 systemd[1]: session-21.scope: Deactivated successfully. Sep 5 23:55:46.435988 systemd[1]: session-21.scope: Consumed 1.167s CPU time. Sep 5 23:55:46.530704 systemd[1]: Started sshd@21-172.31.18.129:22-139.178.68.195:46258.service - OpenSSH per-connection server daemon (139.178.68.195:46258). Sep 5 23:55:46.533364 systemd-logind[1997]: Removed session 21. Sep 5 23:55:46.724835 sshd[6784]: Accepted publickey for core from 139.178.68.195 port 46258 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM Sep 5 23:55:46.727151 sshd[6784]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 23:55:46.737205 systemd-logind[1997]: New session 22 of user core. Sep 5 23:55:46.747825 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 5 23:55:47.528882 sshd[6784]: pam_unix(sshd:session): session closed for user core Sep 5 23:55:47.541006 systemd-logind[1997]: Session 22 logged out. Waiting for processes to exit. Sep 5 23:55:47.543160 systemd[1]: sshd@21-172.31.18.129:22-139.178.68.195:46258.service: Deactivated successfully. Sep 5 23:55:47.553211 systemd[1]: session-22.scope: Deactivated successfully. Sep 5 23:55:47.585021 systemd[1]: Started sshd@22-172.31.18.129:22-139.178.68.195:46272.service - OpenSSH per-connection server daemon (139.178.68.195:46272). Sep 5 23:55:47.587441 systemd-logind[1997]: Removed session 22.
Sep 5 23:55:47.794279 sshd[6797]: Accepted publickey for core from 139.178.68.195 port 46272 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM Sep 5 23:55:47.797070 sshd[6797]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 23:55:47.812632 systemd-logind[1997]: New session 23 of user core. Sep 5 23:55:47.817828 systemd[1]: Started session-23.scope - Session 23 of User core. Sep 5 23:55:48.145168 sshd[6797]: pam_unix(sshd:session): session closed for user core Sep 5 23:55:48.152648 systemd[1]: sshd@22-172.31.18.129:22-139.178.68.195:46272.service: Deactivated successfully. Sep 5 23:55:48.157470 systemd[1]: session-23.scope: Deactivated successfully. Sep 5 23:55:48.168032 systemd-logind[1997]: Session 23 logged out. Waiting for processes to exit. Sep 5 23:55:48.172660 systemd-logind[1997]: Removed session 23. Sep 5 23:55:53.186073 systemd[1]: Started sshd@23-172.31.18.129:22-139.178.68.195:44842.service - OpenSSH per-connection server daemon (139.178.68.195:44842). Sep 5 23:55:53.381444 sshd[6810]: Accepted publickey for core from 139.178.68.195 port 44842 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM Sep 5 23:55:53.384354 sshd[6810]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 23:55:53.397346 systemd-logind[1997]: New session 24 of user core. Sep 5 23:55:53.403032 systemd[1]: Started session-24.scope - Session 24 of User core. Sep 5 23:55:53.772887 sshd[6810]: pam_unix(sshd:session): session closed for user core Sep 5 23:55:53.781956 systemd[1]: sshd@23-172.31.18.129:22-139.178.68.195:44842.service: Deactivated successfully. Sep 5 23:55:53.786928 systemd[1]: session-24.scope: Deactivated successfully. Sep 5 23:55:53.795710 systemd-logind[1997]: Session 24 logged out. Waiting for processes to exit. Sep 5 23:55:53.799148 systemd-logind[1997]: Removed session 24. Sep 5 23:55:55.928660 update_engine[1998]: I20250905 23:55:55.928567 1998 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 5 23:55:55.929338 update_engine[1998]: I20250905 23:55:55.928928 1998 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 5 23:55:55.929338 update_engine[1998]: I20250905 23:55:55.929224 1998 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 5 23:55:55.934064 update_engine[1998]: E20250905 23:55:55.933979 1998 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 5 23:55:55.934184 update_engine[1998]: I20250905 23:55:55.934098 1998 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Sep 5 23:55:57.481992 systemd[1]: run-containerd-runc-k8s.io-9e16f9bf37a4d1c84d54611020423880706f5a22f6780ea32832c9058820b91b-runc.9ISkUc.mount: Deactivated successfully. Sep 5 23:55:58.823186 systemd[1]: Started sshd@24-172.31.18.129:22-139.178.68.195:44848.service - OpenSSH per-connection server daemon (139.178.68.195:44848). Sep 5 23:55:59.020793 sshd[6871]: Accepted publickey for core from 139.178.68.195 port 44848 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM Sep 5 23:55:59.025948 sshd[6871]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 23:55:59.039879 systemd-logind[1997]: New session 25 of user core. Sep 5 23:55:59.045833 systemd[1]: Started session-25.scope - Session 25 of User core. Sep 5 23:55:59.324145 sshd[6871]: pam_unix(sshd:session): session closed for user core Sep 5 23:55:59.334133 systemd[1]: sshd@24-172.31.18.129:22-139.178.68.195:44848.service: Deactivated successfully. 
Sep 5 23:55:59.344118 systemd[1]: session-25.scope: Deactivated successfully. Sep 5 23:55:59.347331 systemd-logind[1997]: Session 25 logged out. Waiting for processes to exit. Sep 5 23:55:59.351646 systemd-logind[1997]: Removed session 25. Sep 5 23:56:04.371121 systemd[1]: Started sshd@25-172.31.18.129:22-139.178.68.195:42974.service - OpenSSH per-connection server daemon (139.178.68.195:42974). Sep 5 23:56:04.553548 sshd[6886]: Accepted publickey for core from 139.178.68.195 port 42974 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM Sep 5 23:56:04.556818 sshd[6886]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 23:56:04.565454 systemd-logind[1997]: New session 26 of user core. Sep 5 23:56:04.577236 systemd[1]: Started session-26.scope - Session 26 of User core. Sep 5 23:56:04.869834 sshd[6886]: pam_unix(sshd:session): session closed for user core Sep 5 23:56:04.877911 systemd-logind[1997]: Session 26 logged out. Waiting for processes to exit. Sep 5 23:56:04.879190 systemd[1]: sshd@25-172.31.18.129:22-139.178.68.195:42974.service: Deactivated successfully. Sep 5 23:56:04.884838 systemd[1]: session-26.scope: Deactivated successfully. Sep 5 23:56:04.890818 systemd-logind[1997]: Removed session 26. Sep 5 23:56:05.928379 update_engine[1998]: I20250905 23:56:05.927650 1998 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 5 23:56:05.928379 update_engine[1998]: I20250905 23:56:05.927989 1998 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 5 23:56:05.928379 update_engine[1998]: I20250905 23:56:05.928290 1998 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 5 23:56:05.929642 update_engine[1998]: E20250905 23:56:05.929569 1998 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 5 23:56:05.929936 update_engine[1998]: I20250905 23:56:05.929875 1998 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Sep 5 23:56:06.716598 systemd[1]: run-containerd-runc-k8s.io-226d93cbd3edfd886e7d142cf687f184f82a09e3b9813dd1be0e993b378a5adf-runc.ZiPrTR.mount: Deactivated successfully. Sep 5 23:56:09.912725 systemd[1]: Started sshd@26-172.31.18.129:22-139.178.68.195:42984.service - OpenSSH per-connection server daemon (139.178.68.195:42984). Sep 5 23:56:10.097836 sshd[6945]: Accepted publickey for core from 139.178.68.195 port 42984 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM Sep 5 23:56:10.100387 sshd[6945]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 23:56:10.108885 systemd-logind[1997]: New session 27 of user core. Sep 5 23:56:10.117186 systemd[1]: Started session-27.scope - Session 27 of User core. Sep 5 23:56:10.396208 sshd[6945]: pam_unix(sshd:session): session closed for user core Sep 5 23:56:10.406999 systemd[1]: sshd@26-172.31.18.129:22-139.178.68.195:42984.service: Deactivated successfully. Sep 5 23:56:10.414460 systemd[1]: session-27.scope: Deactivated successfully. Sep 5 23:56:10.417171 systemd-logind[1997]: Session 27 logged out. Waiting for processes to exit. Sep 5 23:56:10.420846 systemd-logind[1997]: Removed session 27. Sep 5 23:56:15.440058 systemd[1]: Started sshd@27-172.31.18.129:22-139.178.68.195:35982.service - OpenSSH per-connection server daemon (139.178.68.195:35982). 
Sep 5 23:56:15.633685 sshd[6959]: Accepted publickey for core from 139.178.68.195 port 35982 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM Sep 5 23:56:15.635389 sshd[6959]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 23:56:15.650156 systemd-logind[1997]: New session 28 of user core. Sep 5 23:56:15.657857 systemd[1]: Started session-28.scope - Session 28 of User core. Sep 5 23:56:15.929430 update_engine[1998]: I20250905 23:56:15.928573 1998 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 5 23:56:15.929430 update_engine[1998]: I20250905 23:56:15.928938 1998 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 5 23:56:15.929430 update_engine[1998]: I20250905 23:56:15.929287 1998 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 5 23:56:15.932334 update_engine[1998]: E20250905 23:56:15.932241 1998 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 5 23:56:15.936245 update_engine[1998]: I20250905 23:56:15.932626 1998 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 5 23:56:15.936245 update_engine[1998]: I20250905 23:56:15.932662 1998 omaha_request_action.cc:617] Omaha request response: Sep 5 23:56:15.936245 update_engine[1998]: E20250905 23:56:15.932787 1998 omaha_request_action.cc:636] Omaha request network transfer failed. Sep 5 23:56:15.936245 update_engine[1998]: I20250905 23:56:15.932827 1998 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Sep 5 23:56:15.936245 update_engine[1998]: I20250905 23:56:15.932847 1998 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 5 23:56:15.936245 update_engine[1998]: I20250905 23:56:15.932861 1998 update_attempter.cc:306] Processing Done. Sep 5 23:56:15.936245 update_engine[1998]: E20250905 23:56:15.932890 1998 update_attempter.cc:619] Update failed. Sep 5 23:56:15.936245 update_engine[1998]: I20250905 23:56:15.932909 1998 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Sep 5 23:56:15.936245 update_engine[1998]: I20250905 23:56:15.932926 1998 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Sep 5 23:56:15.936245 update_engine[1998]: I20250905 23:56:15.932945 1998 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
Sep 5 23:56:15.936245 update_engine[1998]: I20250905 23:56:15.933066 1998 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 5 23:56:15.936245 update_engine[1998]: I20250905 23:56:15.933107 1998 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 5 23:56:15.936245 update_engine[1998]: I20250905 23:56:15.933125 1998 omaha_request_action.cc:272] Request: Sep 5 23:56:15.936245 update_engine[1998]: [Omaha request XML body not preserved in this capture] Sep 5 23:56:15.937227 update_engine[1998]: I20250905 23:56:15.933143 1998 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 5 23:56:15.937227 update_engine[1998]: I20250905 23:56:15.933418 1998 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 5 23:56:15.937227 update_engine[1998]: I20250905 23:56:15.935860 1998 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 5 23:56:15.938563 locksmithd[2041]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Sep 5 23:56:15.938563 locksmithd[2041]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Sep 5 23:56:15.939860 update_engine[1998]: E20250905 23:56:15.937654 1998 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 5 23:56:15.939860 update_engine[1998]: I20250905 23:56:15.937757 1998 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 5 23:56:15.939860 update_engine[1998]: I20250905 23:56:15.937779 1998 omaha_request_action.cc:617] Omaha request response: Sep 5 23:56:15.939860 update_engine[1998]: I20250905 23:56:15.937798 1998 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 5 23:56:15.939860 update_engine[1998]: I20250905 23:56:15.937815 1998 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 5 23:56:15.939860 update_engine[1998]: I20250905 23:56:15.937831 1998 update_attempter.cc:306] Processing Done. Sep 5 23:56:15.939860 update_engine[1998]: I20250905 23:56:15.937849 1998 update_attempter.cc:310] Error event sent. Sep 5 23:56:15.939860 update_engine[1998]: I20250905 23:56:15.937877 1998 update_check_scheduler.cc:74] Next update check in 43m17s Sep 5 23:56:15.972841 sshd[6959]: pam_unix(sshd:session): session closed for user core Sep 5 23:56:15.983408 systemd[1]: sshd@27-172.31.18.129:22-139.178.68.195:35982.service: Deactivated successfully. Sep 5 23:56:15.989771 systemd[1]: session-28.scope: Deactivated successfully. Sep 5 23:56:15.994162 systemd-logind[1997]: Session 28 logged out. Waiting for processes to exit. Sep 5 23:56:15.997232 systemd-logind[1997]: Removed session 28. Sep 5 23:56:21.016488 systemd[1]: Started sshd@28-172.31.18.129:22-139.178.68.195:47034.service - OpenSSH per-connection server daemon (139.178.68.195:47034). Sep 5 23:56:21.212309 sshd[6987]: Accepted publickey for core from 139.178.68.195 port 47034 ssh2: RSA SHA256:vADW7QTWQ4wuHdKF8jUL6KxfiBYQUAY2qUkO4wqdhJM Sep 5 23:56:21.215741 sshd[6987]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 23:56:21.225169 systemd-logind[1997]: New session 29 of user core. Sep 5 23:56:21.236220 systemd[1]: Started session-29.scope - Session 29 of User core.
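Taken together, the update_engine and locksmithd entries above trace one complete failed update cycle: the Omaha request fails with transfer error 2000, which is converted to kActionCodeOmahaErrorInHTTPResponse (logged as payload error code 37), a single error event is reported, and the attempter goes idle until the next scheduled check 43m17s later. A small Python sketch of the status sequence, reconstructed only from the entries in this log:

    from enum import Enum

    # CurrentOperation values locksmithd reports during this cycle, in order.
    class Op(Enum):
        CHECKING_FOR_UPDATE = "UPDATE_STATUS_CHECKING_FOR_UPDATE"
        REPORTING_ERROR_EVENT = "UPDATE_STATUS_REPORTING_ERROR_EVENT"
        IDLE = "UPDATE_STATUS_IDLE"

    # Error mapping seen above: transfer error 2000 becomes
    # kActionCodeOmahaErrorInHTTPResponse, logged as payload error code 37.
    ERROR_CODE_OMAHA_HTTP = 2000
    PAYLOAD_ERROR_CODE = 37

    # The cycle observed here: check fails -> one error event -> idle.
    OBSERVED_CYCLE = [Op.CHECKING_FOR_UPDATE, Op.REPORTING_ERROR_EVENT, Op.IDLE]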
Sep 5 23:56:21.533723 sshd[6987]: pam_unix(sshd:session): session closed for user core Sep 5 23:56:21.539093 systemd-logind[1997]: Session 29 logged out. Waiting for processes to exit. Sep 5 23:56:21.544287 systemd[1]: sshd@28-172.31.18.129:22-139.178.68.195:47034.service: Deactivated successfully. Sep 5 23:56:21.550347 systemd[1]: session-29.scope: Deactivated successfully. Sep 5 23:56:21.556280 systemd-logind[1997]: Removed session 29. Sep 5 23:56:35.590370 systemd[1]: cri-containerd-5677ed74fc3c12e2473aa1e907d4f3244d11a9b7df6657e1fd1a7864e04f1a15.scope: Deactivated successfully. Sep 5 23:56:35.591935 systemd[1]: cri-containerd-5677ed74fc3c12e2473aa1e907d4f3244d11a9b7df6657e1fd1a7864e04f1a15.scope: Consumed 28.461s CPU time. Sep 5 23:56:35.651470 containerd[2021]: time="2025-09-05T23:56:35.650771845Z" level=info msg="shim disconnected" id=5677ed74fc3c12e2473aa1e907d4f3244d11a9b7df6657e1fd1a7864e04f1a15 namespace=k8s.io Sep 5 23:56:35.651470 containerd[2021]: time="2025-09-05T23:56:35.650871997Z" level=warning msg="cleaning up after shim disconnected" id=5677ed74fc3c12e2473aa1e907d4f3244d11a9b7df6657e1fd1a7864e04f1a15 namespace=k8s.io Sep 5 23:56:35.651470 containerd[2021]: time="2025-09-05T23:56:35.650896429Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 5 23:56:35.658507 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5677ed74fc3c12e2473aa1e907d4f3244d11a9b7df6657e1fd1a7864e04f1a15-rootfs.mount: Deactivated successfully. Sep 5 23:56:35.760668 systemd[1]: cri-containerd-cf491c778f449126347361774265b862e053f56ce2cb4e3191d62721854afc84.scope: Deactivated successfully. Sep 5 23:56:35.761187 systemd[1]: cri-containerd-cf491c778f449126347361774265b862e053f56ce2cb4e3191d62721854afc84.scope: Consumed 5.604s CPU time, 19.7M memory peak, 0B memory swap peak. Sep 5 23:56:35.793701 kubelet[3339]: I0905 23:56:35.791757 3339 scope.go:117] "RemoveContainer" containerID="5677ed74fc3c12e2473aa1e907d4f3244d11a9b7df6657e1fd1a7864e04f1a15" Sep 5 23:56:35.797226 containerd[2021]: time="2025-09-05T23:56:35.797015257Z" level=info msg="CreateContainer within sandbox \"e1a78c6af4c1c460e87ead1aac6aa097802849f4cf793d2cbe38b8139cd70f73\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Sep 5 23:56:35.829542 containerd[2021]: time="2025-09-05T23:56:35.828262730Z" level=info msg="shim disconnected" id=cf491c778f449126347361774265b862e053f56ce2cb4e3191d62721854afc84 namespace=k8s.io Sep 5 23:56:35.829542 containerd[2021]: time="2025-09-05T23:56:35.828389726Z" level=warning msg="cleaning up after shim disconnected" id=cf491c778f449126347361774265b862e053f56ce2cb4e3191d62721854afc84 namespace=k8s.io Sep 5 23:56:35.829542 containerd[2021]: time="2025-09-05T23:56:35.828413726Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 5 23:56:35.833611 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cf491c778f449126347361774265b862e053f56ce2cb4e3191d62721854afc84-rootfs.mount: Deactivated successfully. 
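The "shim disconnected" / "cleaning up after shim disconnected" pairs above are containerd's normal teardown path once systemd deactivates a container's scope; the 64-hex-digit id ties each teardown to the matching scope and rootfs-unmount lines around it. An illustrative way to pull those ids out of a captured journal like this one (not part of containerd's tooling):

    import re

    # Illustrative helper: collect the container ids whose shims disconnected
    # in a captured journal, so the scope/rootfs entries can be correlated.
    SHIM_RE = re.compile(r'msg="shim disconnected" id=([0-9a-f]{64})')

    def exited_container_ids(journal_text: str) -> set[str]:
        return set(SHIM_RE.findall(journal_text))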
Sep 5 23:56:35.843079 containerd[2021]: time="2025-09-05T23:56:35.840336026Z" level=info msg="CreateContainer within sandbox \"e1a78c6af4c1c460e87ead1aac6aa097802849f4cf793d2cbe38b8139cd70f73\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"50f75f74016f49208853cbf5b929e1c47e1394e51429fe874350c715b13353a1\"" Sep 5 23:56:35.843079 containerd[2021]: time="2025-09-05T23:56:35.842139494Z" level=info msg="StartContainer for \"50f75f74016f49208853cbf5b929e1c47e1394e51429fe874350c715b13353a1\"" Sep 5 23:56:35.919861 systemd[1]: Started cri-containerd-50f75f74016f49208853cbf5b929e1c47e1394e51429fe874350c715b13353a1.scope - libcontainer container 50f75f74016f49208853cbf5b929e1c47e1394e51429fe874350c715b13353a1. Sep 5 23:56:35.971173 containerd[2021]: time="2025-09-05T23:56:35.971103158Z" level=info msg="StartContainer for \"50f75f74016f49208853cbf5b929e1c47e1394e51429fe874350c715b13353a1\" returns successfully" Sep 5 23:56:36.803363 kubelet[3339]: I0905 23:56:36.803292 3339 scope.go:117] "RemoveContainer" containerID="cf491c778f449126347361774265b862e053f56ce2cb4e3191d62721854afc84" Sep 5 23:56:36.809420 containerd[2021]: time="2025-09-05T23:56:36.809239898Z" level=info msg="CreateContainer within sandbox \"418bc25fb6c33d2b868488c7fe5af49d94c0ec04d979c5e8b5c15278f411e23d\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Sep 5 23:56:36.846127 containerd[2021]: time="2025-09-05T23:56:36.845961975Z" level=info msg="CreateContainer within sandbox \"418bc25fb6c33d2b868488c7fe5af49d94c0ec04d979c5e8b5c15278f411e23d\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"ed7116a9151545b9e8efcd0695bf54fcf364d7cdb501b7c826c5ec3d3bb1e5e6\"" Sep 5 23:56:36.847586 containerd[2021]: time="2025-09-05T23:56:36.846941679Z" level=info msg="StartContainer for \"ed7116a9151545b9e8efcd0695bf54fcf364d7cdb501b7c826c5ec3d3bb1e5e6\"" Sep 5 23:56:36.906859 systemd[1]: Started cri-containerd-ed7116a9151545b9e8efcd0695bf54fcf364d7cdb501b7c826c5ec3d3bb1e5e6.scope - libcontainer container ed7116a9151545b9e8efcd0695bf54fcf364d7cdb501b7c826c5ec3d3bb1e5e6. Sep 5 23:56:36.979131 containerd[2021]: time="2025-09-05T23:56:36.979061751Z" level=info msg="StartContainer for \"ed7116a9151545b9e8efcd0695bf54fcf364d7cdb501b7c826c5ec3d3bb1e5e6\" returns successfully" Sep 5 23:56:39.919053 kubelet[3339]: E0905 23:56:39.918980 3339 controller.go:195] "Failed to update lease" err="Put \"https://172.31.18.129:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-18-129?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 5 23:56:40.857401 systemd[1]: cri-containerd-f354ff90bc91ce4426e0250a1d6edeb9d4b64f14245090046ada83f2bd1315e7.scope: Deactivated successfully. Sep 5 23:56:40.857891 systemd[1]: cri-containerd-f354ff90bc91ce4426e0250a1d6edeb9d4b64f14245090046ada83f2bd1315e7.scope: Consumed 3.286s CPU time, 16.3M memory peak, 0B memory swap peak. 
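The "Failed to update lease" error above is the kubelet's node heartbeat: it periodically updates a coordination.k8s.io/v1 Lease named after the node, and here the apiserver did not answer before the 10-second client timeout visible in the request URL, plausibly because the control-plane containers were being restarted in this same window. A rough sketch of that renewal call using the exact URL from the log; the token and CA path are placeholders, and a real kubelet authenticates with its client certificate and PUTs the complete Lease object rather than this trimmed body.

    import datetime
    import json
    import ssl
    import urllib.request

    # URL copied from the kubelet error above; ?timeout=10s is the server-side
    # cap that pairs with the 10-second client timeout used below.
    LEASE_URL = ("https://172.31.18.129:6443/apis/coordination.k8s.io/v1"
                 "/namespaces/kube-node-lease/leases/ip-172-31-18-129?timeout=10s")

    def renew_node_lease(token: str, cafile: str = "/etc/kubernetes/ca.crt") -> None:
        # Trimmed body: only the renewTime heartbeat field is shown here.
        body = json.dumps({"spec": {"renewTime":
            datetime.datetime.now(datetime.timezone.utc).strftime(
                "%Y-%m-%dT%H:%M:%S.%fZ")}}).encode()
        req = urllib.request.Request(LEASE_URL, data=body, method="PUT")
        req.add_header("Authorization", f"Bearer {token}")   # placeholder auth
        req.add_header("Content-Type", "application/json")
        ctx = ssl.create_default_context(cafile=cafile)      # placeholder CA path
        # Exceeding this timeout produces the "Client.Timeout exceeded while
        # awaiting headers" failure recorded in the log.
        urllib.request.urlopen(req, timeout=10, context=ctx)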
Sep 5 23:56:40.899824 containerd[2021]: time="2025-09-05T23:56:40.899698999Z" level=info msg="shim disconnected" id=f354ff90bc91ce4426e0250a1d6edeb9d4b64f14245090046ada83f2bd1315e7 namespace=k8s.io Sep 5 23:56:40.901773 containerd[2021]: time="2025-09-05T23:56:40.900508699Z" level=warning msg="cleaning up after shim disconnected" id=f354ff90bc91ce4426e0250a1d6edeb9d4b64f14245090046ada83f2bd1315e7 namespace=k8s.io Sep 5 23:56:40.901773 containerd[2021]: time="2025-09-05T23:56:40.900588331Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 5 23:56:40.906423 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f354ff90bc91ce4426e0250a1d6edeb9d4b64f14245090046ada83f2bd1315e7-rootfs.mount: Deactivated successfully. Sep 5 23:56:41.830147 kubelet[3339]: I0905 23:56:41.830103 3339 scope.go:117] "RemoveContainer" containerID="f354ff90bc91ce4426e0250a1d6edeb9d4b64f14245090046ada83f2bd1315e7" Sep 5 23:56:41.833681 containerd[2021]: time="2025-09-05T23:56:41.833624395Z" level=info msg="CreateContainer within sandbox \"5655339c6b5762c1447b80e5e75589c3ce40dd757844fce9795aa0b179be7584\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Sep 5 23:56:41.864753 containerd[2021]: time="2025-09-05T23:56:41.864578816Z" level=info msg="CreateContainer within sandbox \"5655339c6b5762c1447b80e5e75589c3ce40dd757844fce9795aa0b179be7584\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"5a64386a90c75e46250d27078859c10f77a4f60541d418e65fcbd10615907fb8\"" Sep 5 23:56:41.865736 containerd[2021]: time="2025-09-05T23:56:41.865688324Z" level=info msg="StartContainer for \"5a64386a90c75e46250d27078859c10f77a4f60541d418e65fcbd10615907fb8\"" Sep 5 23:56:41.947830 systemd[1]: Started cri-containerd-5a64386a90c75e46250d27078859c10f77a4f60541d418e65fcbd10615907fb8.scope - libcontainer container 5a64386a90c75e46250d27078859c10f77a4f60541d418e65fcbd10615907fb8. Sep 5 23:56:42.010666 containerd[2021]: time="2025-09-05T23:56:42.010537672Z" level=info msg="StartContainer for \"5a64386a90c75e46250d27078859c10f77a4f60541d418e65fcbd10615907fb8\" returns successfully" Sep 5 23:56:47.447770 systemd[1]: cri-containerd-50f75f74016f49208853cbf5b929e1c47e1394e51429fe874350c715b13353a1.scope: Deactivated successfully. Sep 5 23:56:47.492945 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-50f75f74016f49208853cbf5b929e1c47e1394e51429fe874350c715b13353a1-rootfs.mount: Deactivated successfully. 
Sep 5 23:56:47.497596 containerd[2021]: time="2025-09-05T23:56:47.497246375Z" level=info msg="shim disconnected" id=50f75f74016f49208853cbf5b929e1c47e1394e51429fe874350c715b13353a1 namespace=k8s.io Sep 5 23:56:47.497596 containerd[2021]: time="2025-09-05T23:56:47.497320379Z" level=warning msg="cleaning up after shim disconnected" id=50f75f74016f49208853cbf5b929e1c47e1394e51429fe874350c715b13353a1 namespace=k8s.io Sep 5 23:56:47.497596 containerd[2021]: time="2025-09-05T23:56:47.497340527Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 5 23:56:47.853210 kubelet[3339]: I0905 23:56:47.852565 3339 scope.go:117] "RemoveContainer" containerID="5677ed74fc3c12e2473aa1e907d4f3244d11a9b7df6657e1fd1a7864e04f1a15" Sep 5 23:56:47.853210 kubelet[3339]: I0905 23:56:47.852971 3339 scope.go:117] "RemoveContainer" containerID="50f75f74016f49208853cbf5b929e1c47e1394e51429fe874350c715b13353a1" Sep 5 23:56:47.853210 kubelet[3339]: E0905 23:56:47.853153 3339 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-58fc44c59b-bszf7_tigera-operator(6d1cfcad-ca17-4937-a600-96cb0524911d)\"" pod="tigera-operator/tigera-operator-58fc44c59b-bszf7" podUID="6d1cfcad-ca17-4937-a600-96cb0524911d" Sep 5 23:56:47.856287 containerd[2021]: time="2025-09-05T23:56:47.856204381Z" level=info msg="RemoveContainer for \"5677ed74fc3c12e2473aa1e907d4f3244d11a9b7df6657e1fd1a7864e04f1a15\"" Sep 5 23:56:47.863583 containerd[2021]: time="2025-09-05T23:56:47.863019157Z" level=info msg="RemoveContainer for \"5677ed74fc3c12e2473aa1e907d4f3244d11a9b7df6657e1fd1a7864e04f1a15\" returns successfully" Sep 5 23:56:49.919398 kubelet[3339]: E0905 23:56:49.919221 3339 controller.go:195] "Failed to update lease" err="Put \"https://172.31.18.129:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-18-129?timeout=10s\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
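The closing entries show why tigera-operator is not restarted immediately: after its replacement container (50f75f74...) also exits, the kubelet places the pod in CrashLoopBackOff, waiting 10 seconds before the next restart attempt (the "back-off 10s" above) and doubling that delay on each subsequent crash, while the lease-renewal timeouts continue. A sketch of that backoff schedule; the 10-second base and 5-minute cap are standard kubelet defaults, not values read from this log.

    # CrashLoopBackOff delay schedule as the kubelet applies it by default:
    # start at 10s, double per consecutive crash, cap at 5 minutes (assumed
    # defaults; only the initial 10s appears in this log).
    def crashloop_delays(crashes: int, base: float = 10.0, cap: float = 300.0):
        delay = base
        for _ in range(crashes):
            yield min(delay, cap)
            delay *= 2

    print(list(crashloop_delays(6)))  # [10.0, 20.0, 40.0, 80.0, 160.0, 300.0]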