Sep 6 00:08:39.240362 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd083] Sep 6 00:08:39.240407 kernel: Linux version 6.6.103-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT Fri Sep 5 22:30:47 -00 2025 Sep 6 00:08:39.240432 kernel: KASLR disabled due to lack of seed Sep 6 00:08:39.240449 kernel: efi: EFI v2.7 by EDK II Sep 6 00:08:39.240465 kernel: efi: SMBIOS=0x7bed0000 SMBIOS 3.0=0x7beb0000 ACPI=0x786e0000 ACPI 2.0=0x786e0014 MEMATTR=0x7affea98 MEMRESERVE=0x7852ee18 Sep 6 00:08:39.240480 kernel: ACPI: Early table checksum verification disabled Sep 6 00:08:39.240498 kernel: ACPI: RSDP 0x00000000786E0014 000024 (v02 AMAZON) Sep 6 00:08:39.240539 kernel: ACPI: XSDT 0x00000000786D00E8 000064 (v01 AMAZON AMZNFACP 00000001 01000013) Sep 6 00:08:39.240557 kernel: ACPI: FACP 0x00000000786B0000 000114 (v06 AMAZON AMZNFACP 00000001 AMZN 00000001) Sep 6 00:08:39.240573 kernel: ACPI: DSDT 0x0000000078640000 00159D (v02 AMAZON AMZNDSDT 00000001 INTL 20160527) Sep 6 00:08:39.240596 kernel: ACPI: APIC 0x00000000786C0000 000108 (v04 AMAZON AMZNAPIC 00000001 AMZN 00000001) Sep 6 00:08:39.240612 kernel: ACPI: SPCR 0x00000000786A0000 000050 (v02 AMAZON AMZNSPCR 00000001 AMZN 00000001) Sep 6 00:08:39.240628 kernel: ACPI: GTDT 0x0000000078690000 000060 (v02 AMAZON AMZNGTDT 00000001 AMZN 00000001) Sep 6 00:08:39.240644 kernel: ACPI: MCFG 0x0000000078680000 00003C (v02 AMAZON AMZNMCFG 00000001 AMZN 00000001) Sep 6 00:08:39.240663 kernel: ACPI: SLIT 0x0000000078670000 00002D (v01 AMAZON AMZNSLIT 00000001 AMZN 00000001) Sep 6 00:08:39.240684 kernel: ACPI: IORT 0x0000000078660000 000078 (v01 AMAZON AMZNIORT 00000001 AMZN 00000001) Sep 6 00:08:39.240701 kernel: ACPI: PPTT 0x0000000078650000 0000EC (v01 AMAZON AMZNPPTT 00000001 AMZN 00000001) Sep 6 00:08:39.240718 kernel: ACPI: SPCR: console: uart,mmio,0x90a0000,115200 Sep 6 00:08:39.240735 kernel: 
earlycon: uart0 at MMIO 0x00000000090a0000 (options '115200') Sep 6 00:08:39.240751 kernel: printk: bootconsole [uart0] enabled Sep 6 00:08:39.240768 kernel: NUMA: Failed to initialise from firmware Sep 6 00:08:39.240785 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000004b5ffffff] Sep 6 00:08:39.240802 kernel: NUMA: NODE_DATA [mem 0x4b583f800-0x4b5844fff] Sep 6 00:08:39.240818 kernel: Zone ranges: Sep 6 00:08:39.240835 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] Sep 6 00:08:39.240852 kernel: DMA32 empty Sep 6 00:08:39.240872 kernel: Normal [mem 0x0000000100000000-0x00000004b5ffffff] Sep 6 00:08:39.240889 kernel: Movable zone start for each node Sep 6 00:08:39.240905 kernel: Early memory node ranges Sep 6 00:08:39.240922 kernel: node 0: [mem 0x0000000040000000-0x000000007862ffff] Sep 6 00:08:39.240938 kernel: node 0: [mem 0x0000000078630000-0x000000007863ffff] Sep 6 00:08:39.240955 kernel: node 0: [mem 0x0000000078640000-0x00000000786effff] Sep 6 00:08:39.240971 kernel: node 0: [mem 0x00000000786f0000-0x000000007872ffff] Sep 6 00:08:39.240988 kernel: node 0: [mem 0x0000000078730000-0x000000007bbfffff] Sep 6 00:08:39.241004 kernel: node 0: [mem 0x000000007bc00000-0x000000007bfdffff] Sep 6 00:08:39.241021 kernel: node 0: [mem 0x000000007bfe0000-0x000000007fffffff] Sep 6 00:08:39.241038 kernel: node 0: [mem 0x0000000400000000-0x00000004b5ffffff] Sep 6 00:08:39.241054 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000004b5ffffff] Sep 6 00:08:39.241075 kernel: On node 0, zone Normal: 8192 pages in unavailable ranges Sep 6 00:08:39.241093 kernel: psci: probing for conduit method from ACPI. Sep 6 00:08:39.241117 kernel: psci: PSCIv1.0 detected in firmware. 
Sep 6 00:08:39.241135 kernel: psci: Using standard PSCI v0.2 function IDs Sep 6 00:08:39.241152 kernel: psci: Trusted OS migration not required Sep 6 00:08:39.241173 kernel: psci: SMC Calling Convention v1.1 Sep 6 00:08:39.241192 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000001) Sep 6 00:08:39.241209 kernel: percpu: Embedded 31 pages/cpu s86632 r8192 d32152 u126976 Sep 6 00:08:39.241227 kernel: pcpu-alloc: s86632 r8192 d32152 u126976 alloc=31*4096 Sep 6 00:08:39.241245 kernel: pcpu-alloc: [0] 0 [0] 1 Sep 6 00:08:39.241263 kernel: Detected PIPT I-cache on CPU0 Sep 6 00:08:39.241281 kernel: CPU features: detected: GIC system register CPU interface Sep 6 00:08:39.241298 kernel: CPU features: detected: Spectre-v2 Sep 6 00:08:39.241315 kernel: CPU features: detected: Spectre-v3a Sep 6 00:08:39.241333 kernel: CPU features: detected: Spectre-BHB Sep 6 00:08:39.241350 kernel: CPU features: detected: ARM erratum 1742098 Sep 6 00:08:39.241372 kernel: CPU features: detected: ARM errata 1165522, 1319367, or 1530923 Sep 6 00:08:39.241390 kernel: alternatives: applying boot alternatives Sep 6 00:08:39.241411 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=ac831c89fe9ee7829b7371dadfb138f8d0e2b31ae3a5a920e0eba13bbab016c3 Sep 6 00:08:39.243612 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. 
Sep 6 00:08:39.243659 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Sep 6 00:08:39.243681 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 6 00:08:39.243700 kernel: Fallback order for Node 0: 0 Sep 6 00:08:39.243718 kernel: Built 1 zonelists, mobility grouping on. Total pages: 991872 Sep 6 00:08:39.243736 kernel: Policy zone: Normal Sep 6 00:08:39.243754 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 6 00:08:39.243771 kernel: software IO TLB: area num 2. Sep 6 00:08:39.243800 kernel: software IO TLB: mapped [mem 0x000000007c000000-0x0000000080000000] (64MB) Sep 6 00:08:39.243818 kernel: Memory: 3820088K/4030464K available (10304K kernel code, 2186K rwdata, 8108K rodata, 39424K init, 897K bss, 210376K reserved, 0K cma-reserved) Sep 6 00:08:39.243837 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Sep 6 00:08:39.243855 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 6 00:08:39.243873 kernel: rcu: RCU event tracing is enabled. Sep 6 00:08:39.243891 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Sep 6 00:08:39.243910 kernel: Trampoline variant of Tasks RCU enabled. Sep 6 00:08:39.243927 kernel: Tracing variant of Tasks RCU enabled. Sep 6 00:08:39.243945 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Sep 6 00:08:39.243963 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Sep 6 00:08:39.243981 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Sep 6 00:08:39.244003 kernel: GICv3: 96 SPIs implemented Sep 6 00:08:39.244021 kernel: GICv3: 0 Extended SPIs implemented Sep 6 00:08:39.244039 kernel: Root IRQ handler: gic_handle_irq Sep 6 00:08:39.244057 kernel: GICv3: GICv3 features: 16 PPIs Sep 6 00:08:39.244076 kernel: GICv3: CPU0: found redistributor 0 region 0:0x0000000010200000 Sep 6 00:08:39.244094 kernel: ITS [mem 0x10080000-0x1009ffff] Sep 6 00:08:39.244112 kernel: ITS@0x0000000010080000: allocated 8192 Devices @4000b0000 (indirect, esz 8, psz 64K, shr 1) Sep 6 00:08:39.244131 kernel: ITS@0x0000000010080000: allocated 8192 Interrupt Collections @4000c0000 (flat, esz 8, psz 64K, shr 1) Sep 6 00:08:39.244149 kernel: GICv3: using LPI property table @0x00000004000d0000 Sep 6 00:08:39.244167 kernel: ITS: Using hypervisor restricted LPI range [128] Sep 6 00:08:39.244184 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000004000e0000 Sep 6 00:08:39.244202 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 6 00:08:39.244224 kernel: arch_timer: cp15 timer(s) running at 83.33MHz (virt). Sep 6 00:08:39.244242 kernel: clocksource: arch_sys_counter: mask: 0x1ffffffffffffff max_cycles: 0x13381ebeec, max_idle_ns: 440795203145 ns Sep 6 00:08:39.244260 kernel: sched_clock: 57 bits at 83MHz, resolution 12ns, wraps every 4398046511100ns Sep 6 00:08:39.244299 kernel: Console: colour dummy device 80x25 Sep 6 00:08:39.244319 kernel: printk: console [tty1] enabled Sep 6 00:08:39.244338 kernel: ACPI: Core revision 20230628 Sep 6 00:08:39.244357 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 
166.66 BogoMIPS (lpj=83333) Sep 6 00:08:39.244375 kernel: pid_max: default: 32768 minimum: 301 Sep 6 00:08:39.244394 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Sep 6 00:08:39.244433 kernel: landlock: Up and running. Sep 6 00:08:39.244455 kernel: SELinux: Initializing. Sep 6 00:08:39.244473 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 6 00:08:39.244492 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 6 00:08:39.244547 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Sep 6 00:08:39.244574 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Sep 6 00:08:39.244593 kernel: rcu: Hierarchical SRCU implementation. Sep 6 00:08:39.244612 kernel: rcu: Max phase no-delay instances is 400. Sep 6 00:08:39.244631 kernel: Platform MSI: ITS@0x10080000 domain created Sep 6 00:08:39.244655 kernel: PCI/MSI: ITS@0x10080000 domain created Sep 6 00:08:39.244674 kernel: Remapping and enabling EFI services. Sep 6 00:08:39.244692 kernel: smp: Bringing up secondary CPUs ... Sep 6 00:08:39.244710 kernel: Detected PIPT I-cache on CPU1 Sep 6 00:08:39.244728 kernel: GICv3: CPU1: found redistributor 1 region 0:0x0000000010220000 Sep 6 00:08:39.244762 kernel: GICv3: CPU1: using allocated LPI pending table @0x00000004000f0000 Sep 6 00:08:39.244782 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd083] Sep 6 00:08:39.244800 kernel: smp: Brought up 1 node, 2 CPUs Sep 6 00:08:39.244818 kernel: SMP: Total of 2 processors activated. 
Sep 6 00:08:39.244836 kernel: CPU features: detected: 32-bit EL0 Support Sep 6 00:08:39.244861 kernel: CPU features: detected: 32-bit EL1 Support Sep 6 00:08:39.244880 kernel: CPU features: detected: CRC32 instructions Sep 6 00:08:39.244910 kernel: CPU: All CPU(s) started at EL1 Sep 6 00:08:39.244932 kernel: alternatives: applying system-wide alternatives Sep 6 00:08:39.244951 kernel: devtmpfs: initialized Sep 6 00:08:39.244970 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 6 00:08:39.244989 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Sep 6 00:08:39.245007 kernel: pinctrl core: initialized pinctrl subsystem Sep 6 00:08:39.245026 kernel: SMBIOS 3.0.0 present. Sep 6 00:08:39.245049 kernel: DMI: Amazon EC2 a1.large/, BIOS 1.0 11/1/2018 Sep 6 00:08:39.245067 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 6 00:08:39.245086 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Sep 6 00:08:39.245105 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Sep 6 00:08:39.245124 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Sep 6 00:08:39.245143 kernel: audit: initializing netlink subsys (disabled) Sep 6 00:08:39.245162 kernel: audit: type=2000 audit(0.285:1): state=initialized audit_enabled=0 res=1 Sep 6 00:08:39.245184 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 6 00:08:39.245203 kernel: cpuidle: using governor menu Sep 6 00:08:39.245222 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. 
Sep 6 00:08:39.245240 kernel: ASID allocator initialised with 65536 entries Sep 6 00:08:39.245259 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 6 00:08:39.245277 kernel: Serial: AMBA PL011 UART driver Sep 6 00:08:39.245295 kernel: Modules: 17488 pages in range for non-PLT usage Sep 6 00:08:39.245314 kernel: Modules: 509008 pages in range for PLT usage Sep 6 00:08:39.245333 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 6 00:08:39.245355 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Sep 6 00:08:39.245374 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Sep 6 00:08:39.245393 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Sep 6 00:08:39.245411 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 6 00:08:39.245430 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Sep 6 00:08:39.245448 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Sep 6 00:08:39.245467 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Sep 6 00:08:39.245485 kernel: ACPI: Added _OSI(Module Device) Sep 6 00:08:39.247596 kernel: ACPI: Added _OSI(Processor Device) Sep 6 00:08:39.247661 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 6 00:08:39.247683 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Sep 6 00:08:39.247703 kernel: ACPI: Interpreter enabled Sep 6 00:08:39.247721 kernel: ACPI: Using GIC for interrupt routing Sep 6 00:08:39.247740 kernel: ACPI: MCFG table detected, 1 entries Sep 6 00:08:39.247758 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-0f]) Sep 6 00:08:39.248048 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 6 00:08:39.248257 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Sep 6 00:08:39.248492 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Sep 6 00:08:39.248715 kernel: acpi 
PNP0A08:00: ECAM area [mem 0x20000000-0x20ffffff] reserved by PNP0C02:00 Sep 6 00:08:39.248909 kernel: acpi PNP0A08:00: ECAM at [mem 0x20000000-0x20ffffff] for [bus 00-0f] Sep 6 00:08:39.248935 kernel: ACPI: Remapped I/O 0x000000001fff0000 to [io 0x0000-0xffff window] Sep 6 00:08:39.248955 kernel: acpiphp: Slot [1] registered Sep 6 00:08:39.248973 kernel: acpiphp: Slot [2] registered Sep 6 00:08:39.248992 kernel: acpiphp: Slot [3] registered Sep 6 00:08:39.249010 kernel: acpiphp: Slot [4] registered Sep 6 00:08:39.249035 kernel: acpiphp: Slot [5] registered Sep 6 00:08:39.249054 kernel: acpiphp: Slot [6] registered Sep 6 00:08:39.249072 kernel: acpiphp: Slot [7] registered Sep 6 00:08:39.249103 kernel: acpiphp: Slot [8] registered Sep 6 00:08:39.249125 kernel: acpiphp: Slot [9] registered Sep 6 00:08:39.249144 kernel: acpiphp: Slot [10] registered Sep 6 00:08:39.249163 kernel: acpiphp: Slot [11] registered Sep 6 00:08:39.249181 kernel: acpiphp: Slot [12] registered Sep 6 00:08:39.249199 kernel: acpiphp: Slot [13] registered Sep 6 00:08:39.249218 kernel: acpiphp: Slot [14] registered Sep 6 00:08:39.249242 kernel: acpiphp: Slot [15] registered Sep 6 00:08:39.249260 kernel: acpiphp: Slot [16] registered Sep 6 00:08:39.249279 kernel: acpiphp: Slot [17] registered Sep 6 00:08:39.249297 kernel: acpiphp: Slot [18] registered Sep 6 00:08:39.249315 kernel: acpiphp: Slot [19] registered Sep 6 00:08:39.249333 kernel: acpiphp: Slot [20] registered Sep 6 00:08:39.249352 kernel: acpiphp: Slot [21] registered Sep 6 00:08:39.249370 kernel: acpiphp: Slot [22] registered Sep 6 00:08:39.249388 kernel: acpiphp: Slot [23] registered Sep 6 00:08:39.249410 kernel: acpiphp: Slot [24] registered Sep 6 00:08:39.249429 kernel: acpiphp: Slot [25] registered Sep 6 00:08:39.249447 kernel: acpiphp: Slot [26] registered Sep 6 00:08:39.249465 kernel: acpiphp: Slot [27] registered Sep 6 00:08:39.249484 kernel: acpiphp: Slot [28] registered Sep 6 00:08:39.251861 kernel: acpiphp: Slot [29] registered 
Sep 6 00:08:39.251911 kernel: acpiphp: Slot [30] registered Sep 6 00:08:39.251931 kernel: acpiphp: Slot [31] registered Sep 6 00:08:39.251950 kernel: PCI host bridge to bus 0000:00 Sep 6 00:08:39.252197 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xffffffff window] Sep 6 00:08:39.252435 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Sep 6 00:08:39.252719 kernel: pci_bus 0000:00: root bus resource [mem 0x400000000000-0x407fffffffff window] Sep 6 00:08:39.252907 kernel: pci_bus 0000:00: root bus resource [bus 00-0f] Sep 6 00:08:39.253181 kernel: pci 0000:00:00.0: [1d0f:0200] type 00 class 0x060000 Sep 6 00:08:39.253420 kernel: pci 0000:00:01.0: [1d0f:8250] type 00 class 0x070003 Sep 6 00:08:39.253703 kernel: pci 0000:00:01.0: reg 0x10: [mem 0x80118000-0x80118fff] Sep 6 00:08:39.253926 kernel: pci 0000:00:04.0: [1d0f:8061] type 00 class 0x010802 Sep 6 00:08:39.254125 kernel: pci 0000:00:04.0: reg 0x10: [mem 0x80114000-0x80117fff] Sep 6 00:08:39.254323 kernel: pci 0000:00:04.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 6 00:08:39.254553 kernel: pci 0000:00:05.0: [1d0f:ec20] type 00 class 0x020000 Sep 6 00:08:39.254758 kernel: pci 0000:00:05.0: reg 0x10: [mem 0x80110000-0x80113fff] Sep 6 00:08:39.254955 kernel: pci 0000:00:05.0: reg 0x18: [mem 0x80000000-0x800fffff pref] Sep 6 00:08:39.255190 kernel: pci 0000:00:05.0: reg 0x20: [mem 0x80100000-0x8010ffff] Sep 6 00:08:39.255466 kernel: pci 0000:00:05.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 6 00:08:39.255741 kernel: pci 0000:00:05.0: BAR 2: assigned [mem 0x80000000-0x800fffff pref] Sep 6 00:08:39.255946 kernel: pci 0000:00:05.0: BAR 4: assigned [mem 0x80100000-0x8010ffff] Sep 6 00:08:39.256148 kernel: pci 0000:00:04.0: BAR 0: assigned [mem 0x80110000-0x80113fff] Sep 6 00:08:39.256371 kernel: pci 0000:00:05.0: BAR 0: assigned [mem 0x80114000-0x80117fff] Sep 6 00:08:39.256636 kernel: pci 0000:00:01.0: BAR 0: assigned [mem 0x80118000-0x80118fff] Sep 6 00:08:39.256849 kernel: 
pci_bus 0000:00: resource 4 [mem 0x80000000-0xffffffff window] Sep 6 00:08:39.257034 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Sep 6 00:08:39.257216 kernel: pci_bus 0000:00: resource 6 [mem 0x400000000000-0x407fffffffff window] Sep 6 00:08:39.257241 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Sep 6 00:08:39.257260 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Sep 6 00:08:39.257279 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Sep 6 00:08:39.257298 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Sep 6 00:08:39.257317 kernel: iommu: Default domain type: Translated Sep 6 00:08:39.257335 kernel: iommu: DMA domain TLB invalidation policy: strict mode Sep 6 00:08:39.257359 kernel: efivars: Registered efivars operations Sep 6 00:08:39.257377 kernel: vgaarb: loaded Sep 6 00:08:39.257396 kernel: clocksource: Switched to clocksource arch_sys_counter Sep 6 00:08:39.257414 kernel: VFS: Disk quotas dquot_6.6.0 Sep 6 00:08:39.257432 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 6 00:08:39.257453 kernel: pnp: PnP ACPI init Sep 6 00:08:39.257776 kernel: system 00:00: [mem 0x20000000-0x2fffffff] could not be reserved Sep 6 00:08:39.257808 kernel: pnp: PnP ACPI: found 1 devices Sep 6 00:08:39.257836 kernel: NET: Registered PF_INET protocol family Sep 6 00:08:39.257856 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 6 00:08:39.257875 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Sep 6 00:08:39.257895 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 6 00:08:39.257914 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 6 00:08:39.257934 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Sep 6 00:08:39.257953 kernel: TCP: Hash tables configured (established 32768 bind 32768) Sep 6 00:08:39.257972 kernel: UDP 
hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 6 00:08:39.257991 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 6 00:08:39.258016 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 6 00:08:39.258035 kernel: PCI: CLS 0 bytes, default 64 Sep 6 00:08:39.258054 kernel: kvm [1]: HYP mode not available Sep 6 00:08:39.258073 kernel: Initialise system trusted keyrings Sep 6 00:08:39.258094 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Sep 6 00:08:39.258128 kernel: Key type asymmetric registered Sep 6 00:08:39.258168 kernel: Asymmetric key parser 'x509' registered Sep 6 00:08:39.258213 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 6 00:08:39.258258 kernel: io scheduler mq-deadline registered Sep 6 00:08:39.258299 kernel: io scheduler kyber registered Sep 6 00:08:39.258325 kernel: io scheduler bfq registered Sep 6 00:08:39.258627 kernel: pl061_gpio ARMH0061:00: PL061 GPIO chip registered Sep 6 00:08:39.258657 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Sep 6 00:08:39.258677 kernel: ACPI: button: Power Button [PWRB] Sep 6 00:08:39.258696 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input1 Sep 6 00:08:39.258715 kernel: ACPI: button: Sleep Button [SLPB] Sep 6 00:08:39.258734 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 6 00:08:39.258760 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Sep 6 00:08:39.259023 kernel: serial 0000:00:01.0: enabling device (0010 -> 0012) Sep 6 00:08:39.259053 kernel: printk: console [ttyS0] disabled Sep 6 00:08:39.259074 kernel: 0000:00:01.0: ttyS0 at MMIO 0x80118000 (irq = 14, base_baud = 115200) is a 16550A Sep 6 00:08:39.259093 kernel: printk: console [ttyS0] enabled Sep 6 00:08:39.259112 kernel: printk: bootconsole [uart0] disabled Sep 6 00:08:39.259131 kernel: thunder_xcv, ver 1.0 Sep 6 00:08:39.259150 kernel: thunder_bgx, ver 1.0 Sep 6 
00:08:39.259168 kernel: nicpf, ver 1.0 Sep 6 00:08:39.259193 kernel: nicvf, ver 1.0 Sep 6 00:08:39.259412 kernel: rtc-efi rtc-efi.0: registered as rtc0 Sep 6 00:08:39.259658 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-06T00:08:38 UTC (1757117318) Sep 6 00:08:39.259685 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 6 00:08:39.259705 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 3 counters available Sep 6 00:08:39.259724 kernel: watchdog: Delayed init of the lockup detector failed: -19 Sep 6 00:08:39.259742 kernel: watchdog: Hard watchdog permanently disabled Sep 6 00:08:39.259761 kernel: NET: Registered PF_INET6 protocol family Sep 6 00:08:39.259785 kernel: Segment Routing with IPv6 Sep 6 00:08:39.259804 kernel: In-situ OAM (IOAM) with IPv6 Sep 6 00:08:39.259823 kernel: NET: Registered PF_PACKET protocol family Sep 6 00:08:39.259841 kernel: Key type dns_resolver registered Sep 6 00:08:39.259859 kernel: registered taskstats version 1 Sep 6 00:08:39.259878 kernel: Loading compiled-in X.509 certificates Sep 6 00:08:39.259897 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.103-flatcar: 5b16e1dfa86dac534548885fd675b87757ff9e20' Sep 6 00:08:39.259916 kernel: Key type .fscrypt registered Sep 6 00:08:39.259934 kernel: Key type fscrypt-provisioning registered Sep 6 00:08:39.259956 kernel: ima: No TPM chip found, activating TPM-bypass! 
Sep 6 00:08:39.259975 kernel: ima: Allocated hash algorithm: sha1 Sep 6 00:08:39.259994 kernel: ima: No architecture policies found Sep 6 00:08:39.260012 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Sep 6 00:08:39.260031 kernel: clk: Disabling unused clocks Sep 6 00:08:39.260050 kernel: Freeing unused kernel memory: 39424K Sep 6 00:08:39.260068 kernel: Run /init as init process Sep 6 00:08:39.260087 kernel: with arguments: Sep 6 00:08:39.260105 kernel: /init Sep 6 00:08:39.260123 kernel: with environment: Sep 6 00:08:39.260145 kernel: HOME=/ Sep 6 00:08:39.260163 kernel: TERM=linux Sep 6 00:08:39.260181 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 6 00:08:39.260204 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Sep 6 00:08:39.260227 systemd[1]: Detected virtualization amazon. Sep 6 00:08:39.260248 systemd[1]: Detected architecture arm64. Sep 6 00:08:39.260285 systemd[1]: Running in initrd. Sep 6 00:08:39.260315 systemd[1]: No hostname configured, using default hostname. Sep 6 00:08:39.260335 systemd[1]: Hostname set to . Sep 6 00:08:39.260356 systemd[1]: Initializing machine ID from VM UUID. Sep 6 00:08:39.260376 systemd[1]: Queued start job for default target initrd.target. Sep 6 00:08:39.260396 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 6 00:08:39.260417 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 6 00:08:39.260438 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 6 00:08:39.260458 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... 
Sep 6 00:08:39.260483 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 6 00:08:39.260519 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 6 00:08:39.260548 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 6 00:08:39.260569 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 6 00:08:39.260589 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 6 00:08:39.260609 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 6 00:08:39.260630 systemd[1]: Reached target paths.target - Path Units. Sep 6 00:08:39.260655 systemd[1]: Reached target slices.target - Slice Units. Sep 6 00:08:39.260676 systemd[1]: Reached target swap.target - Swaps. Sep 6 00:08:39.260709 systemd[1]: Reached target timers.target - Timer Units. Sep 6 00:08:39.260732 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 6 00:08:39.260752 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 6 00:08:39.260773 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 6 00:08:39.260793 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Sep 6 00:08:39.260813 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 6 00:08:39.260833 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 6 00:08:39.260859 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 6 00:08:39.260879 systemd[1]: Reached target sockets.target - Socket Units. Sep 6 00:08:39.260899 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 6 00:08:39.260920 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... 
Sep 6 00:08:39.260940 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 6 00:08:39.260960 systemd[1]: Starting systemd-fsck-usr.service... Sep 6 00:08:39.260980 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 6 00:08:39.261000 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 6 00:08:39.261024 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 6 00:08:39.261045 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 6 00:08:39.261065 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 6 00:08:39.261085 systemd[1]: Finished systemd-fsck-usr.service. Sep 6 00:08:39.261144 systemd-journald[251]: Collecting audit messages is disabled. Sep 6 00:08:39.261193 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 6 00:08:39.261215 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 6 00:08:39.261235 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 6 00:08:39.261258 systemd-journald[251]: Journal started Sep 6 00:08:39.261296 systemd-journald[251]: Runtime Journal (/run/log/journal/ec2292502920437d3078a6b59afc3845) is 8.0M, max 75.3M, 67.3M free. Sep 6 00:08:39.213186 systemd-modules-load[252]: Inserted module 'overlay' Sep 6 00:08:39.270173 systemd[1]: Started systemd-journald.service - Journal Service. Sep 6 00:08:39.273677 systemd-modules-load[252]: Inserted module 'br_netfilter' Sep 6 00:08:39.275752 kernel: Bridge firewalling registered Sep 6 00:08:39.279347 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 6 00:08:39.283644 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 6 00:08:39.295853 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 6 00:08:39.304742 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 6 00:08:39.308165 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 6 00:08:39.321939 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 6 00:08:39.366578 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 6 00:08:39.372499 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 6 00:08:39.374567 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 6 00:08:39.389856 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 6 00:08:39.400852 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 6 00:08:39.412843 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 6 00:08:39.446539 dracut-cmdline[289]: dracut-dracut-053 Sep 6 00:08:39.450354 dracut-cmdline[289]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=ac831c89fe9ee7829b7371dadfb138f8d0e2b31ae3a5a920e0eba13bbab016c3 Sep 6 00:08:39.488675 systemd-resolved[284]: Positive Trust Anchors: Sep 6 00:08:39.490726 systemd-resolved[284]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 6 00:08:39.490793 systemd-resolved[284]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 6 00:08:39.596547 kernel: SCSI subsystem initialized Sep 6 00:08:39.603530 kernel: Loading iSCSI transport class v2.0-870. Sep 6 00:08:39.616566 kernel: iscsi: registered transport (tcp) Sep 6 00:08:39.639828 kernel: iscsi: registered transport (qla4xxx) Sep 6 00:08:39.639907 kernel: QLogic iSCSI HBA Driver Sep 6 00:08:39.714825 kernel: random: crng init done Sep 6 00:08:39.715124 systemd-resolved[284]: Defaulting to hostname 'linux'. Sep 6 00:08:39.719025 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 6 00:08:39.721759 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 6 00:08:39.744600 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 6 00:08:39.756932 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 6 00:08:39.791154 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Sep 6 00:08:39.791244 kernel: device-mapper: uevent: version 1.0.3
Sep 6 00:08:39.793092 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Sep 6 00:08:39.874525 kernel: raid6: neonx8 gen() 6635 MB/s
Sep 6 00:08:39.875555 kernel: raid6: neonx4 gen() 6496 MB/s
Sep 6 00:08:39.892539 kernel: raid6: neonx2 gen() 5428 MB/s
Sep 6 00:08:39.909538 kernel: raid6: neonx1 gen() 3934 MB/s
Sep 6 00:08:39.926539 kernel: raid6: int64x8 gen() 3776 MB/s
Sep 6 00:08:39.943537 kernel: raid6: int64x4 gen() 3694 MB/s
Sep 6 00:08:39.960538 kernel: raid6: int64x2 gen() 3585 MB/s
Sep 6 00:08:39.978520 kernel: raid6: int64x1 gen() 2758 MB/s
Sep 6 00:08:39.978560 kernel: raid6: using algorithm neonx8 gen() 6635 MB/s
Sep 6 00:08:39.996498 kernel: raid6: .... xor() 4853 MB/s, rmw enabled
Sep 6 00:08:39.996552 kernel: raid6: using neon recovery algorithm
Sep 6 00:08:40.004544 kernel: xor: measuring software checksum speed
Sep 6 00:08:40.005540 kernel: 8regs : 10258 MB/sec
Sep 6 00:08:40.007931 kernel: 32regs : 11002 MB/sec
Sep 6 00:08:40.007963 kernel: arm64_neon : 9286 MB/sec
Sep 6 00:08:40.007989 kernel: xor: using function: 32regs (11002 MB/sec)
Sep 6 00:08:40.093800 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 6 00:08:40.112139 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 6 00:08:40.122919 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 6 00:08:40.156540 systemd-udevd[471]: Using default interface naming scheme 'v255'.
Sep 6 00:08:40.164403 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 6 00:08:40.194089 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 6 00:08:40.215732 dracut-pre-trigger[482]: rd.md=0: removing MD RAID activation
Sep 6 00:08:40.271660 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 6 00:08:40.283934 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 6 00:08:40.401710 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 6 00:08:40.413806 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 6 00:08:40.464184 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 6 00:08:40.470684 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 6 00:08:40.470863 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 6 00:08:40.488451 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 6 00:08:40.498010 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 6 00:08:40.544749 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 6 00:08:40.591070 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
Sep 6 00:08:40.591134 kernel: ena 0000:00:05.0: enabling device (0010 -> 0012)
Sep 6 00:08:40.601315 kernel: ena 0000:00:05.0: ENA device version: 0.10
Sep 6 00:08:40.601654 kernel: ena 0000:00:05.0: ENA controller version: 0.0.1 implementation version 1
Sep 6 00:08:40.617552 kernel: ena 0000:00:05.0: Elastic Network Adapter (ENA) found at mem 80114000, mac addr 06:3e:27:df:02:47
Sep 6 00:08:40.622441 (udev-worker)[530]: Network interface NamePolicy= disabled on kernel command line.
Sep 6 00:08:40.634318 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 6 00:08:40.636702 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 6 00:08:40.647671 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 6 00:08:40.652891 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 6 00:08:40.655875 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 6 00:08:40.656077 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 6 00:08:40.673547 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35
Sep 6 00:08:40.675586 kernel: nvme nvme0: pci function 0000:00:04.0
Sep 6 00:08:40.676937 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 6 00:08:40.684665 kernel: nvme nvme0: 2/0/0 default/read/poll queues
Sep 6 00:08:40.705672 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 6 00:08:40.705753 kernel: GPT:9289727 != 16777215
Sep 6 00:08:40.705780 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 6 00:08:40.707521 kernel: GPT:9289727 != 16777215
Sep 6 00:08:40.708658 kernel: GPT: Use GNU Parted to correct GPT errors.
Sep 6 00:08:40.708695 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Sep 6 00:08:40.713081 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 6 00:08:40.725023 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 6 00:08:40.756768 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 6 00:08:40.817566 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/nvme0n1p6 scanned by (udev-worker) (531)
Sep 6 00:08:40.855586 kernel: BTRFS: device fsid 045c118e-b098-46f0-884a-43665575c70e devid 1 transid 37 /dev/nvme0n1p3 scanned by (udev-worker) (530)
Sep 6 00:08:40.879916 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Amazon Elastic Block Store EFI-SYSTEM.
Sep 6 00:08:40.918908 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Amazon Elastic Block Store ROOT.
Sep 6 00:08:40.965226 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM.
Sep 6 00:08:40.979128 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Amazon Elastic Block Store USR-A.
Sep 6 00:08:40.986068 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Amazon Elastic Block Store USR-A.
Sep 6 00:08:41.005775 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 6 00:08:41.023566 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Sep 6 00:08:41.024325 disk-uuid[661]: Primary Header is updated.
Sep 6 00:08:41.024325 disk-uuid[661]: Secondary Entries is updated.
Sep 6 00:08:41.024325 disk-uuid[661]: Secondary Header is updated.
Sep 6 00:08:41.055535 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Sep 6 00:08:41.063550 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Sep 6 00:08:42.074621 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Sep 6 00:08:42.076179 disk-uuid[662]: The operation has completed successfully.
Sep 6 00:08:42.258201 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 6 00:08:42.260544 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 6 00:08:42.310764 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 6 00:08:42.321346 sh[1003]: Success
Sep 6 00:08:42.347656 kernel: device-mapper: verity: sha256 using implementation "sha256-ce"
Sep 6 00:08:42.474647 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 6 00:08:42.491721 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 6 00:08:42.499629 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 6 00:08:42.529886 kernel: BTRFS info (device dm-0): first mount of filesystem 045c118e-b098-46f0-884a-43665575c70e
Sep 6 00:08:42.529953 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Sep 6 00:08:42.529980 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Sep 6 00:08:42.531424 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 6 00:08:42.532625 kernel: BTRFS info (device dm-0): using free space tree
Sep 6 00:08:42.643540 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Sep 6 00:08:42.681445 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 6 00:08:42.681972 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Sep 6 00:08:42.695913 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 6 00:08:42.698801 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 6 00:08:42.736535 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 7395d4d5-ecb1-4acb-b5a4-3e846eddb858
Sep 6 00:08:42.736630 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm
Sep 6 00:08:42.736664 kernel: BTRFS info (device nvme0n1p6): using free space tree
Sep 6 00:08:42.752553 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations
Sep 6 00:08:42.772450 systemd[1]: mnt-oem.mount: Deactivated successfully.
Sep 6 00:08:42.776438 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 7395d4d5-ecb1-4acb-b5a4-3e846eddb858
Sep 6 00:08:42.785492 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 6 00:08:42.797951 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 6 00:08:42.878570 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 6 00:08:42.890854 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 6 00:08:42.948281 systemd-networkd[1195]: lo: Link UP
Sep 6 00:08:42.948304 systemd-networkd[1195]: lo: Gained carrier
Sep 6 00:08:42.957487 systemd-networkd[1195]: Enumeration completed
Sep 6 00:08:42.960969 systemd-networkd[1195]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 6 00:08:42.960976 systemd-networkd[1195]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 6 00:08:42.965104 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 6 00:08:42.967812 systemd[1]: Reached target network.target - Network.
Sep 6 00:08:42.971367 systemd-networkd[1195]: eth0: Link UP
Sep 6 00:08:42.971374 systemd-networkd[1195]: eth0: Gained carrier
Sep 6 00:08:42.971391 systemd-networkd[1195]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 6 00:08:42.998587 systemd-networkd[1195]: eth0: DHCPv4 address 172.31.26.146/20, gateway 172.31.16.1 acquired from 172.31.16.1
Sep 6 00:08:43.256913 ignition[1130]: Ignition 2.19.0
Sep 6 00:08:43.256941 ignition[1130]: Stage: fetch-offline
Sep 6 00:08:43.260825 ignition[1130]: no configs at "/usr/lib/ignition/base.d"
Sep 6 00:08:43.260865 ignition[1130]: no config dir at "/usr/lib/ignition/base.platform.d/aws"
Sep 6 00:08:43.262760 ignition[1130]: Ignition finished successfully
Sep 6 00:08:43.268859 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 6 00:08:43.279925 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Sep 6 00:08:43.304681 ignition[1205]: Ignition 2.19.0
Sep 6 00:08:43.304708 ignition[1205]: Stage: fetch
Sep 6 00:08:43.306548 ignition[1205]: no configs at "/usr/lib/ignition/base.d"
Sep 6 00:08:43.306574 ignition[1205]: no config dir at "/usr/lib/ignition/base.platform.d/aws"
Sep 6 00:08:43.307814 ignition[1205]: PUT http://169.254.169.254/latest/api/token: attempt #1
Sep 6 00:08:43.322788 ignition[1205]: PUT result: OK
Sep 6 00:08:43.326558 ignition[1205]: parsed url from cmdline: ""
Sep 6 00:08:43.326574 ignition[1205]: no config URL provided
Sep 6 00:08:43.326589 ignition[1205]: reading system config file "/usr/lib/ignition/user.ign"
Sep 6 00:08:43.326614 ignition[1205]: no config at "/usr/lib/ignition/user.ign"
Sep 6 00:08:43.326658 ignition[1205]: PUT http://169.254.169.254/latest/api/token: attempt #1
Sep 6 00:08:43.328534 ignition[1205]: PUT result: OK
Sep 6 00:08:43.328621 ignition[1205]: GET http://169.254.169.254/2019-10-01/user-data: attempt #1
Sep 6 00:08:43.336908 ignition[1205]: GET result: OK
Sep 6 00:08:43.337086 ignition[1205]: parsing config with SHA512: 0244882ffcaec43b73c415bd6db274989fecc6ca4bbfaa2ebcac41a624b57d57037c954bca18fe861e5c375fea4dc82c7461acbcca4e9dae628847fe0fed66ce
Sep 6 00:08:43.350171 unknown[1205]: fetched base config from "system"
Sep 6 00:08:43.352279 unknown[1205]: fetched base config from "system"
Sep 6 00:08:43.352499 unknown[1205]: fetched user config from "aws"
Sep 6 00:08:43.353558 ignition[1205]: fetch: fetch complete
Sep 6 00:08:43.353583 ignition[1205]: fetch: fetch passed
Sep 6 00:08:43.353679 ignition[1205]: Ignition finished successfully
Sep 6 00:08:43.363207 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Sep 6 00:08:43.372862 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 6 00:08:43.402814 ignition[1211]: Ignition 2.19.0
Sep 6 00:08:43.403321 ignition[1211]: Stage: kargs
Sep 6 00:08:43.403994 ignition[1211]: no configs at "/usr/lib/ignition/base.d"
Sep 6 00:08:43.404020 ignition[1211]: no config dir at "/usr/lib/ignition/base.platform.d/aws"
Sep 6 00:08:43.404169 ignition[1211]: PUT http://169.254.169.254/latest/api/token: attempt #1
Sep 6 00:08:43.418875 ignition[1211]: PUT result: OK
Sep 6 00:08:43.427876 ignition[1211]: kargs: kargs passed
Sep 6 00:08:43.427989 ignition[1211]: Ignition finished successfully
Sep 6 00:08:43.433364 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 6 00:08:43.443817 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 6 00:08:43.471686 ignition[1218]: Ignition 2.19.0
Sep 6 00:08:43.471713 ignition[1218]: Stage: disks
Sep 6 00:08:43.473488 ignition[1218]: no configs at "/usr/lib/ignition/base.d"
Sep 6 00:08:43.473534 ignition[1218]: no config dir at "/usr/lib/ignition/base.platform.d/aws"
Sep 6 00:08:43.474217 ignition[1218]: PUT http://169.254.169.254/latest/api/token: attempt #1
Sep 6 00:08:43.477390 ignition[1218]: PUT result: OK
Sep 6 00:08:43.486561 ignition[1218]: disks: disks passed
Sep 6 00:08:43.487257 ignition[1218]: Ignition finished successfully
Sep 6 00:08:43.491958 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 6 00:08:43.497726 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 6 00:08:43.497886 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 6 00:08:43.505008 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 6 00:08:43.507184 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 6 00:08:43.509623 systemd[1]: Reached target basic.target - Basic System.
Sep 6 00:08:43.521872 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 6 00:08:43.574546 systemd-fsck[1227]: ROOT: clean, 14/553520 files, 52654/553472 blocks
Sep 6 00:08:43.581400 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 6 00:08:43.594749 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 6 00:08:43.688547 kernel: EXT4-fs (nvme0n1p9): mounted filesystem 72e55cb0-8368-4871-a3a0-8637412e72e8 r/w with ordered data mode. Quota mode: none.
Sep 6 00:08:43.689226 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 6 00:08:43.691871 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 6 00:08:43.708676 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 6 00:08:43.713730 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 6 00:08:43.718321 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Sep 6 00:08:43.718403 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 6 00:08:43.718453 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 6 00:08:43.748545 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/nvme0n1p6 scanned by mount (1246)
Sep 6 00:08:43.749061 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 6 00:08:43.753592 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 7395d4d5-ecb1-4acb-b5a4-3e846eddb858
Sep 6 00:08:43.753635 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm
Sep 6 00:08:43.755134 kernel: BTRFS info (device nvme0n1p6): using free space tree
Sep 6 00:08:43.764819 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 6 00:08:43.775782 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations
Sep 6 00:08:43.777788 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 6 00:08:44.169906 initrd-setup-root[1270]: cut: /sysroot/etc/passwd: No such file or directory
Sep 6 00:08:44.188219 initrd-setup-root[1277]: cut: /sysroot/etc/group: No such file or directory
Sep 6 00:08:44.198043 initrd-setup-root[1284]: cut: /sysroot/etc/shadow: No such file or directory
Sep 6 00:08:44.207072 initrd-setup-root[1291]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 6 00:08:44.273649 systemd-networkd[1195]: eth0: Gained IPv6LL
Sep 6 00:08:44.652828 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 6 00:08:44.664719 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 6 00:08:44.669151 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 6 00:08:44.691024 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 6 00:08:44.693146 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 7395d4d5-ecb1-4acb-b5a4-3e846eddb858
Sep 6 00:08:44.734649 ignition[1359]: INFO : Ignition 2.19.0
Sep 6 00:08:44.737931 ignition[1359]: INFO : Stage: mount
Sep 6 00:08:44.737931 ignition[1359]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 6 00:08:44.737931 ignition[1359]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws"
Sep 6 00:08:44.737931 ignition[1359]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1
Sep 6 00:08:44.750551 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 6 00:08:44.755879 ignition[1359]: INFO : PUT result: OK
Sep 6 00:08:44.760872 ignition[1359]: INFO : mount: mount passed
Sep 6 00:08:44.762634 ignition[1359]: INFO : Ignition finished successfully
Sep 6 00:08:44.767195 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 6 00:08:44.779858 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 6 00:08:44.795857 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 6 00:08:44.822928 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/nvme0n1p6 scanned by mount (1370)
Sep 6 00:08:44.822991 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 7395d4d5-ecb1-4acb-b5a4-3e846eddb858
Sep 6 00:08:44.826377 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm
Sep 6 00:08:44.826417 kernel: BTRFS info (device nvme0n1p6): using free space tree
Sep 6 00:08:44.832549 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations
Sep 6 00:08:44.835718 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 6 00:08:44.874791 ignition[1387]: INFO : Ignition 2.19.0
Sep 6 00:08:44.876809 ignition[1387]: INFO : Stage: files
Sep 6 00:08:44.876809 ignition[1387]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 6 00:08:44.876809 ignition[1387]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws"
Sep 6 00:08:44.876809 ignition[1387]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1
Sep 6 00:08:44.886440 ignition[1387]: INFO : PUT result: OK
Sep 6 00:08:44.891745 ignition[1387]: DEBUG : files: compiled without relabeling support, skipping
Sep 6 00:08:44.894415 ignition[1387]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 6 00:08:44.894415 ignition[1387]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 6 00:08:44.927521 ignition[1387]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 6 00:08:44.930686 ignition[1387]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 6 00:08:44.934140 unknown[1387]: wrote ssh authorized keys file for user: core
Sep 6 00:08:44.937618 ignition[1387]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 6 00:08:44.940612 ignition[1387]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz"
Sep 6 00:08:44.940612 ignition[1387]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1
Sep 6 00:08:45.010586 ignition[1387]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 6 00:08:45.223572 ignition[1387]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz"
Sep 6 00:08:45.223572 ignition[1387]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 6 00:08:45.223572 ignition[1387]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 6 00:08:45.223572 ignition[1387]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 6 00:08:45.223572 ignition[1387]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 6 00:08:45.223572 ignition[1387]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 6 00:08:45.223572 ignition[1387]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 6 00:08:45.223572 ignition[1387]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 6 00:08:45.223572 ignition[1387]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 6 00:08:45.223572 ignition[1387]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 6 00:08:45.223572 ignition[1387]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 6 00:08:45.264884 ignition[1387]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 6 00:08:45.264884 ignition[1387]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 6 00:08:45.264884 ignition[1387]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 6 00:08:45.264884 ignition[1387]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-arm64.raw: attempt #1
Sep 6 00:08:45.706418 ignition[1387]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 6 00:08:46.072626 ignition[1387]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 6 00:08:46.072626 ignition[1387]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 6 00:08:46.081108 ignition[1387]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 6 00:08:46.081108 ignition[1387]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 6 00:08:46.081108 ignition[1387]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 6 00:08:46.081108 ignition[1387]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Sep 6 00:08:46.081108 ignition[1387]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Sep 6 00:08:46.081108 ignition[1387]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 6 00:08:46.081108 ignition[1387]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 6 00:08:46.081108 ignition[1387]: INFO : files: files passed
Sep 6 00:08:46.081108 ignition[1387]: INFO : Ignition finished successfully
Sep 6 00:08:46.111581 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 6 00:08:46.123830 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 6 00:08:46.133756 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 6 00:08:46.153210 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 6 00:08:46.153416 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 6 00:08:46.174996 initrd-setup-root-after-ignition[1415]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 6 00:08:46.174996 initrd-setup-root-after-ignition[1415]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 6 00:08:46.182018 initrd-setup-root-after-ignition[1419]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 6 00:08:46.187252 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 6 00:08:46.193658 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 6 00:08:46.204785 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 6 00:08:46.264285 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 6 00:08:46.264633 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 6 00:08:46.271611 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 6 00:08:46.274041 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 6 00:08:46.278954 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 6 00:08:46.298848 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 6 00:08:46.325002 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 6 00:08:46.335979 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 6 00:08:46.364896 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 6 00:08:46.371009 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 6 00:08:46.374097 systemd[1]: Stopped target timers.target - Timer Units.
Sep 6 00:08:46.378776 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 6 00:08:46.379016 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 6 00:08:46.388778 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 6 00:08:46.394737 systemd[1]: Stopped target basic.target - Basic System.
Sep 6 00:08:46.397157 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 6 00:08:46.401573 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 6 00:08:46.404268 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 6 00:08:46.407053 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 6 00:08:46.410675 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 6 00:08:46.419821 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 6 00:08:46.428649 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 6 00:08:46.431297 systemd[1]: Stopped target swap.target - Swaps.
Sep 6 00:08:46.436766 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 6 00:08:46.436997 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 6 00:08:46.445590 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 6 00:08:46.446231 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 6 00:08:46.446528 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 6 00:08:46.450363 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 6 00:08:46.458738 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 6 00:08:46.459249 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 6 00:08:46.462454 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 6 00:08:46.462697 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 6 00:08:46.462970 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 6 00:08:46.463163 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 6 00:08:46.482658 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 6 00:08:46.487858 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 6 00:08:46.498188 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 6 00:08:46.502945 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 6 00:08:46.508731 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 6 00:08:46.511720 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 6 00:08:46.529612 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 6 00:08:46.530352 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 6 00:08:46.546570 ignition[1439]: INFO : Ignition 2.19.0
Sep 6 00:08:46.546570 ignition[1439]: INFO : Stage: umount
Sep 6 00:08:46.550900 ignition[1439]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 6 00:08:46.550900 ignition[1439]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws"
Sep 6 00:08:46.550900 ignition[1439]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1
Sep 6 00:08:46.560321 ignition[1439]: INFO : PUT result: OK
Sep 6 00:08:46.563243 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 6 00:08:46.568669 ignition[1439]: INFO : umount: umount passed
Sep 6 00:08:46.568669 ignition[1439]: INFO : Ignition finished successfully
Sep 6 00:08:46.572052 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 6 00:08:46.572262 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 6 00:08:46.580393 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 6 00:08:46.580776 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 6 00:08:46.586863 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 6 00:08:46.587034 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 6 00:08:46.592188 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 6 00:08:46.592308 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 6 00:08:46.606101 systemd[1]: ignition-fetch.service: Deactivated successfully.
Sep 6 00:08:46.606210 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Sep 6 00:08:46.610239 systemd[1]: Stopped target network.target - Network.
Sep 6 00:08:46.615926 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 6 00:08:46.616023 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 6 00:08:46.618609 systemd[1]: Stopped target paths.target - Path Units.
Sep 6 00:08:46.620761 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 6 00:08:46.624108 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 6 00:08:46.627120 systemd[1]: Stopped target slices.target - Slice Units. Sep 6 00:08:46.627914 systemd[1]: Stopped target sockets.target - Socket Units. Sep 6 00:08:46.628346 systemd[1]: iscsid.socket: Deactivated successfully. Sep 6 00:08:46.628428 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 6 00:08:46.629048 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 6 00:08:46.629116 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 6 00:08:46.629384 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 6 00:08:46.629466 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 6 00:08:46.630105 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 6 00:08:46.630182 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 6 00:08:46.630452 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 6 00:08:46.630549 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 6 00:08:46.631139 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 6 00:08:46.631663 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 6 00:08:46.651575 systemd-networkd[1195]: eth0: DHCPv6 lease lost Sep 6 00:08:46.657133 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 6 00:08:46.657366 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 6 00:08:46.671579 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 6 00:08:46.671887 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 6 00:08:46.679790 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 6 00:08:46.679921 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. 
Sep 6 00:08:46.713974 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 6 00:08:46.720010 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 6 00:08:46.720136 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 6 00:08:46.723455 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 6 00:08:46.723814 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 6 00:08:46.735359 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 6 00:08:46.735457 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 6 00:08:46.737810 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 6 00:08:46.737890 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 6 00:08:46.741339 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 6 00:08:46.775829 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 6 00:08:46.777585 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 6 00:08:46.785670 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 6 00:08:46.785783 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 6 00:08:46.792631 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 6 00:08:46.792720 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 6 00:08:46.795015 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 6 00:08:46.795105 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 6 00:08:46.798005 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 6 00:08:46.798086 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 6 00:08:46.813240 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. 
Sep 6 00:08:46.813339 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 6 00:08:46.826788 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 6 00:08:46.829416 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 6 00:08:46.829564 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 6 00:08:46.839651 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 6 00:08:46.839760 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 6 00:08:46.843767 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 6 00:08:46.843857 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 6 00:08:46.846852 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 6 00:08:46.846935 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 6 00:08:46.850919 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 6 00:08:46.851093 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 6 00:08:46.893289 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 6 00:08:46.893695 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 6 00:08:46.901639 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 6 00:08:46.911017 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 6 00:08:46.939703 systemd[1]: Switching root. Sep 6 00:08:46.986712 systemd-journald[251]: Journal stopped Sep 6 00:08:49.812813 systemd-journald[251]: Received SIGTERM from PID 1 (systemd). 
Sep 6 00:08:49.812955 kernel: SELinux: policy capability network_peer_controls=1 Sep 6 00:08:49.813006 kernel: SELinux: policy capability open_perms=1 Sep 6 00:08:49.813038 kernel: SELinux: policy capability extended_socket_class=1 Sep 6 00:08:49.813069 kernel: SELinux: policy capability always_check_network=0 Sep 6 00:08:49.813098 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 6 00:08:49.813129 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 6 00:08:49.813160 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 6 00:08:49.813190 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 6 00:08:49.813225 kernel: audit: type=1403 audit(1757117327.565:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 6 00:08:49.813265 systemd[1]: Successfully loaded SELinux policy in 62.022ms. Sep 6 00:08:49.813310 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 22.873ms. Sep 6 00:08:49.813344 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Sep 6 00:08:49.813376 systemd[1]: Detected virtualization amazon. Sep 6 00:08:49.813407 systemd[1]: Detected architecture arm64. Sep 6 00:08:49.813438 systemd[1]: Detected first boot. Sep 6 00:08:49.813471 systemd[1]: Initializing machine ID from VM UUID. Sep 6 00:08:49.815603 zram_generator::config[1481]: No configuration found. Sep 6 00:08:49.815676 systemd[1]: Populated /etc with preset unit settings. Sep 6 00:08:49.815712 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 6 00:08:49.815745 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 6 00:08:49.815778 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. 
Sep 6 00:08:49.815812 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 6 00:08:49.815844 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 6 00:08:49.815878 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 6 00:08:49.815925 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 6 00:08:49.815957 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 6 00:08:49.815988 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 6 00:08:49.816023 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 6 00:08:49.816052 systemd[1]: Created slice user.slice - User and Session Slice. Sep 6 00:08:49.816082 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 6 00:08:49.816112 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 6 00:08:49.816144 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 6 00:08:49.816195 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 6 00:08:49.816240 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 6 00:08:49.816275 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 6 00:08:49.816308 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 6 00:08:49.816340 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 6 00:08:49.816373 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 6 00:08:49.816404 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. 
Sep 6 00:08:49.816434 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 6 00:08:49.816464 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 6 00:08:49.816535 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 6 00:08:49.816573 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 6 00:08:49.816606 systemd[1]: Reached target slices.target - Slice Units. Sep 6 00:08:49.816638 systemd[1]: Reached target swap.target - Swaps. Sep 6 00:08:49.816670 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 6 00:08:49.816699 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 6 00:08:49.816731 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 6 00:08:49.816760 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 6 00:08:49.816791 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 6 00:08:49.816826 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 6 00:08:49.816858 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 6 00:08:49.816891 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 6 00:08:49.816922 systemd[1]: Mounting media.mount - External Media Directory... Sep 6 00:08:49.816952 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 6 00:08:49.816981 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 6 00:08:49.817017 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 6 00:08:49.817048 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 6 00:08:49.817081 systemd[1]: Reached target machines.target - Containers. 
Sep 6 00:08:49.817115 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 6 00:08:49.817145 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 6 00:08:49.817177 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 6 00:08:49.817208 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 6 00:08:49.817237 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 6 00:08:49.817267 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 6 00:08:49.817297 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 6 00:08:49.817326 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 6 00:08:49.817363 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 6 00:08:49.817393 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 6 00:08:49.817423 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 6 00:08:49.817452 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 6 00:08:49.817482 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 6 00:08:49.825684 systemd[1]: Stopped systemd-fsck-usr.service. Sep 6 00:08:49.825744 kernel: fuse: init (API version 7.39) Sep 6 00:08:49.825777 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 6 00:08:49.825809 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 6 00:08:49.825848 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... 
Sep 6 00:08:49.825877 kernel: loop: module loaded Sep 6 00:08:49.825906 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 6 00:08:49.825935 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 6 00:08:49.825966 systemd[1]: verity-setup.service: Deactivated successfully. Sep 6 00:08:49.825996 systemd[1]: Stopped verity-setup.service. Sep 6 00:08:49.826027 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 6 00:08:49.826056 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 6 00:08:49.826086 systemd[1]: Mounted media.mount - External Media Directory. Sep 6 00:08:49.826119 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 6 00:08:49.826150 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 6 00:08:49.826180 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 6 00:08:49.826209 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 6 00:08:49.826238 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 6 00:08:49.826270 kernel: ACPI: bus type drm_connector registered Sep 6 00:08:49.826299 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 6 00:08:49.826328 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 6 00:08:49.826357 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 6 00:08:49.826387 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 6 00:08:49.826418 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 6 00:08:49.826448 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 6 00:08:49.826480 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 6 00:08:49.833430 systemd[1]: modprobe@fuse.service: Deactivated successfully. 
Sep 6 00:08:49.833487 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 6 00:08:49.833580 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 6 00:08:49.833615 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 6 00:08:49.833646 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 6 00:08:49.833677 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 6 00:08:49.833711 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 6 00:08:49.833785 systemd-journald[1556]: Collecting audit messages is disabled. Sep 6 00:08:49.833896 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 6 00:08:49.833932 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 6 00:08:49.833963 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 6 00:08:49.833994 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 6 00:08:49.834024 systemd-journald[1556]: Journal started Sep 6 00:08:49.836480 systemd-journald[1556]: Runtime Journal (/run/log/journal/ec2292502920437d3078a6b59afc3845) is 8.0M, max 75.3M, 67.3M free. Sep 6 00:08:49.836605 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 6 00:08:48.982943 systemd[1]: Queued start job for default target multi-user.target. Sep 6 00:08:49.167345 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6. Sep 6 00:08:49.168160 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 6 00:08:49.855602 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). Sep 6 00:08:49.870468 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... 
Sep 6 00:08:49.870627 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 6 00:08:49.876543 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 6 00:08:49.901918 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 6 00:08:49.906545 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 6 00:08:49.922089 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 6 00:08:49.922181 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 6 00:08:49.936783 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 6 00:08:49.968481 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 6 00:08:49.981121 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 6 00:08:49.997464 systemd[1]: Started systemd-journald.service - Journal Service. Sep 6 00:08:49.992878 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 6 00:08:49.997213 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 6 00:08:50.006875 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 6 00:08:50.015908 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 6 00:08:50.033584 kernel: loop0: detected capacity change from 0 to 52536 Sep 6 00:08:50.055596 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 6 00:08:50.081665 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. 
Sep 6 00:08:50.091999 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 6 00:08:50.105827 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... Sep 6 00:08:50.121030 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 6 00:08:50.132288 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 6 00:08:50.141115 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 6 00:08:50.157078 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Sep 6 00:08:50.167772 systemd-tmpfiles[1593]: ACLs are not supported, ignoring. Sep 6 00:08:50.167804 systemd-tmpfiles[1593]: ACLs are not supported, ignoring. Sep 6 00:08:50.178292 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 6 00:08:50.188596 kernel: loop1: detected capacity change from 0 to 114432 Sep 6 00:08:50.203656 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 6 00:08:50.214474 systemd-journald[1556]: Time spent on flushing to /var/log/journal/ec2292502920437d3078a6b59afc3845 is 55.220ms for 921 entries. Sep 6 00:08:50.214474 systemd-journald[1556]: System Journal (/var/log/journal/ec2292502920437d3078a6b59afc3845) is 8.0M, max 195.6M, 187.6M free. Sep 6 00:08:50.285364 systemd-journald[1556]: Received client request to flush runtime journal. Sep 6 00:08:50.224404 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 6 00:08:50.232685 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. Sep 6 00:08:50.267344 udevadm[1624]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Sep 6 00:08:50.292304 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. 
Sep 6 00:08:50.322200 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 6 00:08:50.327606 kernel: loop2: detected capacity change from 0 to 114328 Sep 6 00:08:50.335865 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 6 00:08:50.388293 systemd-tmpfiles[1633]: ACLs are not supported, ignoring. Sep 6 00:08:50.388326 systemd-tmpfiles[1633]: ACLs are not supported, ignoring. Sep 6 00:08:50.400654 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 6 00:08:50.446585 kernel: loop3: detected capacity change from 0 to 207008 Sep 6 00:08:50.497745 kernel: loop4: detected capacity change from 0 to 52536 Sep 6 00:08:50.522303 kernel: loop5: detected capacity change from 0 to 114432 Sep 6 00:08:50.550983 kernel: loop6: detected capacity change from 0 to 114328 Sep 6 00:08:50.567573 kernel: loop7: detected capacity change from 0 to 207008 Sep 6 00:08:50.600470 (sd-merge)[1638]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-ami'. Sep 6 00:08:50.602393 (sd-merge)[1638]: Merged extensions into '/usr'. Sep 6 00:08:50.614941 systemd[1]: Reloading requested from client PID 1592 ('systemd-sysext') (unit systemd-sysext.service)... Sep 6 00:08:50.615098 systemd[1]: Reloading... Sep 6 00:08:50.809564 zram_generator::config[1670]: No configuration found. Sep 6 00:08:51.053104 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 6 00:08:51.167326 systemd[1]: Reloading finished in 551 ms. Sep 6 00:08:51.214390 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 6 00:08:51.230817 systemd[1]: Starting ensure-sysext.service... Sep 6 00:08:51.244696 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
Sep 6 00:08:51.279793 systemd[1]: Reloading requested from client PID 1715 ('systemctl') (unit ensure-sysext.service)... Sep 6 00:08:51.279831 systemd[1]: Reloading... Sep 6 00:08:51.345438 systemd-tmpfiles[1716]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 6 00:08:51.346176 systemd-tmpfiles[1716]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 6 00:08:51.351902 systemd-tmpfiles[1716]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 6 00:08:51.352714 systemd-tmpfiles[1716]: ACLs are not supported, ignoring. Sep 6 00:08:51.356766 systemd-tmpfiles[1716]: ACLs are not supported, ignoring. Sep 6 00:08:51.385550 zram_generator::config[1741]: No configuration found. Sep 6 00:08:51.391309 systemd-tmpfiles[1716]: Detected autofs mount point /boot during canonicalization of boot. Sep 6 00:08:51.391331 systemd-tmpfiles[1716]: Skipping /boot Sep 6 00:08:51.454956 systemd-tmpfiles[1716]: Detected autofs mount point /boot during canonicalization of boot. Sep 6 00:08:51.455576 systemd-tmpfiles[1716]: Skipping /boot Sep 6 00:08:51.725111 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 6 00:08:51.836101 systemd[1]: Reloading finished in 555 ms. Sep 6 00:08:51.849942 ldconfig[1581]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 6 00:08:51.861568 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 6 00:08:51.864684 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 6 00:08:51.872489 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 6 00:08:51.901869 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... 
Sep 6 00:08:51.909812 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 6 00:08:51.923154 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 6 00:08:51.939853 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 6 00:08:51.951252 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 6 00:08:51.957881 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 6 00:08:51.980585 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 6 00:08:51.993583 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 6 00:08:52.002989 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 6 00:08:52.010017 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 6 00:08:52.017085 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 6 00:08:52.020123 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 6 00:08:52.026460 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 6 00:08:52.028978 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 6 00:08:52.040685 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 6 00:08:52.043859 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 6 00:08:52.048603 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Sep 6 00:08:52.048989 systemd[1]: Reached target time-set.target - System Time Set. Sep 6 00:08:52.065585 systemd[1]: Finished ensure-sysext.service. Sep 6 00:08:52.086350 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 6 00:08:52.100295 augenrules[1828]: No rules Sep 6 00:08:52.115651 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Sep 6 00:08:52.131554 systemd-udevd[1811]: Using default interface naming scheme 'v255'. Sep 6 00:08:52.132244 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 6 00:08:52.132771 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 6 00:08:52.135864 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 6 00:08:52.136187 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 6 00:08:52.156280 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 6 00:08:52.157713 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 6 00:08:52.160590 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 6 00:08:52.169192 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 6 00:08:52.171196 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 6 00:08:52.174746 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 6 00:08:52.188358 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 6 00:08:52.200949 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 6 00:08:52.219594 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. 
Sep 6 00:08:52.222888 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 6 00:08:52.227274 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 6 00:08:52.251386 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 6 00:08:52.254187 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 6 00:08:52.257691 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 6 00:08:52.416917 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 6 00:08:52.475180 systemd-networkd[1845]: lo: Link UP Sep 6 00:08:52.475200 systemd-networkd[1845]: lo: Gained carrier Sep 6 00:08:52.476992 systemd-networkd[1845]: Enumeration completed Sep 6 00:08:52.477187 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 6 00:08:52.484783 (udev-worker)[1865]: Network interface NamePolicy= disabled on kernel command line. Sep 6 00:08:52.492871 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 6 00:08:52.553693 systemd-resolved[1810]: Positive Trust Anchors: Sep 6 00:08:52.554216 systemd-resolved[1810]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 6 00:08:52.554293 systemd-resolved[1810]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 6 00:08:52.572313 systemd-resolved[1810]: Defaulting to hostname 'linux'. Sep 6 00:08:52.575778 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 6 00:08:52.578364 systemd[1]: Reached target network.target - Network. Sep 6 00:08:52.580546 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 6 00:08:52.636871 systemd-networkd[1845]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 6 00:08:52.636897 systemd-networkd[1845]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 6 00:08:52.643923 systemd-networkd[1845]: eth0: Link UP Sep 6 00:08:52.644315 systemd-networkd[1845]: eth0: Gained carrier Sep 6 00:08:52.644359 systemd-networkd[1845]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 6 00:08:52.656633 systemd-networkd[1845]: eth0: DHCPv4 address 172.31.26.146/20, gateway 172.31.16.1 acquired from 172.31.16.1 Sep 6 00:08:52.667587 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 37 scanned by (udev-worker) (1842) Sep 6 00:08:52.853377 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Sep 6 00:08:52.939935 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. Sep 6 00:08:52.948839 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 6 00:08:52.967449 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Sep 6 00:08:52.983746 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Sep 6 00:08:53.005231 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 6 00:08:53.013582 lvm[1969]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Sep 6 00:08:53.019599 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 6 00:08:53.054099 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Sep 6 00:08:53.057298 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 6 00:08:53.059659 systemd[1]: Reached target sysinit.target - System Initialization. Sep 6 00:08:53.062165 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 6 00:08:53.064949 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 6 00:08:53.068004 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 6 00:08:53.070561 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 6 00:08:53.073275 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 6 00:08:53.075933 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 6 00:08:53.075979 systemd[1]: Reached target paths.target - Path Units. 
Sep 6 00:08:53.077999 systemd[1]: Reached target timers.target - Timer Units. Sep 6 00:08:53.081785 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 6 00:08:53.086827 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 6 00:08:53.099867 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 6 00:08:53.104724 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Sep 6 00:08:53.108177 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 6 00:08:53.110864 systemd[1]: Reached target sockets.target - Socket Units. Sep 6 00:08:53.113083 systemd[1]: Reached target basic.target - Basic System. Sep 6 00:08:53.116780 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 6 00:08:53.116847 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 6 00:08:53.131150 systemd[1]: Starting containerd.service - containerd container runtime... Sep 6 00:08:53.141314 lvm[1977]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Sep 6 00:08:53.145756 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 6 00:08:53.160865 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 6 00:08:53.173756 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 6 00:08:53.190915 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 6 00:08:53.193201 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 6 00:08:53.211826 jq[1981]: false Sep 6 00:08:53.198648 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 6 00:08:53.213331 systemd[1]: Started ntpd.service - Network Time Service. 
Sep 6 00:08:53.228715 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 6 00:08:53.238759 systemd[1]: Starting setup-oem.service - Setup OEM... Sep 6 00:08:53.243881 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 6 00:08:53.253892 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 6 00:08:53.265866 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 6 00:08:53.271894 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 6 00:08:53.272952 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 6 00:08:53.275744 systemd[1]: Starting update-engine.service - Update Engine... Sep 6 00:08:53.288637 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 6 00:08:53.294998 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Sep 6 00:08:53.301274 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 6 00:08:53.303655 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 6 00:08:53.347686 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 6 00:08:53.348038 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 6 00:08:53.358770 update_engine[1991]: I20250906 00:08:53.358076 1991 main.cc:92] Flatcar Update Engine starting Sep 6 00:08:53.388180 systemd[1]: Started dbus.service - D-Bus System Message Bus. 
Sep 6 00:08:53.387882 dbus-daemon[1980]: [system] SELinux support is enabled Sep 6 00:08:53.395153 extend-filesystems[1982]: Found loop4 Sep 6 00:08:53.395153 extend-filesystems[1982]: Found loop5 Sep 6 00:08:53.395153 extend-filesystems[1982]: Found loop6 Sep 6 00:08:53.395153 extend-filesystems[1982]: Found loop7 Sep 6 00:08:53.395153 extend-filesystems[1982]: Found nvme0n1 Sep 6 00:08:53.395153 extend-filesystems[1982]: Found nvme0n1p1 Sep 6 00:08:53.395153 extend-filesystems[1982]: Found nvme0n1p2 Sep 6 00:08:53.395153 extend-filesystems[1982]: Found nvme0n1p3 Sep 6 00:08:53.395153 extend-filesystems[1982]: Found usr Sep 6 00:08:53.395153 extend-filesystems[1982]: Found nvme0n1p4 Sep 6 00:08:53.453999 jq[1993]: true Sep 6 00:08:53.398297 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 6 00:08:53.455703 extend-filesystems[1982]: Found nvme0n1p6 Sep 6 00:08:53.455703 extend-filesystems[1982]: Found nvme0n1p7 Sep 6 00:08:53.455703 extend-filesystems[1982]: Found nvme0n1p9 Sep 6 00:08:53.455703 extend-filesystems[1982]: Checking size of /dev/nvme0n1p9 Sep 6 00:08:53.430669 dbus-daemon[1980]: [system] Activating via systemd: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.0' (uid=244 pid=1845 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Sep 6 00:08:53.398366 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Sep 6 00:08:53.506339 update_engine[1991]: I20250906 00:08:53.460795 1991 update_check_scheduler.cc:74] Next update check in 11m39s Sep 6 00:08:53.506395 tar[2000]: linux-arm64/LICENSE Sep 6 00:08:53.506395 tar[2000]: linux-arm64/helm Sep 6 00:08:53.506915 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: ntpd 4.2.8p17@1.4004-o Fri Sep 5 21:57:21 UTC 2025 (1): Starting Sep 6 00:08:53.506915 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Sep 6 00:08:53.506915 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: ---------------------------------------------------- Sep 6 00:08:53.506915 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: ntp-4 is maintained by Network Time Foundation, Sep 6 00:08:53.506915 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Sep 6 00:08:53.506915 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: corporation. Support and training for ntp-4 are Sep 6 00:08:53.506915 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: available at https://www.nwtime.org/support Sep 6 00:08:53.506915 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: ---------------------------------------------------- Sep 6 00:08:53.506915 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: proto: precision = 0.096 usec (-23) Sep 6 00:08:53.506915 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: basedate set to 2025-08-24 Sep 6 00:08:53.506915 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: gps base set to 2025-08-24 (week 2381) Sep 6 00:08:53.468655 ntpd[1984]: ntpd 4.2.8p17@1.4004-o Fri Sep 5 21:57:21 UTC 2025 (1): Starting Sep 6 00:08:53.408581 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). 
Sep 6 00:08:53.523778 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: Listen and drop on 0 v6wildcard [::]:123 Sep 6 00:08:53.523778 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Sep 6 00:08:53.523778 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: Listen normally on 2 lo 127.0.0.1:123 Sep 6 00:08:53.523778 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: Listen normally on 3 eth0 172.31.26.146:123 Sep 6 00:08:53.523778 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: Listen normally on 4 lo [::1]:123 Sep 6 00:08:53.523778 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: bind(21) AF_INET6 fe80::43e:27ff:fedf:247%2#123 flags 0x11 failed: Cannot assign requested address Sep 6 00:08:53.523778 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: unable to create socket on eth0 (5) for fe80::43e:27ff:fedf:247%2#123 Sep 6 00:08:53.523778 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: failed to init interface for address fe80::43e:27ff:fedf:247%2 Sep 6 00:08:53.523778 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: Listening on routing socket on fd #21 for interface updates Sep 6 00:08:53.468705 ntpd[1984]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Sep 6 00:08:53.408621 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 6 00:08:53.468726 ntpd[1984]: ---------------------------------------------------- Sep 6 00:08:53.480142 (ntainerd)[2004]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 6 00:08:53.468744 ntpd[1984]: ntp-4 is maintained by Network Time Foundation, Sep 6 00:08:53.487640 systemd[1]: Started update-engine.service - Update Engine. Sep 6 00:08:53.548560 extend-filesystems[1982]: Resized partition /dev/nvme0n1p9 Sep 6 00:08:53.468763 ntpd[1984]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Sep 6 00:08:53.505798 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Sep 6 00:08:53.468780 ntpd[1984]: corporation. 
Support and training for ntp-4 are Sep 6 00:08:53.514862 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 6 00:08:53.468799 ntpd[1984]: available at https://www.nwtime.org/support Sep 6 00:08:53.518084 systemd[1]: motdgen.service: Deactivated successfully. Sep 6 00:08:53.468818 ntpd[1984]: ---------------------------------------------------- Sep 6 00:08:53.519421 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 6 00:08:53.482831 ntpd[1984]: proto: precision = 0.096 usec (-23) Sep 6 00:08:53.483233 ntpd[1984]: basedate set to 2025-08-24 Sep 6 00:08:53.483259 ntpd[1984]: gps base set to 2025-08-24 (week 2381) Sep 6 00:08:53.514034 ntpd[1984]: Listen and drop on 0 v6wildcard [::]:123 Sep 6 00:08:53.514125 ntpd[1984]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Sep 6 00:08:53.514434 ntpd[1984]: Listen normally on 2 lo 127.0.0.1:123 Sep 6 00:08:53.514538 ntpd[1984]: Listen normally on 3 eth0 172.31.26.146:123 Sep 6 00:08:53.514627 ntpd[1984]: Listen normally on 4 lo [::1]:123 Sep 6 00:08:53.514724 ntpd[1984]: bind(21) AF_INET6 fe80::43e:27ff:fedf:247%2#123 flags 0x11 failed: Cannot assign requested address Sep 6 00:08:53.514765 ntpd[1984]: unable to create socket on eth0 (5) for fe80::43e:27ff:fedf:247%2#123 Sep 6 00:08:53.514793 ntpd[1984]: failed to init interface for address fe80::43e:27ff:fedf:247%2 Sep 6 00:08:53.514849 ntpd[1984]: Listening on routing socket on fd #21 for interface updates Sep 6 00:08:53.565442 extend-filesystems[2029]: resize2fs 1.47.1 (20-May-2024) Sep 6 00:08:53.577885 kernel: EXT4-fs (nvme0n1p9): resizing filesystem from 553472 to 1489915 blocks Sep 6 00:08:53.586897 jq[2017]: true Sep 6 00:08:53.589015 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 6 00:08:53.589015 ntpd[1984]: 6 Sep 00:08:53 ntpd[1984]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 6 00:08:53.583087 ntpd[1984]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 6 
00:08:53.583139 ntpd[1984]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 6 00:08:53.678003 kernel: EXT4-fs (nvme0n1p9): resized filesystem to 1489915 Sep 6 00:08:53.673112 systemd[1]: Finished setup-oem.service - Setup OEM. Sep 6 00:08:53.692537 extend-filesystems[2029]: Filesystem at /dev/nvme0n1p9 is mounted on /; on-line resizing required Sep 6 00:08:53.692537 extend-filesystems[2029]: old_desc_blocks = 1, new_desc_blocks = 1 Sep 6 00:08:53.692537 extend-filesystems[2029]: The filesystem on /dev/nvme0n1p9 is now 1489915 (4k) blocks long. Sep 6 00:08:53.704241 extend-filesystems[1982]: Resized filesystem in /dev/nvme0n1p9 Sep 6 00:08:53.715229 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 6 00:08:53.715656 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 6 00:08:53.746879 systemd-logind[1990]: Watching system buttons on /dev/input/event0 (Power Button) Sep 6 00:08:53.746932 systemd-logind[1990]: Watching system buttons on /dev/input/event1 (Sleep Button) Sep 6 00:08:53.748020 systemd-logind[1990]: New seat seat0. Sep 6 00:08:53.754531 systemd[1]: Started systemd-logind.service - User Login Management. Sep 6 00:08:53.773096 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 37 scanned by (udev-worker) (1862) Sep 6 00:08:53.807752 systemd-networkd[1845]: eth0: Gained IPv6LL Sep 6 00:08:53.815793 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 6 00:08:53.819291 systemd[1]: Reached target network-online.target - Network is Online. Sep 6 00:08:53.836186 systemd[1]: Started amazon-ssm-agent.service - amazon-ssm-agent. Sep 6 00:08:53.844733 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 6 00:08:53.850184 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... 
Sep 6 00:08:53.936412 coreos-metadata[1979]: Sep 06 00:08:53.933 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Sep 6 00:08:53.950793 coreos-metadata[1979]: Sep 06 00:08:53.947 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-id: Attempt #1 Sep 6 00:08:53.951953 coreos-metadata[1979]: Sep 06 00:08:53.951 INFO Fetch successful Sep 6 00:08:53.951953 coreos-metadata[1979]: Sep 06 00:08:53.951 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-type: Attempt #1 Sep 6 00:08:53.954471 bash[2067]: Updated "/home/core/.ssh/authorized_keys" Sep 6 00:08:53.956528 coreos-metadata[1979]: Sep 06 00:08:53.955 INFO Fetch successful Sep 6 00:08:53.956528 coreos-metadata[1979]: Sep 06 00:08:53.955 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/local-ipv4: Attempt #1 Sep 6 00:08:53.968599 coreos-metadata[1979]: Sep 06 00:08:53.962 INFO Fetch successful Sep 6 00:08:53.968599 coreos-metadata[1979]: Sep 06 00:08:53.962 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-ipv4: Attempt #1 Sep 6 00:08:53.968599 coreos-metadata[1979]: Sep 06 00:08:53.963 INFO Fetch successful Sep 6 00:08:53.968599 coreos-metadata[1979]: Sep 06 00:08:53.963 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/ipv6: Attempt #1 Sep 6 00:08:53.973595 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. 
Sep 6 00:08:53.978698 coreos-metadata[1979]: Sep 06 00:08:53.976 INFO Fetch failed with 404: resource not found Sep 6 00:08:53.978698 coreos-metadata[1979]: Sep 06 00:08:53.978 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone: Attempt #1 Sep 6 00:08:53.978698 coreos-metadata[1979]: Sep 06 00:08:53.978 INFO Fetch successful Sep 6 00:08:53.993873 coreos-metadata[1979]: Sep 06 00:08:53.986 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone-id: Attempt #1 Sep 6 00:08:53.993873 coreos-metadata[1979]: Sep 06 00:08:53.986 INFO Fetch successful Sep 6 00:08:53.993873 coreos-metadata[1979]: Sep 06 00:08:53.986 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/hostname: Attempt #1 Sep 6 00:08:53.993873 coreos-metadata[1979]: Sep 06 00:08:53.986 INFO Fetch successful Sep 6 00:08:53.993873 coreos-metadata[1979]: Sep 06 00:08:53.986 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-hostname: Attempt #1 Sep 6 00:08:53.987843 systemd[1]: Starting sshkeys.service... Sep 6 00:08:53.994373 coreos-metadata[1979]: Sep 06 00:08:53.994 INFO Fetch successful Sep 6 00:08:53.994373 coreos-metadata[1979]: Sep 06 00:08:53.994 INFO Fetching http://169.254.169.254/2021-01-03/dynamic/instance-identity/document: Attempt #1 Sep 6 00:08:54.001318 coreos-metadata[1979]: Sep 06 00:08:53.998 INFO Fetch successful Sep 6 00:08:54.099673 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 6 00:08:54.111794 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 6 00:08:54.119623 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Sep 6 00:08:54.131544 amazon-ssm-agent[2060]: Initializing new seelog logger Sep 6 00:08:54.131544 amazon-ssm-agent[2060]: New Seelog Logger Creation Complete Sep 6 00:08:54.131544 amazon-ssm-agent[2060]: 2025/09/06 00:08:54 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 6 00:08:54.131544 amazon-ssm-agent[2060]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 6 00:08:54.131544 amazon-ssm-agent[2060]: 2025/09/06 00:08:54 processing appconfig overrides Sep 6 00:08:54.131544 amazon-ssm-agent[2060]: 2025/09/06 00:08:54 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 6 00:08:54.131544 amazon-ssm-agent[2060]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 6 00:08:54.131544 amazon-ssm-agent[2060]: 2025/09/06 00:08:54 processing appconfig overrides Sep 6 00:08:54.131544 amazon-ssm-agent[2060]: 2025/09/06 00:08:54 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 6 00:08:54.131544 amazon-ssm-agent[2060]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 6 00:08:54.132536 amazon-ssm-agent[2060]: 2025/09/06 00:08:54 processing appconfig overrides Sep 6 00:08:54.133672 amazon-ssm-agent[2060]: 2025-09-06 00:08:54 INFO Proxy environment variables: Sep 6 00:08:54.137788 amazon-ssm-agent[2060]: 2025/09/06 00:08:54 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 6 00:08:54.138012 amazon-ssm-agent[2060]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 6 00:08:54.138258 amazon-ssm-agent[2060]: 2025/09/06 00:08:54 processing appconfig overrides Sep 6 00:08:54.177539 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 6 00:08:54.209629 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 6 00:08:54.219626 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. 
Sep 6 00:08:54.235624 amazon-ssm-agent[2060]: 2025-09-06 00:08:54 INFO http_proxy: Sep 6 00:08:54.341319 amazon-ssm-agent[2060]: 2025-09-06 00:08:54 INFO no_proxy: Sep 6 00:08:54.350472 locksmithd[2023]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 6 00:08:54.446394 amazon-ssm-agent[2060]: 2025-09-06 00:08:54 INFO https_proxy: Sep 6 00:08:54.451447 dbus-daemon[1980]: [system] Successfully activated service 'org.freedesktop.hostname1' Sep 6 00:08:54.451726 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Sep 6 00:08:54.463268 dbus-daemon[1980]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.7' (uid=0 pid=2021 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Sep 6 00:08:54.489294 coreos-metadata[2123]: Sep 06 00:08:54.489 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Sep 6 00:08:54.526887 coreos-metadata[2123]: Sep 06 00:08:54.490 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys: Attempt #1 Sep 6 00:08:54.526887 coreos-metadata[2123]: Sep 06 00:08:54.492 INFO Fetch successful Sep 6 00:08:54.526887 coreos-metadata[2123]: Sep 06 00:08:54.492 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys/0/openssh-key: Attempt #1 Sep 6 00:08:54.526887 coreos-metadata[2123]: Sep 06 00:08:54.494 INFO Fetch successful Sep 6 00:08:54.502891 unknown[2123]: wrote ssh authorized keys file for user: core Sep 6 00:08:54.515317 systemd[1]: Starting polkit.service - Authorization Manager... Sep 6 00:08:54.545567 amazon-ssm-agent[2060]: 2025-09-06 00:08:54 INFO Checking if agent identity type OnPrem can be assumed Sep 6 00:08:54.590595 polkitd[2162]: Started polkitd version 121 Sep 6 00:08:54.635847 update-ssh-keys[2175]: Updated "/home/core/.ssh/authorized_keys" Sep 6 00:08:54.639602 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). 
Sep 6 00:08:54.642487 amazon-ssm-agent[2060]: 2025-09-06 00:08:54 INFO Checking if agent identity type EC2 can be assumed Sep 6 00:08:54.647312 polkitd[2162]: Loading rules from directory /etc/polkit-1/rules.d Sep 6 00:08:54.647429 polkitd[2162]: Loading rules from directory /usr/share/polkit-1/rules.d Sep 6 00:08:54.652260 systemd[1]: Finished sshkeys.service. Sep 6 00:08:54.659229 polkitd[2162]: Finished loading, compiling and executing 2 rules Sep 6 00:08:54.674960 dbus-daemon[1980]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Sep 6 00:08:54.676205 systemd[1]: Started polkit.service - Authorization Manager. Sep 6 00:08:54.680333 polkitd[2162]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Sep 6 00:08:54.746672 amazon-ssm-agent[2060]: 2025-09-06 00:08:54 INFO Agent will take identity from EC2 Sep 6 00:08:54.758267 systemd-resolved[1810]: System hostname changed to 'ip-172-31-26-146'. Sep 6 00:08:54.758273 systemd-hostnamed[2021]: Hostname set to (transient) Sep 6 00:08:54.848558 amazon-ssm-agent[2060]: 2025-09-06 00:08:54 INFO [amazon-ssm-agent] using named pipe channel for IPC Sep 6 00:08:54.853549 containerd[2004]: time="2025-09-06T00:08:54.850010617Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Sep 6 00:08:54.945695 amazon-ssm-agent[2060]: 2025-09-06 00:08:54 INFO [amazon-ssm-agent] using named pipe channel for IPC Sep 6 00:08:55.022848 containerd[2004]: time="2025-09-06T00:08:55.022702210Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Sep 6 00:08:55.036025 containerd[2004]: time="2025-09-06T00:08:55.035596678Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." 
error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.103-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Sep 6 00:08:55.036165 containerd[2004]: time="2025-09-06T00:08:55.036023770Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Sep 6 00:08:55.036165 containerd[2004]: time="2025-09-06T00:08:55.036082066Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Sep 6 00:08:55.036474 containerd[2004]: time="2025-09-06T00:08:55.036429574Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Sep 6 00:08:55.036709 containerd[2004]: time="2025-09-06T00:08:55.036490150Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Sep 6 00:08:55.036709 containerd[2004]: time="2025-09-06T00:08:55.036681994Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Sep 6 00:08:55.036802 containerd[2004]: time="2025-09-06T00:08:55.036717094Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Sep 6 00:08:55.037146 containerd[2004]: time="2025-09-06T00:08:55.037079074Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Sep 6 00:08:55.037206 containerd[2004]: time="2025-09-06T00:08:55.037141894Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." 
type=io.containerd.snapshotter.v1 Sep 6 00:08:55.037251 containerd[2004]: time="2025-09-06T00:08:55.037193686Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Sep 6 00:08:55.037251 containerd[2004]: time="2025-09-06T00:08:55.037229458Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Sep 6 00:08:55.037956 containerd[2004]: time="2025-09-06T00:08:55.037401790Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Sep 6 00:08:55.046545 amazon-ssm-agent[2060]: 2025-09-06 00:08:54 INFO [amazon-ssm-agent] using named pipe channel for IPC Sep 6 00:08:55.047364 containerd[2004]: time="2025-09-06T00:08:55.047269246Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Sep 6 00:08:55.052045 containerd[2004]: time="2025-09-06T00:08:55.051971650Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Sep 6 00:08:55.052045 containerd[2004]: time="2025-09-06T00:08:55.052037566Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Sep 6 00:08:55.052464 containerd[2004]: time="2025-09-06T00:08:55.052318150Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Sep 6 00:08:55.052464 containerd[2004]: time="2025-09-06T00:08:55.052446418Z" level=info msg="metadata content store policy set" policy=shared Sep 6 00:08:55.069406 containerd[2004]: time="2025-09-06T00:08:55.069339082Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." 
type=io.containerd.gc.v1 Sep 6 00:08:55.069673 containerd[2004]: time="2025-09-06T00:08:55.069543418Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Sep 6 00:08:55.069673 containerd[2004]: time="2025-09-06T00:08:55.069605518Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Sep 6 00:08:55.069673 containerd[2004]: time="2025-09-06T00:08:55.069643870Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Sep 6 00:08:55.069853 containerd[2004]: time="2025-09-06T00:08:55.069677794Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Sep 6 00:08:55.070067 containerd[2004]: time="2025-09-06T00:08:55.069960862Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Sep 6 00:08:55.070389 containerd[2004]: time="2025-09-06T00:08:55.070344850Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Sep 6 00:08:55.070652 containerd[2004]: time="2025-09-06T00:08:55.070608010Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Sep 6 00:08:55.070710 containerd[2004]: time="2025-09-06T00:08:55.070654822Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Sep 6 00:08:55.070710 containerd[2004]: time="2025-09-06T00:08:55.070687714Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Sep 6 00:08:55.070796 containerd[2004]: time="2025-09-06T00:08:55.070720534Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." 
type=io.containerd.service.v1
Sep 6 00:08:55.070796 containerd[2004]: time="2025-09-06T00:08:55.070750738Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Sep 6 00:08:55.070796 containerd[2004]: time="2025-09-06T00:08:55.070780246Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Sep 6 00:08:55.070940 containerd[2004]: time="2025-09-06T00:08:55.070812982Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Sep 6 00:08:55.070940 containerd[2004]: time="2025-09-06T00:08:55.070845934Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Sep 6 00:08:55.070940 containerd[2004]: time="2025-09-06T00:08:55.070879594Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Sep 6 00:08:55.070940 containerd[2004]: time="2025-09-06T00:08:55.070915474Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Sep 6 00:08:55.071173 containerd[2004]: time="2025-09-06T00:08:55.070945522Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Sep 6 00:08:55.071173 containerd[2004]: time="2025-09-06T00:08:55.070984138Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Sep 6 00:08:55.071173 containerd[2004]: time="2025-09-06T00:08:55.071015578Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Sep 6 00:08:55.071173 containerd[2004]: time="2025-09-06T00:08:55.071045878Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Sep 6 00:08:55.071173 containerd[2004]: time="2025-09-06T00:08:55.071079622Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Sep 6 00:08:55.071173 containerd[2004]: time="2025-09-06T00:08:55.071109202Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Sep 6 00:08:55.071173 containerd[2004]: time="2025-09-06T00:08:55.071141278Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Sep 6 00:08:55.071173 containerd[2004]: time="2025-09-06T00:08:55.071169730Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Sep 6 00:08:55.071527 containerd[2004]: time="2025-09-06T00:08:55.071199670Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Sep 6 00:08:55.071527 containerd[2004]: time="2025-09-06T00:08:55.071229190Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Sep 6 00:08:55.071527 containerd[2004]: time="2025-09-06T00:08:55.071263786Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Sep 6 00:08:55.071527 containerd[2004]: time="2025-09-06T00:08:55.071304430Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Sep 6 00:08:55.071527 containerd[2004]: time="2025-09-06T00:08:55.071334550Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Sep 6 00:08:55.071527 containerd[2004]: time="2025-09-06T00:08:55.071363122Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Sep 6 00:08:55.071527 containerd[2004]: time="2025-09-06T00:08:55.071398798Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Sep 6 00:08:55.071527 containerd[2004]: time="2025-09-06T00:08:55.071446522Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Sep 6 00:08:55.071527 containerd[2004]: time="2025-09-06T00:08:55.071482942Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Sep 6 00:08:55.073870 containerd[2004]: time="2025-09-06T00:08:55.073645726Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Sep 6 00:08:55.075976 containerd[2004]: time="2025-09-06T00:08:55.075571306Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Sep 6 00:08:55.076075 containerd[2004]: time="2025-09-06T00:08:55.075979870Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Sep 6 00:08:55.076075 containerd[2004]: time="2025-09-06T00:08:55.076013074Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Sep 6 00:08:55.076209 containerd[2004]: time="2025-09-06T00:08:55.076069138Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Sep 6 00:08:55.076209 containerd[2004]: time="2025-09-06T00:08:55.076099054Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Sep 6 00:08:55.076209 containerd[2004]: time="2025-09-06T00:08:55.076170970Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Sep 6 00:08:55.076209 containerd[2004]: time="2025-09-06T00:08:55.076198330Z" level=info msg="NRI interface is disabled by configuration."
Sep 6 00:08:55.076408 containerd[2004]: time="2025-09-06T00:08:55.076246810Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Sep 6 00:08:55.081605 containerd[2004]: time="2025-09-06T00:08:55.079099174Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Sep 6 00:08:55.081605 containerd[2004]: time="2025-09-06T00:08:55.079270186Z" level=info msg="Connect containerd service"
Sep 6 00:08:55.081605 containerd[2004]: time="2025-09-06T00:08:55.079355266Z" level=info msg="using legacy CRI server"
Sep 6 00:08:55.081605 containerd[2004]: time="2025-09-06T00:08:55.079397182Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Sep 6 00:08:55.081605 containerd[2004]: time="2025-09-06T00:08:55.079646374Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Sep 6 00:08:55.087738 containerd[2004]: time="2025-09-06T00:08:55.084919870Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 6 00:08:55.087738 containerd[2004]: time="2025-09-06T00:08:55.085293226Z" level=info msg="Start subscribing containerd event"
Sep 6 00:08:55.087738 containerd[2004]: time="2025-09-06T00:08:55.085473814Z" level=info msg="Start recovering state"
Sep 6 00:08:55.088805 containerd[2004]: time="2025-09-06T00:08:55.088743490Z" level=info msg="Start event monitor"
Sep 6 00:08:55.088909 containerd[2004]: time="2025-09-06T00:08:55.088807450Z" level=info msg="Start snapshots syncer"
Sep 6 00:08:55.088909 containerd[2004]: time="2025-09-06T00:08:55.088834258Z" level=info msg="Start cni network conf syncer for default"
Sep 6 00:08:55.088909 containerd[2004]: time="2025-09-06T00:08:55.088853914Z" level=info msg="Start streaming server"
Sep 6 00:08:55.090018 containerd[2004]: time="2025-09-06T00:08:55.089953342Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Sep 6 00:08:55.090149 containerd[2004]: time="2025-09-06T00:08:55.090083026Z" level=info msg=serving... address=/run/containerd/containerd.sock
Sep 6 00:08:55.094069 containerd[2004]: time="2025-09-06T00:08:55.090207490Z" level=info msg="containerd successfully booted in 0.247581s"
Sep 6 00:08:55.091665 systemd[1]: Started containerd.service - containerd container runtime.
Sep 6 00:08:55.144333 amazon-ssm-agent[2060]: 2025-09-06 00:08:54 INFO [amazon-ssm-agent] amazon-ssm-agent - v3.2.0.0
Sep 6 00:08:55.246494 amazon-ssm-agent[2060]: 2025-09-06 00:08:54 INFO [amazon-ssm-agent] OS: linux, Arch: arm64
Sep 6 00:08:55.347039 amazon-ssm-agent[2060]: 2025-09-06 00:08:54 INFO [amazon-ssm-agent] Starting Core Agent
Sep 6 00:08:55.447268 amazon-ssm-agent[2060]: 2025-09-06 00:08:54 INFO [amazon-ssm-agent] registrar detected. Attempting registration
Sep 6 00:08:55.549212 amazon-ssm-agent[2060]: 2025-09-06 00:08:54 INFO [Registrar] Starting registrar module
Sep 6 00:08:55.649725 amazon-ssm-agent[2060]: 2025-09-06 00:08:54 INFO [EC2Identity] no registration info found for ec2 instance, attempting registration
Sep 6 00:08:55.874724 tar[2000]: linux-arm64/README.md
Sep 6 00:08:55.919392 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Sep 6 00:08:56.018222 amazon-ssm-agent[2060]: 2025-09-06 00:08:56 INFO [EC2Identity] EC2 registration was successful.
Sep 6 00:08:56.055193 amazon-ssm-agent[2060]: 2025-09-06 00:08:56 INFO [CredentialRefresher] credentialRefresher has started
Sep 6 00:08:56.056281 amazon-ssm-agent[2060]: 2025-09-06 00:08:56 INFO [CredentialRefresher] Starting credentials refresher loop
Sep 6 00:08:56.056431 amazon-ssm-agent[2060]: 2025-09-06 00:08:56 INFO EC2RoleProvider Successfully connected with instance profile role credentials
Sep 6 00:08:56.118213 amazon-ssm-agent[2060]: 2025-09-06 00:08:56 INFO [CredentialRefresher] Next credential rotation will be in 30.8499707244 minutes
Sep 6 00:08:56.469805 ntpd[1984]: Listen normally on 6 eth0 [fe80::43e:27ff:fedf:247%2]:123
Sep 6 00:08:56.470895 ntpd[1984]: 6 Sep 00:08:56 ntpd[1984]: Listen normally on 6 eth0 [fe80::43e:27ff:fedf:247%2]:123
Sep 6 00:08:56.522943 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 6 00:08:56.534113 (kubelet)[2215]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 6 00:08:57.115467 amazon-ssm-agent[2060]: 2025-09-06 00:08:57 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker is not running, starting worker process
Sep 6 00:08:57.217316 amazon-ssm-agent[2060]: 2025-09-06 00:08:57 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker (pid:2222) started
Sep 6 00:08:57.318547 amazon-ssm-agent[2060]: 2025-09-06 00:08:57 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] Monitor long running worker health every 60 seconds
Sep 6 00:08:57.530362 kubelet[2215]: E0906 00:08:57.530230 2215 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 6 00:08:57.535988 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 6 00:08:57.537311 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 6 00:08:57.538057 systemd[1]: kubelet.service: Consumed 1.375s CPU time.
Sep 6 00:08:59.498206 sshd_keygen[2025]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 6 00:08:59.538604 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 6 00:08:59.549094 systemd[1]: Starting issuegen.service - Generate /run/issue...
Sep 6 00:08:59.563183 systemd[1]: Started sshd@0-172.31.26.146:22-139.178.68.195:60894.service - OpenSSH per-connection server daemon (139.178.68.195:60894).
Sep 6 00:08:59.579198 systemd[1]: issuegen.service: Deactivated successfully.
Sep 6 00:08:59.579627 systemd[1]: Finished issuegen.service - Generate /run/issue.
Sep 6 00:08:59.595708 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Sep 6 00:08:59.625071 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Sep 6 00:08:59.633163 systemd[1]: Started getty@tty1.service - Getty on tty1.
Sep 6 00:08:59.643162 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Sep 6 00:08:59.647805 systemd[1]: Reached target getty.target - Login Prompts.
Sep 6 00:08:59.652576 systemd[1]: Reached target multi-user.target - Multi-User System.
Sep 6 00:08:59.656249 systemd[1]: Startup finished in 1.177s (kernel) + 8.741s (initrd) + 12.152s (userspace) = 22.071s.
Sep 6 00:08:59.799754 sshd[2241]: Accepted publickey for core from 139.178.68.195 port 60894 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4
Sep 6 00:08:59.802881 sshd[2241]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 6 00:08:59.819476 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Sep 6 00:08:59.826034 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Sep 6 00:08:59.830318 systemd-logind[1990]: New session 1 of user core.
Sep 6 00:08:59.863762 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Sep 6 00:08:59.873074 systemd[1]: Starting user@500.service - User Manager for UID 500...
Sep 6 00:08:59.893995 (systemd)[2256]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Sep 6 00:09:00.123431 systemd[2256]: Queued start job for default target default.target.
Sep 6 00:09:00.130580 systemd[2256]: Created slice app.slice - User Application Slice.
Sep 6 00:09:00.130648 systemd[2256]: Reached target paths.target - Paths.
Sep 6 00:09:00.130682 systemd[2256]: Reached target timers.target - Timers.
Sep 6 00:09:00.133179 systemd[2256]: Starting dbus.socket - D-Bus User Message Bus Socket...
Sep 6 00:09:00.154761 systemd[2256]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Sep 6 00:09:00.155035 systemd[2256]: Reached target sockets.target - Sockets.
Sep 6 00:09:00.155069 systemd[2256]: Reached target basic.target - Basic System.
Sep 6 00:09:00.155162 systemd[2256]: Reached target default.target - Main User Target.
Sep 6 00:09:00.155227 systemd[2256]: Startup finished in 250ms.
Sep 6 00:09:00.155392 systemd[1]: Started user@500.service - User Manager for UID 500.
Sep 6 00:09:00.163808 systemd[1]: Started session-1.scope - Session 1 of User core.
Sep 6 00:09:00.324246 systemd[1]: Started sshd@1-172.31.26.146:22-139.178.68.195:38084.service - OpenSSH per-connection server daemon (139.178.68.195:38084).
Sep 6 00:09:00.791398 systemd-resolved[1810]: Clock change detected. Flushing caches.
Sep 6 00:09:00.826532 sshd[2267]: Accepted publickey for core from 139.178.68.195 port 38084 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4
Sep 6 00:09:00.829198 sshd[2267]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 6 00:09:00.838050 systemd-logind[1990]: New session 2 of user core.
Sep 6 00:09:00.841441 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 6 00:09:00.967538 sshd[2267]: pam_unix(sshd:session): session closed for user core
Sep 6 00:09:00.973464 systemd-logind[1990]: Session 2 logged out. Waiting for processes to exit.
Sep 6 00:09:00.975257 systemd[1]: sshd@1-172.31.26.146:22-139.178.68.195:38084.service: Deactivated successfully.
Sep 6 00:09:00.978941 systemd[1]: session-2.scope: Deactivated successfully.
Sep 6 00:09:00.980477 systemd-logind[1990]: Removed session 2.
Sep 6 00:09:01.004677 systemd[1]: Started sshd@2-172.31.26.146:22-139.178.68.195:38090.service - OpenSSH per-connection server daemon (139.178.68.195:38090).
Sep 6 00:09:01.184894 sshd[2274]: Accepted publickey for core from 139.178.68.195 port 38090 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4
Sep 6 00:09:01.186831 sshd[2274]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 6 00:09:01.195539 systemd-logind[1990]: New session 3 of user core.
Sep 6 00:09:01.205426 systemd[1]: Started session-3.scope - Session 3 of User core.
Sep 6 00:09:01.324848 sshd[2274]: pam_unix(sshd:session): session closed for user core
Sep 6 00:09:01.330648 systemd[1]: sshd@2-172.31.26.146:22-139.178.68.195:38090.service: Deactivated successfully.
Sep 6 00:09:01.334194 systemd[1]: session-3.scope: Deactivated successfully.
Sep 6 00:09:01.335615 systemd-logind[1990]: Session 3 logged out. Waiting for processes to exit.
Sep 6 00:09:01.337696 systemd-logind[1990]: Removed session 3.
Sep 6 00:09:01.367677 systemd[1]: Started sshd@3-172.31.26.146:22-139.178.68.195:38106.service - OpenSSH per-connection server daemon (139.178.68.195:38106).
Sep 6 00:09:01.535083 sshd[2281]: Accepted publickey for core from 139.178.68.195 port 38106 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4
Sep 6 00:09:01.537589 sshd[2281]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 6 00:09:01.546467 systemd-logind[1990]: New session 4 of user core.
Sep 6 00:09:01.554402 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 6 00:09:01.680210 sshd[2281]: pam_unix(sshd:session): session closed for user core
Sep 6 00:09:01.685931 systemd[1]: sshd@3-172.31.26.146:22-139.178.68.195:38106.service: Deactivated successfully.
Sep 6 00:09:01.689112 systemd[1]: session-4.scope: Deactivated successfully.
Sep 6 00:09:01.691188 systemd-logind[1990]: Session 4 logged out. Waiting for processes to exit.
Sep 6 00:09:01.692978 systemd-logind[1990]: Removed session 4.
Sep 6 00:09:01.721651 systemd[1]: Started sshd@4-172.31.26.146:22-139.178.68.195:38112.service - OpenSSH per-connection server daemon (139.178.68.195:38112).
Sep 6 00:09:01.889158 sshd[2288]: Accepted publickey for core from 139.178.68.195 port 38112 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4
Sep 6 00:09:01.891979 sshd[2288]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 6 00:09:01.899539 systemd-logind[1990]: New session 5 of user core.
Sep 6 00:09:01.907429 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 6 00:09:02.048518 sudo[2291]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 6 00:09:02.049200 sudo[2291]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 6 00:09:02.064257 sudo[2291]: pam_unix(sudo:session): session closed for user root
Sep 6 00:09:02.088249 sshd[2288]: pam_unix(sshd:session): session closed for user core
Sep 6 00:09:02.094338 systemd-logind[1990]: Session 5 logged out. Waiting for processes to exit.
Sep 6 00:09:02.095256 systemd[1]: sshd@4-172.31.26.146:22-139.178.68.195:38112.service: Deactivated successfully.
Sep 6 00:09:02.097881 systemd[1]: session-5.scope: Deactivated successfully.
Sep 6 00:09:02.101782 systemd-logind[1990]: Removed session 5.
Sep 6 00:09:02.127641 systemd[1]: Started sshd@5-172.31.26.146:22-139.178.68.195:38128.service - OpenSSH per-connection server daemon (139.178.68.195:38128).
Sep 6 00:09:02.306819 sshd[2297]: Accepted publickey for core from 139.178.68.195 port 38128 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4
Sep 6 00:09:02.309784 sshd[2297]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 6 00:09:02.319465 systemd-logind[1990]: New session 6 of user core.
Sep 6 00:09:02.323434 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 6 00:09:02.428524 sudo[2301]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 6 00:09:02.429138 sudo[2301]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 6 00:09:02.435313 sudo[2301]: pam_unix(sudo:session): session closed for user root
Sep 6 00:09:02.445366 sudo[2300]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules
Sep 6 00:09:02.446039 sudo[2300]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 6 00:09:02.467679 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules...
Sep 6 00:09:02.480089 auditctl[2304]: No rules
Sep 6 00:09:02.481196 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 6 00:09:02.481551 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules.
Sep 6 00:09:02.494820 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Sep 6 00:09:02.536663 augenrules[2322]: No rules
Sep 6 00:09:02.540237 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Sep 6 00:09:02.542909 sudo[2300]: pam_unix(sudo:session): session closed for user root
Sep 6 00:09:02.566620 sshd[2297]: pam_unix(sshd:session): session closed for user core
Sep 6 00:09:02.573385 systemd[1]: sshd@5-172.31.26.146:22-139.178.68.195:38128.service: Deactivated successfully.
Sep 6 00:09:02.577042 systemd[1]: session-6.scope: Deactivated successfully.
Sep 6 00:09:02.579322 systemd-logind[1990]: Session 6 logged out. Waiting for processes to exit.
Sep 6 00:09:02.581311 systemd-logind[1990]: Removed session 6.
Sep 6 00:09:02.605661 systemd[1]: Started sshd@6-172.31.26.146:22-139.178.68.195:38138.service - OpenSSH per-connection server daemon (139.178.68.195:38138).
Sep 6 00:09:02.790520 sshd[2330]: Accepted publickey for core from 139.178.68.195 port 38138 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4
Sep 6 00:09:02.793064 sshd[2330]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 6 00:09:02.801501 systemd-logind[1990]: New session 7 of user core.
Sep 6 00:09:02.811445 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 6 00:09:02.917271 sudo[2333]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 6 00:09:02.917983 sudo[2333]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 6 00:09:03.898645 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 6 00:09:03.900980 (dockerd)[2349]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 6 00:09:04.453660 dockerd[2349]: time="2025-09-06T00:09:04.453563822Z" level=info msg="Starting up"
Sep 6 00:09:04.697961 dockerd[2349]: time="2025-09-06T00:09:04.697879875Z" level=info msg="Loading containers: start."
Sep 6 00:09:04.931204 kernel: Initializing XFRM netlink socket
Sep 6 00:09:04.974930 (udev-worker)[2371]: Network interface NamePolicy= disabled on kernel command line.
Sep 6 00:09:05.064411 systemd-networkd[1845]: docker0: Link UP
Sep 6 00:09:05.092516 dockerd[2349]: time="2025-09-06T00:09:05.092442961Z" level=info msg="Loading containers: done."
Sep 6 00:09:05.114632 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck108147087-merged.mount: Deactivated successfully.
Sep 6 00:09:05.129879 dockerd[2349]: time="2025-09-06T00:09:05.129816721Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 6 00:09:05.130254 dockerd[2349]: time="2025-09-06T00:09:05.129965413Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0
Sep 6 00:09:05.130254 dockerd[2349]: time="2025-09-06T00:09:05.130171393Z" level=info msg="Daemon has completed initialization"
Sep 6 00:09:05.202547 dockerd[2349]: time="2025-09-06T00:09:05.202231621Z" level=info msg="API listen on /run/docker.sock"
Sep 6 00:09:05.204450 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 6 00:09:06.379258 containerd[2004]: time="2025-09-06T00:09:06.378794319Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\""
Sep 6 00:09:07.085512 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1721679468.mount: Deactivated successfully.
Sep 6 00:09:08.108377 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 6 00:09:08.117542 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 6 00:09:08.526655 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 6 00:09:08.537046 (kubelet)[2550]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 6 00:09:08.652183 kubelet[2550]: E0906 00:09:08.651338 2550 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 6 00:09:08.659061 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 6 00:09:08.659572 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 6 00:09:08.750215 containerd[2004]: time="2025-09-06T00:09:08.749587207Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:09:08.751883 containerd[2004]: time="2025-09-06T00:09:08.751806343Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.8: active requests=0, bytes read=26328357"
Sep 6 00:09:08.755059 containerd[2004]: time="2025-09-06T00:09:08.754990483Z" level=info msg="ImageCreate event name:\"sha256:61d628eec7e2101b908b4476f1e8e620490a9e8754184860c8eed25183acaa8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:09:08.761273 containerd[2004]: time="2025-09-06T00:09:08.760469431Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:09:08.762803 containerd[2004]: time="2025-09-06T00:09:08.762737467Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.8\" with image id \"sha256:61d628eec7e2101b908b4476f1e8e620490a9e8754184860c8eed25183acaa8a\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\", size \"26325157\" in 2.383873272s"
Sep 6 00:09:08.762923 containerd[2004]: time="2025-09-06T00:09:08.762802063Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\" returns image reference \"sha256:61d628eec7e2101b908b4476f1e8e620490a9e8754184860c8eed25183acaa8a\""
Sep 6 00:09:08.763825 containerd[2004]: time="2025-09-06T00:09:08.763772635Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\""
Sep 6 00:09:10.417192 containerd[2004]: time="2025-09-06T00:09:10.416266183Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:09:10.419681 containerd[2004]: time="2025-09-06T00:09:10.419635483Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.8: active requests=0, bytes read=22528552"
Sep 6 00:09:10.420237 containerd[2004]: time="2025-09-06T00:09:10.420196099Z" level=info msg="ImageCreate event name:\"sha256:f17de36e40fc7cc372be0021b2c58ad61f05d3ebe4d430551bc5e4cd9ed2a061\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:09:10.425850 containerd[2004]: time="2025-09-06T00:09:10.425788387Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:09:10.428222 containerd[2004]: time="2025-09-06T00:09:10.428135251Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.8\" with image id \"sha256:f17de36e40fc7cc372be0021b2c58ad61f05d3ebe4d430551bc5e4cd9ed2a061\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\", size \"24065666\" in 1.664304884s"
Sep 6 00:09:10.428367 containerd[2004]: time="2025-09-06T00:09:10.428219623Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\" returns image reference \"sha256:f17de36e40fc7cc372be0021b2c58ad61f05d3ebe4d430551bc5e4cd9ed2a061\""
Sep 6 00:09:10.430187 containerd[2004]: time="2025-09-06T00:09:10.429410635Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\""
Sep 6 00:09:11.869214 containerd[2004]: time="2025-09-06T00:09:11.868667567Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:09:11.871123 containerd[2004]: time="2025-09-06T00:09:11.870882863Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.8: active requests=0, bytes read=17483527"
Sep 6 00:09:11.875175 containerd[2004]: time="2025-09-06T00:09:11.873373007Z" level=info msg="ImageCreate event name:\"sha256:fe86d26bce3ccd5f0c4057c205b63fde1c8c752778025aea4605ffc3b0f80211\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:09:11.879720 containerd[2004]: time="2025-09-06T00:09:11.879655631Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:09:11.881898 containerd[2004]: time="2025-09-06T00:09:11.881842067Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.8\" with image id \"sha256:fe86d26bce3ccd5f0c4057c205b63fde1c8c752778025aea4605ffc3b0f80211\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\", size \"19020659\" in 1.452367364s"
Sep 6 00:09:11.882079 containerd[2004]: time="2025-09-06T00:09:11.882046571Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\" returns image reference \"sha256:fe86d26bce3ccd5f0c4057c205b63fde1c8c752778025aea4605ffc3b0f80211\""
Sep 6 00:09:11.883028 containerd[2004]: time="2025-09-06T00:09:11.882924203Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\""
Sep 6 00:09:13.315755 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount130744445.mount: Deactivated successfully.
Sep 6 00:09:13.877487 containerd[2004]: time="2025-09-06T00:09:13.877403304Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:09:13.879785 containerd[2004]: time="2025-09-06T00:09:13.879716785Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.8: active requests=0, bytes read=27376724"
Sep 6 00:09:13.881409 containerd[2004]: time="2025-09-06T00:09:13.881340913Z" level=info msg="ImageCreate event name:\"sha256:2cf30e39f99f8f4ee1a736a4f3175cc2d8d3f58936d8fa83ec5523658fdc7b8b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:09:13.885174 containerd[2004]: time="2025-09-06T00:09:13.884531941Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:09:13.886087 containerd[2004]: time="2025-09-06T00:09:13.886029349Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.8\" with image id \"sha256:2cf30e39f99f8f4ee1a736a4f3175cc2d8d3f58936d8fa83ec5523658fdc7b8b\", repo tag \"registry.k8s.io/kube-proxy:v1.32.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\", size \"27375743\" in 2.001950962s"
Sep 6 00:09:13.886232 containerd[2004]: time="2025-09-06T00:09:13.886085737Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\" returns image reference \"sha256:2cf30e39f99f8f4ee1a736a4f3175cc2d8d3f58936d8fa83ec5523658fdc7b8b\""
Sep 6 00:09:13.886909 containerd[2004]: time="2025-09-06T00:09:13.886855141Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 6 00:09:14.496371 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2016436700.mount: Deactivated successfully.
Sep 6 00:09:15.733188 containerd[2004]: time="2025-09-06T00:09:15.731412458Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:09:15.735378 containerd[2004]: time="2025-09-06T00:09:15.735329246Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951622"
Sep 6 00:09:15.738563 containerd[2004]: time="2025-09-06T00:09:15.738515522Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:09:15.745316 containerd[2004]: time="2025-09-06T00:09:15.745250606Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:09:15.748125 containerd[2004]: time="2025-09-06T00:09:15.748049942Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.861132173s"
Sep 6 00:09:15.748125 containerd[2004]: time="2025-09-06T00:09:15.748120574Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\""
Sep 6 00:09:15.748948 containerd[2004]: time="2025-09-06T00:09:15.748898078Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 6 00:09:16.381531 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1107861281.mount: Deactivated successfully.
Sep 6 00:09:16.397213 containerd[2004]: time="2025-09-06T00:09:16.396259729Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:09:16.398828 containerd[2004]: time="2025-09-06T00:09:16.398413669Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703"
Sep 6 00:09:16.401113 containerd[2004]: time="2025-09-06T00:09:16.401015617Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:09:16.413208 containerd[2004]: time="2025-09-06T00:09:16.411286261Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:09:16.413856 containerd[2004]: time="2025-09-06T00:09:16.413785177Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 664.824735ms"
Sep 6 00:09:16.413970 containerd[2004]: time="2025-09-06T00:09:16.413853253Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Sep 6 00:09:16.414587 containerd[2004]: time="2025-09-06T00:09:16.414467161Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\""
Sep 6 00:09:17.162185 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3938607623.mount: Deactivated successfully.
Sep 6 00:09:18.909850 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 6 00:09:18.924552 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 6 00:09:19.313378 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 6 00:09:19.325705 (kubelet)[2690]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 6 00:09:19.421106 kubelet[2690]: E0906 00:09:19.420123 2690 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 6 00:09:19.425084 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 6 00:09:19.425477 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 6 00:09:19.660162 containerd[2004]: time="2025-09-06T00:09:19.659262485Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:09:19.662445 containerd[2004]: time="2025-09-06T00:09:19.662373713Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=67943165" Sep 6 00:09:19.665054 containerd[2004]: time="2025-09-06T00:09:19.664981949Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:09:19.673979 containerd[2004]: time="2025-09-06T00:09:19.673874429Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:09:19.676496 containerd[2004]: time="2025-09-06T00:09:19.676431377Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 3.261904984s" Sep 6 00:09:19.677004 containerd[2004]: time="2025-09-06T00:09:19.676658105Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\"" Sep 6 00:09:25.089901 systemd[1]: systemd-hostnamed.service: Deactivated successfully. Sep 6 00:09:26.677192 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 6 00:09:26.688666 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 6 00:09:26.751953 systemd[1]: Reloading requested from client PID 2730 ('systemctl') (unit session-7.scope)... 
Sep 6 00:09:26.752201 systemd[1]: Reloading... Sep 6 00:09:26.987215 zram_generator::config[2773]: No configuration found. Sep 6 00:09:27.225327 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 6 00:09:27.398619 systemd[1]: Reloading finished in 645 ms. Sep 6 00:09:27.489126 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 6 00:09:27.497929 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 6 00:09:27.501126 systemd[1]: kubelet.service: Deactivated successfully. Sep 6 00:09:27.501646 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 6 00:09:27.509677 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 6 00:09:27.835021 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 6 00:09:27.851657 (kubelet)[2835]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 6 00:09:27.927625 kubelet[2835]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 6 00:09:27.928113 kubelet[2835]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 6 00:09:27.928242 kubelet[2835]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 6 00:09:27.928514 kubelet[2835]: I0906 00:09:27.928462 2835 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 6 00:09:29.020047 kubelet[2835]: I0906 00:09:29.019951 2835 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 6 00:09:29.020047 kubelet[2835]: I0906 00:09:29.020004 2835 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 6 00:09:29.020737 kubelet[2835]: I0906 00:09:29.020527 2835 server.go:954] "Client rotation is on, will bootstrap in background" Sep 6 00:09:29.180114 kubelet[2835]: E0906 00:09:29.180053 2835 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://172.31.26.146:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 172.31.26.146:6443: connect: connection refused" logger="UnhandledError" Sep 6 00:09:29.189680 kubelet[2835]: I0906 00:09:29.189357 2835 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 6 00:09:29.194912 kubelet[2835]: E0906 00:09:29.194843 2835 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Sep 6 00:09:29.194912 kubelet[2835]: I0906 00:09:29.194895 2835 server.go:1421] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Sep 6 00:09:29.204185 kubelet[2835]: I0906 00:09:29.203842 2835 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 6 00:09:29.205553 kubelet[2835]: I0906 00:09:29.205483 2835 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 6 00:09:29.205933 kubelet[2835]: I0906 00:09:29.205659 2835 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-26-146","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 6 00:09:29.206792 kubelet[2835]: I0906 00:09:29.206264 2835 topology_manager.go:138] "Creating topology manager with none 
policy" Sep 6 00:09:29.206792 kubelet[2835]: I0906 00:09:29.206289 2835 container_manager_linux.go:304] "Creating device plugin manager" Sep 6 00:09:29.206792 kubelet[2835]: I0906 00:09:29.206643 2835 state_mem.go:36] "Initialized new in-memory state store" Sep 6 00:09:29.214369 kubelet[2835]: I0906 00:09:29.214308 2835 kubelet.go:446] "Attempting to sync node with API server" Sep 6 00:09:29.214369 kubelet[2835]: I0906 00:09:29.214358 2835 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 6 00:09:29.214559 kubelet[2835]: I0906 00:09:29.214396 2835 kubelet.go:352] "Adding apiserver pod source" Sep 6 00:09:29.214559 kubelet[2835]: I0906 00:09:29.214418 2835 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 6 00:09:29.217082 kubelet[2835]: W0906 00:09:29.216555 2835 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.31.26.146:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-26-146&limit=500&resourceVersion=0": dial tcp 172.31.26.146:6443: connect: connection refused Sep 6 00:09:29.217082 kubelet[2835]: E0906 00:09:29.216682 2835 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://172.31.26.146:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-26-146&limit=500&resourceVersion=0\": dial tcp 172.31.26.146:6443: connect: connection refused" logger="UnhandledError" Sep 6 00:09:29.221449 kubelet[2835]: W0906 00:09:29.221381 2835 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.31.26.146:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.31.26.146:6443: connect: connection refused Sep 6 00:09:29.221833 kubelet[2835]: E0906 00:09:29.221671 2835 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Service: failed to list *v1.Service: Get \"https://172.31.26.146:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.31.26.146:6443: connect: connection refused" logger="UnhandledError" Sep 6 00:09:29.222591 kubelet[2835]: I0906 00:09:29.222559 2835 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Sep 6 00:09:29.224181 kubelet[2835]: I0906 00:09:29.223736 2835 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 6 00:09:29.224181 kubelet[2835]: W0906 00:09:29.223962 2835 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 6 00:09:29.227787 kubelet[2835]: I0906 00:09:29.227730 2835 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 6 00:09:29.227971 kubelet[2835]: I0906 00:09:29.227952 2835 server.go:1287] "Started kubelet" Sep 6 00:09:29.234720 kubelet[2835]: I0906 00:09:29.234682 2835 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 6 00:09:29.240094 kubelet[2835]: I0906 00:09:29.240010 2835 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 6 00:09:29.241691 kubelet[2835]: I0906 00:09:29.241645 2835 server.go:479] "Adding debug handlers to kubelet server" Sep 6 00:09:29.243437 kubelet[2835]: I0906 00:09:29.243346 2835 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 6 00:09:29.243752 kubelet[2835]: I0906 00:09:29.243712 2835 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 6 00:09:29.246596 kubelet[2835]: I0906 00:09:29.246556 2835 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 6 00:09:29.248453 kubelet[2835]: E0906 00:09:29.248286 2835 kubelet_node_status.go:466] "Error getting the current node from lister" err="node 
\"ip-172-31-26-146\" not found" Sep 6 00:09:29.251369 kubelet[2835]: I0906 00:09:29.251333 2835 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 6 00:09:29.254205 kubelet[2835]: I0906 00:09:29.252817 2835 factory.go:221] Registration of the systemd container factory successfully Sep 6 00:09:29.254205 kubelet[2835]: I0906 00:09:29.252990 2835 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 6 00:09:29.254205 kubelet[2835]: E0906 00:09:29.254041 2835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.26.146:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-26-146?timeout=10s\": dial tcp 172.31.26.146:6443: connect: connection refused" interval="200ms" Sep 6 00:09:29.256642 kubelet[2835]: I0906 00:09:29.256595 2835 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 6 00:09:29.256642 kubelet[2835]: E0906 00:09:29.256069 2835 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.31.26.146:6443/api/v1/namespaces/default/events\": dial tcp 172.31.26.146:6443: connect: connection refused" event="&Event{ObjectMeta:{ip-172-31-26-146.186288f00d7fc571 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-172-31-26-146,UID:ip-172-31-26-146,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-172-31-26-146,},FirstTimestamp:2025-09-06 00:09:29.227920753 +0000 UTC m=+1.369162904,LastTimestamp:2025-09-06 00:09:29.227920753 +0000 UTC m=+1.369162904,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-172-31-26-146,}" Sep 6 00:09:29.256891 kubelet[2835]: I0906 00:09:29.256731 2835 reconciler.go:26] "Reconciler: start to sync state" Sep 6 00:09:29.257740 kubelet[2835]: E0906 00:09:29.257698 2835 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 6 00:09:29.259103 kubelet[2835]: I0906 00:09:29.259058 2835 factory.go:221] Registration of the containerd container factory successfully Sep 6 00:09:29.280343 kubelet[2835]: W0906 00:09:29.279996 2835 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.31.26.146:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.26.146:6443: connect: connection refused Sep 6 00:09:29.280343 kubelet[2835]: E0906 00:09:29.280179 2835 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.31.26.146:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.31.26.146:6443: connect: connection refused" logger="UnhandledError" Sep 6 00:09:29.285308 kubelet[2835]: I0906 00:09:29.285023 2835 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 6 00:09:29.288546 kubelet[2835]: I0906 00:09:29.288505 2835 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 6 00:09:29.288816 kubelet[2835]: I0906 00:09:29.288794 2835 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 6 00:09:29.288972 kubelet[2835]: I0906 00:09:29.288951 2835 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Sep 6 00:09:29.289986 kubelet[2835]: I0906 00:09:29.289049 2835 kubelet.go:2382] "Starting kubelet main sync loop" Sep 6 00:09:29.289986 kubelet[2835]: E0906 00:09:29.289124 2835 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 6 00:09:29.289986 kubelet[2835]: W0906 00:09:29.289875 2835 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.31.26.146:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.26.146:6443: connect: connection refused Sep 6 00:09:29.289986 kubelet[2835]: E0906 00:09:29.289944 2835 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.31.26.146:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.31.26.146:6443: connect: connection refused" logger="UnhandledError" Sep 6 00:09:29.303942 kubelet[2835]: I0906 00:09:29.303890 2835 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 6 00:09:29.303942 kubelet[2835]: I0906 00:09:29.303930 2835 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 6 00:09:29.304133 kubelet[2835]: I0906 00:09:29.303992 2835 state_mem.go:36] "Initialized new in-memory state store" Sep 6 00:09:29.310027 kubelet[2835]: I0906 00:09:29.309964 2835 policy_none.go:49] "None policy: Start" Sep 6 00:09:29.310027 kubelet[2835]: I0906 00:09:29.310017 2835 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 6 00:09:29.310263 kubelet[2835]: I0906 00:09:29.310043 2835 state_mem.go:35] "Initializing new in-memory state store" Sep 6 00:09:29.323253 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 6 00:09:29.337520 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
Sep 6 00:09:29.346089 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 6 00:09:29.348735 kubelet[2835]: E0906 00:09:29.348662 2835 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ip-172-31-26-146\" not found" Sep 6 00:09:29.356724 kubelet[2835]: I0906 00:09:29.356689 2835 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 6 00:09:29.358321 kubelet[2835]: I0906 00:09:29.357115 2835 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 6 00:09:29.358321 kubelet[2835]: I0906 00:09:29.357178 2835 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 6 00:09:29.358321 kubelet[2835]: I0906 00:09:29.357480 2835 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 6 00:09:29.359848 kubelet[2835]: E0906 00:09:29.359811 2835 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 6 00:09:29.360102 kubelet[2835]: E0906 00:09:29.360075 2835 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-172-31-26-146\" not found" Sep 6 00:09:29.411661 systemd[1]: Created slice kubepods-burstable-pod8a5c9410e7c7140fb3f4768ba731efe1.slice - libcontainer container kubepods-burstable-pod8a5c9410e7c7140fb3f4768ba731efe1.slice. Sep 6 00:09:29.426990 kubelet[2835]: E0906 00:09:29.426630 2835 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-26-146\" not found" node="ip-172-31-26-146" Sep 6 00:09:29.435079 systemd[1]: Created slice kubepods-burstable-pode7ce45773dca75e5c299ea76d927be08.slice - libcontainer container kubepods-burstable-pode7ce45773dca75e5c299ea76d927be08.slice. 
Sep 6 00:09:29.440186 kubelet[2835]: E0906 00:09:29.440052 2835 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-26-146\" not found" node="ip-172-31-26-146" Sep 6 00:09:29.444707 systemd[1]: Created slice kubepods-burstable-podec6e412e6a92edfbc1fab4c3b9e2d8d8.slice - libcontainer container kubepods-burstable-podec6e412e6a92edfbc1fab4c3b9e2d8d8.slice. Sep 6 00:09:29.448338 kubelet[2835]: E0906 00:09:29.448293 2835 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-26-146\" not found" node="ip-172-31-26-146" Sep 6 00:09:29.454768 kubelet[2835]: E0906 00:09:29.454693 2835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.26.146:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-26-146?timeout=10s\": dial tcp 172.31.26.146:6443: connect: connection refused" interval="400ms" Sep 6 00:09:29.459437 kubelet[2835]: I0906 00:09:29.458804 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8a5c9410e7c7140fb3f4768ba731efe1-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-26-146\" (UID: \"8a5c9410e7c7140fb3f4768ba731efe1\") " pod="kube-system/kube-apiserver-ip-172-31-26-146" Sep 6 00:09:29.459437 kubelet[2835]: I0906 00:09:29.458912 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e7ce45773dca75e5c299ea76d927be08-kubeconfig\") pod \"kube-controller-manager-ip-172-31-26-146\" (UID: \"e7ce45773dca75e5c299ea76d927be08\") " pod="kube-system/kube-controller-manager-ip-172-31-26-146" Sep 6 00:09:29.459437 kubelet[2835]: I0906 00:09:29.458995 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e7ce45773dca75e5c299ea76d927be08-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-26-146\" (UID: \"e7ce45773dca75e5c299ea76d927be08\") " pod="kube-system/kube-controller-manager-ip-172-31-26-146" Sep 6 00:09:29.459437 kubelet[2835]: I0906 00:09:29.459064 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ec6e412e6a92edfbc1fab4c3b9e2d8d8-kubeconfig\") pod \"kube-scheduler-ip-172-31-26-146\" (UID: \"ec6e412e6a92edfbc1fab4c3b9e2d8d8\") " pod="kube-system/kube-scheduler-ip-172-31-26-146" Sep 6 00:09:29.459437 kubelet[2835]: I0906 00:09:29.459119 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8a5c9410e7c7140fb3f4768ba731efe1-ca-certs\") pod \"kube-apiserver-ip-172-31-26-146\" (UID: \"8a5c9410e7c7140fb3f4768ba731efe1\") " pod="kube-system/kube-apiserver-ip-172-31-26-146" Sep 6 00:09:29.459742 kubelet[2835]: I0906 00:09:29.459196 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8a5c9410e7c7140fb3f4768ba731efe1-k8s-certs\") pod \"kube-apiserver-ip-172-31-26-146\" (UID: \"8a5c9410e7c7140fb3f4768ba731efe1\") " pod="kube-system/kube-apiserver-ip-172-31-26-146" Sep 6 00:09:29.459742 kubelet[2835]: I0906 00:09:29.459257 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e7ce45773dca75e5c299ea76d927be08-ca-certs\") pod \"kube-controller-manager-ip-172-31-26-146\" (UID: \"e7ce45773dca75e5c299ea76d927be08\") " pod="kube-system/kube-controller-manager-ip-172-31-26-146" Sep 6 00:09:29.459742 kubelet[2835]: I0906 00:09:29.459296 2835 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e7ce45773dca75e5c299ea76d927be08-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-26-146\" (UID: \"e7ce45773dca75e5c299ea76d927be08\") " pod="kube-system/kube-controller-manager-ip-172-31-26-146" Sep 6 00:09:29.459742 kubelet[2835]: I0906 00:09:29.459334 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e7ce45773dca75e5c299ea76d927be08-k8s-certs\") pod \"kube-controller-manager-ip-172-31-26-146\" (UID: \"e7ce45773dca75e5c299ea76d927be08\") " pod="kube-system/kube-controller-manager-ip-172-31-26-146" Sep 6 00:09:29.460679 kubelet[2835]: I0906 00:09:29.460648 2835 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-26-146" Sep 6 00:09:29.461753 kubelet[2835]: E0906 00:09:29.461699 2835 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://172.31.26.146:6443/api/v1/nodes\": dial tcp 172.31.26.146:6443: connect: connection refused" node="ip-172-31-26-146" Sep 6 00:09:29.665203 kubelet[2835]: I0906 00:09:29.663575 2835 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-26-146" Sep 6 00:09:29.665203 kubelet[2835]: E0906 00:09:29.664035 2835 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://172.31.26.146:6443/api/v1/nodes\": dial tcp 172.31.26.146:6443: connect: connection refused" node="ip-172-31-26-146" Sep 6 00:09:29.729104 containerd[2004]: time="2025-09-06T00:09:29.729053043Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-26-146,Uid:8a5c9410e7c7140fb3f4768ba731efe1,Namespace:kube-system,Attempt:0,}" Sep 6 00:09:29.742338 containerd[2004]: time="2025-09-06T00:09:29.742265859Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-26-146,Uid:e7ce45773dca75e5c299ea76d927be08,Namespace:kube-system,Attempt:0,}" Sep 6 00:09:29.750176 containerd[2004]: time="2025-09-06T00:09:29.749803731Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-26-146,Uid:ec6e412e6a92edfbc1fab4c3b9e2d8d8,Namespace:kube-system,Attempt:0,}" Sep 6 00:09:29.855324 kubelet[2835]: E0906 00:09:29.855274 2835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.26.146:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-26-146?timeout=10s\": dial tcp 172.31.26.146:6443: connect: connection refused" interval="800ms" Sep 6 00:09:30.066275 kubelet[2835]: I0906 00:09:30.066228 2835 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-26-146" Sep 6 00:09:30.067023 kubelet[2835]: E0906 00:09:30.066974 2835 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://172.31.26.146:6443/api/v1/nodes\": dial tcp 172.31.26.146:6443: connect: connection refused" node="ip-172-31-26-146" Sep 6 00:09:30.135897 kubelet[2835]: W0906 00:09:30.135703 2835 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.31.26.146:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-26-146&limit=500&resourceVersion=0": dial tcp 172.31.26.146:6443: connect: connection refused Sep 6 00:09:30.135897 kubelet[2835]: E0906 00:09:30.135808 2835 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://172.31.26.146:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-26-146&limit=500&resourceVersion=0\": dial tcp 172.31.26.146:6443: connect: connection refused" logger="UnhandledError" Sep 6 00:09:30.177870 kubelet[2835]: W0906 00:09:30.177715 2835 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to 
list *v1.RuntimeClass: Get "https://172.31.26.146:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.26.146:6443: connect: connection refused Sep 6 00:09:30.177870 kubelet[2835]: E0906 00:09:30.177810 2835 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.31.26.146:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.31.26.146:6443: connect: connection refused" logger="UnhandledError" Sep 6 00:09:30.199887 kubelet[2835]: W0906 00:09:30.199804 2835 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.31.26.146:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.31.26.146:6443: connect: connection refused Sep 6 00:09:30.200259 kubelet[2835]: E0906 00:09:30.200195 2835 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.31.26.146:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.31.26.146:6443: connect: connection refused" logger="UnhandledError" Sep 6 00:09:30.300553 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3738287372.mount: Deactivated successfully. 
Sep 6 00:09:30.317337 containerd[2004]: time="2025-09-06T00:09:30.316681754Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 6 00:09:30.319397 containerd[2004]: time="2025-09-06T00:09:30.319326722Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 6 00:09:30.321325 containerd[2004]: time="2025-09-06T00:09:30.321227210Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269173" Sep 6 00:09:30.323286 containerd[2004]: time="2025-09-06T00:09:30.323235206Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Sep 6 00:09:30.325400 containerd[2004]: time="2025-09-06T00:09:30.325349750Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 6 00:09:30.328461 containerd[2004]: time="2025-09-06T00:09:30.328141454Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 6 00:09:30.329860 containerd[2004]: time="2025-09-06T00:09:30.329758430Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Sep 6 00:09:30.334182 containerd[2004]: time="2025-09-06T00:09:30.334078202Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 6 00:09:30.338421 
containerd[2004]: time="2025-09-06T00:09:30.338092550Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 595.715427ms" Sep 6 00:09:30.342501 containerd[2004]: time="2025-09-06T00:09:30.342422762Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 612.636087ms" Sep 6 00:09:30.359771 kubelet[2835]: W0906 00:09:30.359664 2835 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.31.26.146:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.26.146:6443: connect: connection refused Sep 6 00:09:30.359948 kubelet[2835]: E0906 00:09:30.359793 2835 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.31.26.146:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.31.26.146:6443: connect: connection refused" logger="UnhandledError" Sep 6 00:09:30.363340 containerd[2004]: time="2025-09-06T00:09:30.362540666Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 612.628539ms" Sep 6 00:09:30.643006 kubelet[2835]: E0906 00:09:30.642725 2835 event.go:368] "Unable to 
write event (may retry after sleeping)" err="Post \"https://172.31.26.146:6443/api/v1/namespaces/default/events\": dial tcp 172.31.26.146:6443: connect: connection refused" event="&Event{ObjectMeta:{ip-172-31-26-146.186288f00d7fc571 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-172-31-26-146,UID:ip-172-31-26-146,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-172-31-26-146,},FirstTimestamp:2025-09-06 00:09:29.227920753 +0000 UTC m=+1.369162904,LastTimestamp:2025-09-06 00:09:29.227920753 +0000 UTC m=+1.369162904,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-172-31-26-146,}" Sep 6 00:09:30.656930 kubelet[2835]: E0906 00:09:30.656869 2835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.26.146:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-26-146?timeout=10s\": dial tcp 172.31.26.146:6443: connect: connection refused" interval="1.6s" Sep 6 00:09:30.733633 containerd[2004]: time="2025-09-06T00:09:30.733393552Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 6 00:09:30.734610 containerd[2004]: time="2025-09-06T00:09:30.733641784Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 6 00:09:30.734610 containerd[2004]: time="2025-09-06T00:09:30.733717936Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:09:30.734610 containerd[2004]: time="2025-09-06T00:09:30.734084764Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:09:30.744192 containerd[2004]: time="2025-09-06T00:09:30.743286172Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 6 00:09:30.746506 containerd[2004]: time="2025-09-06T00:09:30.746313400Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 6 00:09:30.746907 containerd[2004]: time="2025-09-06T00:09:30.746383432Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:09:30.747950 containerd[2004]: time="2025-09-06T00:09:30.747727108Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:09:30.751338 containerd[2004]: time="2025-09-06T00:09:30.743024548Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 6 00:09:30.751338 containerd[2004]: time="2025-09-06T00:09:30.751186600Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 6 00:09:30.751338 containerd[2004]: time="2025-09-06T00:09:30.751216984Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:09:30.752073 containerd[2004]: time="2025-09-06T00:09:30.751694860Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:09:30.787899 systemd[1]: Started cri-containerd-b76648feda4025f4d47493096f10a1bc9d34ce5b6c2c1b673f92e98c1d6d351c.scope - libcontainer container b76648feda4025f4d47493096f10a1bc9d34ce5b6c2c1b673f92e98c1d6d351c. 
Sep 6 00:09:30.806990 systemd[1]: Started cri-containerd-cabad326bb4216ef82da9132d395cf591b0cf6cfa27dcd27538e185b019fff00.scope - libcontainer container cabad326bb4216ef82da9132d395cf591b0cf6cfa27dcd27538e185b019fff00. Sep 6 00:09:30.822643 systemd[1]: Started cri-containerd-32936cbb319ad4e2061e52c18fc4b0bc4d8cba0e3566da8cc0d3454a3c07f436.scope - libcontainer container 32936cbb319ad4e2061e52c18fc4b0bc4d8cba0e3566da8cc0d3454a3c07f436. Sep 6 00:09:30.872248 kubelet[2835]: I0906 00:09:30.871733 2835 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-26-146" Sep 6 00:09:30.872861 kubelet[2835]: E0906 00:09:30.872748 2835 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://172.31.26.146:6443/api/v1/nodes\": dial tcp 172.31.26.146:6443: connect: connection refused" node="ip-172-31-26-146" Sep 6 00:09:30.919879 containerd[2004]: time="2025-09-06T00:09:30.919606277Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-26-146,Uid:8a5c9410e7c7140fb3f4768ba731efe1,Namespace:kube-system,Attempt:0,} returns sandbox id \"b76648feda4025f4d47493096f10a1bc9d34ce5b6c2c1b673f92e98c1d6d351c\"" Sep 6 00:09:30.932018 containerd[2004]: time="2025-09-06T00:09:30.931191905Z" level=info msg="CreateContainer within sandbox \"b76648feda4025f4d47493096f10a1bc9d34ce5b6c2c1b673f92e98c1d6d351c\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 6 00:09:30.956041 containerd[2004]: time="2025-09-06T00:09:30.955965653Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-26-146,Uid:ec6e412e6a92edfbc1fab4c3b9e2d8d8,Namespace:kube-system,Attempt:0,} returns sandbox id \"cabad326bb4216ef82da9132d395cf591b0cf6cfa27dcd27538e185b019fff00\"" Sep 6 00:09:30.961650 containerd[2004]: time="2025-09-06T00:09:30.961579613Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-26-146,Uid:e7ce45773dca75e5c299ea76d927be08,Namespace:kube-system,Attempt:0,} returns sandbox id \"32936cbb319ad4e2061e52c18fc4b0bc4d8cba0e3566da8cc0d3454a3c07f436\"" Sep 6 00:09:30.964666 containerd[2004]: time="2025-09-06T00:09:30.964502693Z" level=info msg="CreateContainer within sandbox \"cabad326bb4216ef82da9132d395cf591b0cf6cfa27dcd27538e185b019fff00\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 6 00:09:30.971401 containerd[2004]: time="2025-09-06T00:09:30.971044733Z" level=info msg="CreateContainer within sandbox \"b76648feda4025f4d47493096f10a1bc9d34ce5b6c2c1b673f92e98c1d6d351c\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"4e56507eea9465c7e7c97c86818bdffb56f759b5e2dc736fa42161f80d746ae2\"" Sep 6 00:09:30.982491 containerd[2004]: time="2025-09-06T00:09:30.982441613Z" level=info msg="StartContainer for \"4e56507eea9465c7e7c97c86818bdffb56f759b5e2dc736fa42161f80d746ae2\"" Sep 6 00:09:30.986208 containerd[2004]: time="2025-09-06T00:09:30.985767857Z" level=info msg="CreateContainer within sandbox \"32936cbb319ad4e2061e52c18fc4b0bc4d8cba0e3566da8cc0d3454a3c07f436\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 6 00:09:30.998764 containerd[2004]: time="2025-09-06T00:09:30.998692002Z" level=info msg="CreateContainer within sandbox \"cabad326bb4216ef82da9132d395cf591b0cf6cfa27dcd27538e185b019fff00\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"41447c17e0c91902d3a07e03efb4c0374a6a9b2c2f1a3b6bc2df6f760548c7ea\"" Sep 6 00:09:31.000056 containerd[2004]: time="2025-09-06T00:09:30.999994134Z" level=info msg="StartContainer for \"41447c17e0c91902d3a07e03efb4c0374a6a9b2c2f1a3b6bc2df6f760548c7ea\"" Sep 6 00:09:31.031797 containerd[2004]: time="2025-09-06T00:09:31.031595654Z" level=info msg="CreateContainer within sandbox \"32936cbb319ad4e2061e52c18fc4b0bc4d8cba0e3566da8cc0d3454a3c07f436\" for 
&ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"2403fe02943a750287f90ef75b89f1000e2077d55621bbfe678b80caa126bafc\"" Sep 6 00:09:31.034766 containerd[2004]: time="2025-09-06T00:09:31.034310042Z" level=info msg="StartContainer for \"2403fe02943a750287f90ef75b89f1000e2077d55621bbfe678b80caa126bafc\"" Sep 6 00:09:31.050484 systemd[1]: Started cri-containerd-4e56507eea9465c7e7c97c86818bdffb56f759b5e2dc736fa42161f80d746ae2.scope - libcontainer container 4e56507eea9465c7e7c97c86818bdffb56f759b5e2dc736fa42161f80d746ae2. Sep 6 00:09:31.083969 systemd[1]: Started cri-containerd-41447c17e0c91902d3a07e03efb4c0374a6a9b2c2f1a3b6bc2df6f760548c7ea.scope - libcontainer container 41447c17e0c91902d3a07e03efb4c0374a6a9b2c2f1a3b6bc2df6f760548c7ea. Sep 6 00:09:31.120450 systemd[1]: Started cri-containerd-2403fe02943a750287f90ef75b89f1000e2077d55621bbfe678b80caa126bafc.scope - libcontainer container 2403fe02943a750287f90ef75b89f1000e2077d55621bbfe678b80caa126bafc. Sep 6 00:09:31.173234 containerd[2004]: time="2025-09-06T00:09:31.173030402Z" level=info msg="StartContainer for \"4e56507eea9465c7e7c97c86818bdffb56f759b5e2dc736fa42161f80d746ae2\" returns successfully" Sep 6 00:09:31.231330 containerd[2004]: time="2025-09-06T00:09:31.231261303Z" level=info msg="StartContainer for \"2403fe02943a750287f90ef75b89f1000e2077d55621bbfe678b80caa126bafc\" returns successfully" Sep 6 00:09:31.255186 kubelet[2835]: E0906 00:09:31.254429 2835 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://172.31.26.146:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 172.31.26.146:6443: connect: connection refused" logger="UnhandledError" Sep 6 00:09:31.275649 containerd[2004]: time="2025-09-06T00:09:31.275442051Z" level=info msg="StartContainer for 
\"41447c17e0c91902d3a07e03efb4c0374a6a9b2c2f1a3b6bc2df6f760548c7ea\" returns successfully" Sep 6 00:09:31.310777 kubelet[2835]: E0906 00:09:31.310715 2835 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-26-146\" not found" node="ip-172-31-26-146" Sep 6 00:09:31.320597 kubelet[2835]: E0906 00:09:31.320541 2835 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-26-146\" not found" node="ip-172-31-26-146" Sep 6 00:09:31.343184 kubelet[2835]: E0906 00:09:31.341374 2835 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-26-146\" not found" node="ip-172-31-26-146" Sep 6 00:09:32.331921 kubelet[2835]: E0906 00:09:32.331866 2835 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-26-146\" not found" node="ip-172-31-26-146" Sep 6 00:09:32.332531 kubelet[2835]: E0906 00:09:32.332438 2835 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-26-146\" not found" node="ip-172-31-26-146" Sep 6 00:09:32.479169 kubelet[2835]: I0906 00:09:32.476808 2835 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-26-146" Sep 6 00:09:33.240947 kubelet[2835]: E0906 00:09:33.240897 2835 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-26-146\" not found" node="ip-172-31-26-146" Sep 6 00:09:33.335189 kubelet[2835]: E0906 00:09:33.333269 2835 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-26-146\" not found" node="ip-172-31-26-146" Sep 6 00:09:34.986468 kubelet[2835]: E0906 00:09:34.986206 2835 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" 
err="node \"ip-172-31-26-146\" not found" node="ip-172-31-26-146" Sep 6 00:09:35.655237 kubelet[2835]: E0906 00:09:35.655174 2835 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-172-31-26-146\" not found" node="ip-172-31-26-146" Sep 6 00:09:35.729566 kubelet[2835]: I0906 00:09:35.728423 2835 kubelet_node_status.go:78] "Successfully registered node" node="ip-172-31-26-146" Sep 6 00:09:35.751431 kubelet[2835]: I0906 00:09:35.751383 2835 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ip-172-31-26-146" Sep 6 00:09:35.823660 kubelet[2835]: E0906 00:09:35.823257 2835 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ip-172-31-26-146\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ip-172-31-26-146" Sep 6 00:09:35.823660 kubelet[2835]: I0906 00:09:35.823324 2835 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ip-172-31-26-146" Sep 6 00:09:35.833636 kubelet[2835]: E0906 00:09:35.833282 2835 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ip-172-31-26-146\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ip-172-31-26-146" Sep 6 00:09:35.833636 kubelet[2835]: I0906 00:09:35.833327 2835 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-26-146" Sep 6 00:09:35.841221 kubelet[2835]: E0906 00:09:35.841177 2835 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ip-172-31-26-146\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ip-172-31-26-146" Sep 6 00:09:36.223287 kubelet[2835]: I0906 00:09:36.222875 2835 apiserver.go:52] "Watching apiserver" Sep 6 00:09:36.256797 kubelet[2835]: I0906 00:09:36.256732 2835 
desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 6 00:09:37.832524 systemd[1]: Reloading requested from client PID 3115 ('systemctl') (unit session-7.scope)... Sep 6 00:09:37.832553 systemd[1]: Reloading... Sep 6 00:09:37.978265 zram_generator::config[3155]: No configuration found. Sep 6 00:09:38.238969 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 6 00:09:38.446249 systemd[1]: Reloading finished in 612 ms. Sep 6 00:09:38.522617 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 6 00:09:38.537976 systemd[1]: kubelet.service: Deactivated successfully. Sep 6 00:09:38.538568 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 6 00:09:38.538788 systemd[1]: kubelet.service: Consumed 1.928s CPU time, 130.9M memory peak, 0B memory swap peak. Sep 6 00:09:38.554002 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 6 00:09:38.882322 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 6 00:09:38.902742 (kubelet)[3215]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 6 00:09:38.909235 update_engine[1991]: I20250906 00:09:38.908210 1991 update_attempter.cc:509] Updating boot flags... Sep 6 00:09:39.081209 kubelet[3215]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 6 00:09:39.081209 kubelet[3215]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
Sep 6 00:09:39.081209 kubelet[3215]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 6 00:09:39.083175 kubelet[3215]: I0906 00:09:39.080360 3215 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 6 00:09:39.093189 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 37 scanned by (udev-worker) (3240) Sep 6 00:09:39.118132 kubelet[3215]: I0906 00:09:39.118063 3215 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 6 00:09:39.118132 kubelet[3215]: I0906 00:09:39.118120 3215 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 6 00:09:39.119783 kubelet[3215]: I0906 00:09:39.119713 3215 server.go:954] "Client rotation is on, will bootstrap in background" Sep 6 00:09:39.125452 kubelet[3215]: I0906 00:09:39.125385 3215 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 6 00:09:39.135659 kubelet[3215]: I0906 00:09:39.135068 3215 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 6 00:09:39.150828 kubelet[3215]: E0906 00:09:39.150760 3215 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Sep 6 00:09:39.150828 kubelet[3215]: I0906 00:09:39.150823 3215 server.go:1421] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Sep 6 00:09:39.159657 kubelet[3215]: I0906 00:09:39.159593 3215 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 6 00:09:39.160572 kubelet[3215]: I0906 00:09:39.160089 3215 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 6 00:09:39.160572 kubelet[3215]: I0906 00:09:39.160170 3215 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-26-146","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 6 00:09:39.160776 kubelet[3215]: I0906 00:09:39.160577 3215 topology_manager.go:138] "Creating topology manager with none 
policy" Sep 6 00:09:39.160776 kubelet[3215]: I0906 00:09:39.160598 3215 container_manager_linux.go:304] "Creating device plugin manager" Sep 6 00:09:39.160776 kubelet[3215]: I0906 00:09:39.160678 3215 state_mem.go:36] "Initialized new in-memory state store" Sep 6 00:09:39.162663 kubelet[3215]: I0906 00:09:39.160932 3215 kubelet.go:446] "Attempting to sync node with API server" Sep 6 00:09:39.162663 kubelet[3215]: I0906 00:09:39.161792 3215 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 6 00:09:39.162663 kubelet[3215]: I0906 00:09:39.161875 3215 kubelet.go:352] "Adding apiserver pod source" Sep 6 00:09:39.162663 kubelet[3215]: I0906 00:09:39.161897 3215 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 6 00:09:39.165203 kubelet[3215]: I0906 00:09:39.164636 3215 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Sep 6 00:09:39.173667 kubelet[3215]: I0906 00:09:39.173605 3215 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 6 00:09:39.187085 kubelet[3215]: I0906 00:09:39.186852 3215 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 6 00:09:39.187085 kubelet[3215]: I0906 00:09:39.186931 3215 server.go:1287] "Started kubelet" Sep 6 00:09:39.207226 kubelet[3215]: I0906 00:09:39.205323 3215 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 6 00:09:39.213441 kubelet[3215]: I0906 00:09:39.213358 3215 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 6 00:09:39.231207 kubelet[3215]: I0906 00:09:39.230618 3215 server.go:479] "Adding debug handlers to kubelet server" Sep 6 00:09:39.236267 kubelet[3215]: I0906 00:09:39.220971 3215 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 6 00:09:39.236392 kubelet[3215]: I0906 00:09:39.236277 3215 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 6 
00:09:39.237998 kubelet[3215]: I0906 00:09:39.236650 3215 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 6 00:09:39.237998 kubelet[3215]: E0906 00:09:39.221797 3215 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ip-172-31-26-146\" not found" Sep 6 00:09:39.237998 kubelet[3215]: I0906 00:09:39.216019 3215 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 6 00:09:39.237998 kubelet[3215]: I0906 00:09:39.220992 3215 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 6 00:09:39.237998 kubelet[3215]: I0906 00:09:39.237048 3215 reconciler.go:26] "Reconciler: start to sync state" Sep 6 00:09:39.291722 kubelet[3215]: I0906 00:09:39.289789 3215 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 6 00:09:39.352336 kubelet[3215]: E0906 00:09:39.351784 3215 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ip-172-31-26-146\" not found" Sep 6 00:09:39.355447 kubelet[3215]: I0906 00:09:39.355388 3215 factory.go:221] Registration of the containerd container factory successfully Sep 6 00:09:39.355447 kubelet[3215]: I0906 00:09:39.355431 3215 factory.go:221] Registration of the systemd container factory successfully Sep 6 00:09:39.382491 kubelet[3215]: I0906 00:09:39.382184 3215 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 6 00:09:39.405384 kubelet[3215]: I0906 00:09:39.402894 3215 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 6 00:09:39.408184 kubelet[3215]: I0906 00:09:39.407328 3215 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 6 00:09:39.408184 kubelet[3215]: I0906 00:09:39.407995 3215 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 6 00:09:39.408184 kubelet[3215]: I0906 00:09:39.408030 3215 kubelet.go:2382] "Starting kubelet main sync loop" Sep 6 00:09:39.410162 kubelet[3215]: E0906 00:09:39.408784 3215 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 6 00:09:39.516521 kubelet[3215]: E0906 00:09:39.516191 3215 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 6 00:09:39.716968 kubelet[3215]: E0906 00:09:39.716903 3215 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 6 00:09:39.812561 kubelet[3215]: I0906 00:09:39.812096 3215 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 6 00:09:39.812561 kubelet[3215]: I0906 00:09:39.812129 3215 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 6 00:09:39.814825 kubelet[3215]: I0906 00:09:39.813635 3215 state_mem.go:36] "Initialized new in-memory state store" Sep 6 00:09:39.815753 kubelet[3215]: I0906 00:09:39.815695 3215 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 6 00:09:39.815884 kubelet[3215]: I0906 00:09:39.815764 3215 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 6 00:09:39.815884 kubelet[3215]: I0906 00:09:39.815807 3215 policy_none.go:49] "None policy: Start" Sep 6 00:09:39.815987 kubelet[3215]: I0906 00:09:39.815827 3215 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 6 00:09:39.815987 kubelet[3215]: I0906 00:09:39.815973 3215 state_mem.go:35] "Initializing new in-memory state store" Sep 6 
00:09:39.819180 kubelet[3215]: I0906 00:09:39.818486 3215 state_mem.go:75] "Updated machine memory state" Sep 6 00:09:39.862753 kubelet[3215]: I0906 00:09:39.860005 3215 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 6 00:09:39.866645 kubelet[3215]: I0906 00:09:39.865397 3215 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 6 00:09:39.866645 kubelet[3215]: I0906 00:09:39.865451 3215 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 6 00:09:39.874367 kubelet[3215]: I0906 00:09:39.866134 3215 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 6 00:09:39.885399 kubelet[3215]: E0906 00:09:39.885328 3215 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 6 00:09:39.943314 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 37 scanned by (udev-worker) (3225) Sep 6 00:09:40.039287 kubelet[3215]: I0906 00:09:40.037995 3215 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-26-146" Sep 6 00:09:40.060080 kubelet[3215]: I0906 00:09:40.060021 3215 kubelet_node_status.go:124] "Node was previously registered" node="ip-172-31-26-146" Sep 6 00:09:40.061760 kubelet[3215]: I0906 00:09:40.060773 3215 kubelet_node_status.go:78] "Successfully registered node" node="ip-172-31-26-146" Sep 6 00:09:40.118573 kubelet[3215]: I0906 00:09:40.118517 3215 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ip-172-31-26-146" Sep 6 00:09:40.121707 kubelet[3215]: I0906 00:09:40.121653 3215 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ip-172-31-26-146" Sep 6 00:09:40.124536 kubelet[3215]: I0906 00:09:40.124455 3215 kubelet.go:3194] "Creating a mirror pod for static pod" 
pod="kube-system/kube-apiserver-ip-172-31-26-146"
Sep 6 00:09:40.157847 kubelet[3215]: I0906 00:09:40.156828 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8a5c9410e7c7140fb3f4768ba731efe1-ca-certs\") pod \"kube-apiserver-ip-172-31-26-146\" (UID: \"8a5c9410e7c7140fb3f4768ba731efe1\") " pod="kube-system/kube-apiserver-ip-172-31-26-146"
Sep 6 00:09:40.157847 kubelet[3215]: I0906 00:09:40.157008 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e7ce45773dca75e5c299ea76d927be08-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-26-146\" (UID: \"e7ce45773dca75e5c299ea76d927be08\") " pod="kube-system/kube-controller-manager-ip-172-31-26-146"
Sep 6 00:09:40.158399 kubelet[3215]: I0906 00:09:40.158279 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8a5c9410e7c7140fb3f4768ba731efe1-k8s-certs\") pod \"kube-apiserver-ip-172-31-26-146\" (UID: \"8a5c9410e7c7140fb3f4768ba731efe1\") " pod="kube-system/kube-apiserver-ip-172-31-26-146"
Sep 6 00:09:40.159228 kubelet[3215]: I0906 00:09:40.158418 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8a5c9410e7c7140fb3f4768ba731efe1-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-26-146\" (UID: \"8a5c9410e7c7140fb3f4768ba731efe1\") " pod="kube-system/kube-apiserver-ip-172-31-26-146"
Sep 6 00:09:40.159228 kubelet[3215]: I0906 00:09:40.158465 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e7ce45773dca75e5c299ea76d927be08-ca-certs\") pod \"kube-controller-manager-ip-172-31-26-146\" (UID: \"e7ce45773dca75e5c299ea76d927be08\") " pod="kube-system/kube-controller-manager-ip-172-31-26-146"
Sep 6 00:09:40.159228 kubelet[3215]: I0906 00:09:40.158500 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e7ce45773dca75e5c299ea76d927be08-k8s-certs\") pod \"kube-controller-manager-ip-172-31-26-146\" (UID: \"e7ce45773dca75e5c299ea76d927be08\") " pod="kube-system/kube-controller-manager-ip-172-31-26-146"
Sep 6 00:09:40.159228 kubelet[3215]: I0906 00:09:40.158537 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e7ce45773dca75e5c299ea76d927be08-kubeconfig\") pod \"kube-controller-manager-ip-172-31-26-146\" (UID: \"e7ce45773dca75e5c299ea76d927be08\") " pod="kube-system/kube-controller-manager-ip-172-31-26-146"
Sep 6 00:09:40.159228 kubelet[3215]: I0906 00:09:40.158578 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e7ce45773dca75e5c299ea76d927be08-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-26-146\" (UID: \"e7ce45773dca75e5c299ea76d927be08\") " pod="kube-system/kube-controller-manager-ip-172-31-26-146"
Sep 6 00:09:40.159567 kubelet[3215]: I0906 00:09:40.158621 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ec6e412e6a92edfbc1fab4c3b9e2d8d8-kubeconfig\") pod \"kube-scheduler-ip-172-31-26-146\" (UID: \"ec6e412e6a92edfbc1fab4c3b9e2d8d8\") " pod="kube-system/kube-scheduler-ip-172-31-26-146"
Sep 6 00:09:40.165249 kubelet[3215]: I0906 00:09:40.163814 3215 apiserver.go:52] "Watching apiserver"
Sep 6 00:09:40.238893 kubelet[3215]: I0906 00:09:40.238833 3215 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Sep 6 00:09:40.515537 kubelet[3215]: I0906 00:09:40.515308 3215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ip-172-31-26-146" podStartSLOduration=0.515283445 podStartE2EDuration="515.283445ms" podCreationTimestamp="2025-09-06 00:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-06 00:09:40.497187301 +0000 UTC m=+1.581229989" watchObservedRunningTime="2025-09-06 00:09:40.515283445 +0000 UTC m=+1.599326109"
Sep 6 00:09:40.532177 kubelet[3215]: I0906 00:09:40.530936 3215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ip-172-31-26-146" podStartSLOduration=0.530916013 podStartE2EDuration="530.916013ms" podCreationTimestamp="2025-09-06 00:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-06 00:09:40.516791389 +0000 UTC m=+1.600834089" watchObservedRunningTime="2025-09-06 00:09:40.530916013 +0000 UTC m=+1.614958689"
Sep 6 00:09:40.571936 kubelet[3215]: I0906 00:09:40.571816 3215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ip-172-31-26-146" podStartSLOduration=0.571770109 podStartE2EDuration="571.770109ms" podCreationTimestamp="2025-09-06 00:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-06 00:09:40.532546021 +0000 UTC m=+1.616588685" watchObservedRunningTime="2025-09-06 00:09:40.571770109 +0000 UTC m=+1.655812785"
Sep 6 00:09:44.105320 kubelet[3215]: I0906 00:09:44.105260 3215 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Sep 6 00:09:44.105931 containerd[2004]: time="2025-09-06T00:09:44.105786303Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Sep 6 00:09:44.106436 kubelet[3215]: I0906 00:09:44.106357 3215 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Sep 6 00:09:45.063567 systemd[1]: Created slice kubepods-besteffort-pod407db96b_4e6e_4bf5_a365_e2cf61cef3b7.slice - libcontainer container kubepods-besteffort-pod407db96b_4e6e_4bf5_a365_e2cf61cef3b7.slice.
Sep 6 00:09:45.096725 kubelet[3215]: I0906 00:09:45.096628 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/407db96b-4e6e-4bf5-a365-e2cf61cef3b7-xtables-lock\") pod \"kube-proxy-qwrdp\" (UID: \"407db96b-4e6e-4bf5-a365-e2cf61cef3b7\") " pod="kube-system/kube-proxy-qwrdp"
Sep 6 00:09:45.097040 kubelet[3215]: I0906 00:09:45.096766 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/407db96b-4e6e-4bf5-a365-e2cf61cef3b7-kube-proxy\") pod \"kube-proxy-qwrdp\" (UID: \"407db96b-4e6e-4bf5-a365-e2cf61cef3b7\") " pod="kube-system/kube-proxy-qwrdp"
Sep 6 00:09:45.097040 kubelet[3215]: I0906 00:09:45.096820 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/407db96b-4e6e-4bf5-a365-e2cf61cef3b7-lib-modules\") pod \"kube-proxy-qwrdp\" (UID: \"407db96b-4e6e-4bf5-a365-e2cf61cef3b7\") " pod="kube-system/kube-proxy-qwrdp"
Sep 6 00:09:45.097040 kubelet[3215]: I0906 00:09:45.096865 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfdns\" (UniqueName: \"kubernetes.io/projected/407db96b-4e6e-4bf5-a365-e2cf61cef3b7-kube-api-access-tfdns\") pod \"kube-proxy-qwrdp\" (UID: \"407db96b-4e6e-4bf5-a365-e2cf61cef3b7\") " pod="kube-system/kube-proxy-qwrdp"
Sep 6 00:09:45.200441 systemd[1]: Created slice kubepods-besteffort-pod6760618e_ab62_4b6e_93ea_2e4c636d64b7.slice - libcontainer container kubepods-besteffort-pod6760618e_ab62_4b6e_93ea_2e4c636d64b7.slice.
Sep 6 00:09:45.298269 kubelet[3215]: I0906 00:09:45.298221 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q2lh\" (UniqueName: \"kubernetes.io/projected/6760618e-ab62-4b6e-93ea-2e4c636d64b7-kube-api-access-7q2lh\") pod \"tigera-operator-755d956888-sqtsm\" (UID: \"6760618e-ab62-4b6e-93ea-2e4c636d64b7\") " pod="tigera-operator/tigera-operator-755d956888-sqtsm"
Sep 6 00:09:45.299013 kubelet[3215]: I0906 00:09:45.298961 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/6760618e-ab62-4b6e-93ea-2e4c636d64b7-var-lib-calico\") pod \"tigera-operator-755d956888-sqtsm\" (UID: \"6760618e-ab62-4b6e-93ea-2e4c636d64b7\") " pod="tigera-operator/tigera-operator-755d956888-sqtsm"
Sep 6 00:09:45.377678 containerd[2004]: time="2025-09-06T00:09:45.377539997Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qwrdp,Uid:407db96b-4e6e-4bf5-a365-e2cf61cef3b7,Namespace:kube-system,Attempt:0,}"
Sep 6 00:09:45.448985 containerd[2004]: time="2025-09-06T00:09:45.448079933Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 6 00:09:45.448985 containerd[2004]: time="2025-09-06T00:09:45.448271069Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 6 00:09:45.448985 containerd[2004]: time="2025-09-06T00:09:45.448310525Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 6 00:09:45.448985 containerd[2004]: time="2025-09-06T00:09:45.448488425Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 6 00:09:45.506480 systemd[1]: Started cri-containerd-3d2ecb6a9029e875357b8470e65b107fbb7255dab3a287d56615feef099fd904.scope - libcontainer container 3d2ecb6a9029e875357b8470e65b107fbb7255dab3a287d56615feef099fd904.
Sep 6 00:09:45.511938 containerd[2004]: time="2025-09-06T00:09:45.511887498Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-sqtsm,Uid:6760618e-ab62-4b6e-93ea-2e4c636d64b7,Namespace:tigera-operator,Attempt:0,}"
Sep 6 00:09:45.568369 containerd[2004]: time="2025-09-06T00:09:45.568218234Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qwrdp,Uid:407db96b-4e6e-4bf5-a365-e2cf61cef3b7,Namespace:kube-system,Attempt:0,} returns sandbox id \"3d2ecb6a9029e875357b8470e65b107fbb7255dab3a287d56615feef099fd904\""
Sep 6 00:09:45.582527 containerd[2004]: time="2025-09-06T00:09:45.582464838Z" level=info msg="CreateContainer within sandbox \"3d2ecb6a9029e875357b8470e65b107fbb7255dab3a287d56615feef099fd904\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 6 00:09:45.587452 containerd[2004]: time="2025-09-06T00:09:45.586850766Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 6 00:09:45.587452 containerd[2004]: time="2025-09-06T00:09:45.586943778Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 6 00:09:45.587452 containerd[2004]: time="2025-09-06T00:09:45.586987230Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 6 00:09:45.587452 containerd[2004]: time="2025-09-06T00:09:45.587180778Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 6 00:09:45.642517 systemd[1]: Started cri-containerd-07e9e9e10e7edf5ded3d7d65c0de47fba40a8fcf899df1f5b0da1ae9993b9f1f.scope - libcontainer container 07e9e9e10e7edf5ded3d7d65c0de47fba40a8fcf899df1f5b0da1ae9993b9f1f.
Sep 6 00:09:45.648162 containerd[2004]: time="2025-09-06T00:09:45.648085806Z" level=info msg="CreateContainer within sandbox \"3d2ecb6a9029e875357b8470e65b107fbb7255dab3a287d56615feef099fd904\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"323afe6c9324384072cb492eead82c27d65caf1d60570675c7e7c94d14039fcc\""
Sep 6 00:09:45.649931 containerd[2004]: time="2025-09-06T00:09:45.649882014Z" level=info msg="StartContainer for \"323afe6c9324384072cb492eead82c27d65caf1d60570675c7e7c94d14039fcc\""
Sep 6 00:09:45.711425 systemd[1]: Started cri-containerd-323afe6c9324384072cb492eead82c27d65caf1d60570675c7e7c94d14039fcc.scope - libcontainer container 323afe6c9324384072cb492eead82c27d65caf1d60570675c7e7c94d14039fcc.
Sep 6 00:09:45.748412 containerd[2004]: time="2025-09-06T00:09:45.748361371Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-sqtsm,Uid:6760618e-ab62-4b6e-93ea-2e4c636d64b7,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"07e9e9e10e7edf5ded3d7d65c0de47fba40a8fcf899df1f5b0da1ae9993b9f1f\""
Sep 6 00:09:45.752922 containerd[2004]: time="2025-09-06T00:09:45.752621875Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 6 00:09:45.791293 containerd[2004]: time="2025-09-06T00:09:45.790865647Z" level=info msg="StartContainer for \"323afe6c9324384072cb492eead82c27d65caf1d60570675c7e7c94d14039fcc\" returns successfully"
Sep 6 00:09:47.167988 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4275294841.mount: Deactivated successfully.
Sep 6 00:09:47.947881 containerd[2004]: time="2025-09-06T00:09:47.947811406Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:09:47.949846 containerd[2004]: time="2025-09-06T00:09:47.949636858Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365"
Sep 6 00:09:47.950838 containerd[2004]: time="2025-09-06T00:09:47.950776450Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:09:47.956091 containerd[2004]: time="2025-09-06T00:09:47.955982062Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:09:47.958022 containerd[2004]: time="2025-09-06T00:09:47.957959962Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 2.205275327s"
Sep 6 00:09:47.958365 containerd[2004]: time="2025-09-06T00:09:47.958205962Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\""
Sep 6 00:09:47.963338 containerd[2004]: time="2025-09-06T00:09:47.963215422Z" level=info msg="CreateContainer within sandbox \"07e9e9e10e7edf5ded3d7d65c0de47fba40a8fcf899df1f5b0da1ae9993b9f1f\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 6 00:09:47.983703 containerd[2004]: time="2025-09-06T00:09:47.983457538Z" level=info msg="CreateContainer within sandbox \"07e9e9e10e7edf5ded3d7d65c0de47fba40a8fcf899df1f5b0da1ae9993b9f1f\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"2c2d00f596b4a80b18413174032d23e7dcf3656e9bfd0fe9fe1b07b7a5063794\""
Sep 6 00:09:47.984321 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3036823875.mount: Deactivated successfully.
Sep 6 00:09:47.986195 containerd[2004]: time="2025-09-06T00:09:47.984894046Z" level=info msg="StartContainer for \"2c2d00f596b4a80b18413174032d23e7dcf3656e9bfd0fe9fe1b07b7a5063794\""
Sep 6 00:09:48.042678 systemd[1]: run-containerd-runc-k8s.io-2c2d00f596b4a80b18413174032d23e7dcf3656e9bfd0fe9fe1b07b7a5063794-runc.zx0fuA.mount: Deactivated successfully.
Sep 6 00:09:48.053499 systemd[1]: Started cri-containerd-2c2d00f596b4a80b18413174032d23e7dcf3656e9bfd0fe9fe1b07b7a5063794.scope - libcontainer container 2c2d00f596b4a80b18413174032d23e7dcf3656e9bfd0fe9fe1b07b7a5063794.
Sep 6 00:09:48.098271 containerd[2004]: time="2025-09-06T00:09:48.097716402Z" level=info msg="StartContainer for \"2c2d00f596b4a80b18413174032d23e7dcf3656e9bfd0fe9fe1b07b7a5063794\" returns successfully"
Sep 6 00:09:48.623245 kubelet[3215]: I0906 00:09:48.623106 3215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-qwrdp" podStartSLOduration=3.6230563890000003 podStartE2EDuration="3.623056389s" podCreationTimestamp="2025-09-06 00:09:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-06 00:09:46.617310583 +0000 UTC m=+7.701353295" watchObservedRunningTime="2025-09-06 00:09:48.623056389 +0000 UTC m=+9.707099065"
Sep 6 00:09:48.623829 kubelet[3215]: I0906 00:09:48.623355 3215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-sqtsm" podStartSLOduration=1.41541633 podStartE2EDuration="3.623345061s" podCreationTimestamp="2025-09-06 00:09:45 +0000 UTC" firstStartedPulling="2025-09-06 00:09:45.751898635 +0000 UTC m=+6.835941311" lastFinishedPulling="2025-09-06 00:09:47.959827378 +0000 UTC m=+9.043870042" observedRunningTime="2025-09-06 00:09:48.621848577 +0000 UTC m=+9.705891325" watchObservedRunningTime="2025-09-06 00:09:48.623345061 +0000 UTC m=+9.707387737"
Sep 6 00:09:52.834590 systemd[1]: cri-containerd-2c2d00f596b4a80b18413174032d23e7dcf3656e9bfd0fe9fe1b07b7a5063794.scope: Deactivated successfully.
Sep 6 00:09:52.883787 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2c2d00f596b4a80b18413174032d23e7dcf3656e9bfd0fe9fe1b07b7a5063794-rootfs.mount: Deactivated successfully.
Sep 6 00:09:53.043171 containerd[2004]: time="2025-09-06T00:09:53.043068407Z" level=info msg="shim disconnected" id=2c2d00f596b4a80b18413174032d23e7dcf3656e9bfd0fe9fe1b07b7a5063794 namespace=k8s.io
Sep 6 00:09:53.043171 containerd[2004]: time="2025-09-06T00:09:53.043168475Z" level=warning msg="cleaning up after shim disconnected" id=2c2d00f596b4a80b18413174032d23e7dcf3656e9bfd0fe9fe1b07b7a5063794 namespace=k8s.io
Sep 6 00:09:53.043972 containerd[2004]: time="2025-09-06T00:09:53.043191839Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 6 00:09:53.635287 kubelet[3215]: I0906 00:09:53.635229 3215 scope.go:117] "RemoveContainer" containerID="2c2d00f596b4a80b18413174032d23e7dcf3656e9bfd0fe9fe1b07b7a5063794"
Sep 6 00:09:53.643996 containerd[2004]: time="2025-09-06T00:09:53.643908662Z" level=info msg="CreateContainer within sandbox \"07e9e9e10e7edf5ded3d7d65c0de47fba40a8fcf899df1f5b0da1ae9993b9f1f\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Sep 6 00:09:53.687877 containerd[2004]: time="2025-09-06T00:09:53.687719126Z" level=info msg="CreateContainer within sandbox \"07e9e9e10e7edf5ded3d7d65c0de47fba40a8fcf899df1f5b0da1ae9993b9f1f\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"103f699aa9e6cd9701dfea9fe949818745ab9cc5ff8045de4fbce51339bc8b53\""
Sep 6 00:09:53.689117 containerd[2004]: time="2025-09-06T00:09:53.689067686Z" level=info msg="StartContainer for \"103f699aa9e6cd9701dfea9fe949818745ab9cc5ff8045de4fbce51339bc8b53\""
Sep 6 00:09:53.769506 systemd[1]: Started cri-containerd-103f699aa9e6cd9701dfea9fe949818745ab9cc5ff8045de4fbce51339bc8b53.scope - libcontainer container 103f699aa9e6cd9701dfea9fe949818745ab9cc5ff8045de4fbce51339bc8b53.
Sep 6 00:09:53.964640 containerd[2004]: time="2025-09-06T00:09:53.964346644Z" level=info msg="StartContainer for \"103f699aa9e6cd9701dfea9fe949818745ab9cc5ff8045de4fbce51339bc8b53\" returns successfully"
Sep 6 00:09:56.947757 sudo[2333]: pam_unix(sudo:session): session closed for user root
Sep 6 00:09:56.972511 sshd[2330]: pam_unix(sshd:session): session closed for user core
Sep 6 00:09:56.979826 systemd[1]: sshd@6-172.31.26.146:22-139.178.68.195:38138.service: Deactivated successfully.
Sep 6 00:09:56.985123 systemd[1]: session-7.scope: Deactivated successfully.
Sep 6 00:09:56.985554 systemd[1]: session-7.scope: Consumed 10.531s CPU time, 150.8M memory peak, 0B memory swap peak.
Sep 6 00:09:56.986963 systemd-logind[1990]: Session 7 logged out. Waiting for processes to exit.
Sep 6 00:09:56.989892 systemd-logind[1990]: Removed session 7.
Sep 6 00:10:10.895848 systemd[1]: Created slice kubepods-besteffort-pod38baa2eb_6b47_4b8a_860c_559bbd4f6a3c.slice - libcontainer container kubepods-besteffort-pod38baa2eb_6b47_4b8a_860c_559bbd4f6a3c.slice.
Sep 6 00:10:10.966869 kubelet[3215]: I0906 00:10:10.966595 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38baa2eb-6b47-4b8a-860c-559bbd4f6a3c-tigera-ca-bundle\") pod \"calico-typha-f8887c4f5-n6rbx\" (UID: \"38baa2eb-6b47-4b8a-860c-559bbd4f6a3c\") " pod="calico-system/calico-typha-f8887c4f5-n6rbx"
Sep 6 00:10:10.966869 kubelet[3215]: I0906 00:10:10.966670 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn5j6\" (UniqueName: \"kubernetes.io/projected/38baa2eb-6b47-4b8a-860c-559bbd4f6a3c-kube-api-access-xn5j6\") pod \"calico-typha-f8887c4f5-n6rbx\" (UID: \"38baa2eb-6b47-4b8a-860c-559bbd4f6a3c\") " pod="calico-system/calico-typha-f8887c4f5-n6rbx"
Sep 6 00:10:10.966869 kubelet[3215]: I0906 00:10:10.966720 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/38baa2eb-6b47-4b8a-860c-559bbd4f6a3c-typha-certs\") pod \"calico-typha-f8887c4f5-n6rbx\" (UID: \"38baa2eb-6b47-4b8a-860c-559bbd4f6a3c\") " pod="calico-system/calico-typha-f8887c4f5-n6rbx"
Sep 6 00:10:11.208050 containerd[2004]: time="2025-09-06T00:10:11.207461861Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-f8887c4f5-n6rbx,Uid:38baa2eb-6b47-4b8a-860c-559bbd4f6a3c,Namespace:calico-system,Attempt:0,}"
Sep 6 00:10:11.286321 containerd[2004]: time="2025-09-06T00:10:11.279203466Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 6 00:10:11.286321 containerd[2004]: time="2025-09-06T00:10:11.285113778Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 6 00:10:11.286321 containerd[2004]: time="2025-09-06T00:10:11.285175482Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 6 00:10:11.291560 containerd[2004]: time="2025-09-06T00:10:11.289326222Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 6 00:10:11.302030 systemd[1]: Created slice kubepods-besteffort-pod4691c9f4_bb0a_4d23_8115_8bd51f5a8f86.slice - libcontainer container kubepods-besteffort-pod4691c9f4_bb0a_4d23_8115_8bd51f5a8f86.slice.
Sep 6 00:10:11.371278 kubelet[3215]: I0906 00:10:11.370532 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q97lt\" (UniqueName: \"kubernetes.io/projected/4691c9f4-bb0a-4d23-8115-8bd51f5a8f86-kube-api-access-q97lt\") pod \"calico-node-gt6tt\" (UID: \"4691c9f4-bb0a-4d23-8115-8bd51f5a8f86\") " pod="calico-system/calico-node-gt6tt"
Sep 6 00:10:11.372134 systemd[1]: Started cri-containerd-f2dc6c2f5b0dd0116eec7f2ea85e6d8a174979e68cf3cbbfb4a2eb5bb10d9946.scope - libcontainer container f2dc6c2f5b0dd0116eec7f2ea85e6d8a174979e68cf3cbbfb4a2eb5bb10d9946.
Sep 6 00:10:11.375854 kubelet[3215]: I0906 00:10:11.374568 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/4691c9f4-bb0a-4d23-8115-8bd51f5a8f86-cni-bin-dir\") pod \"calico-node-gt6tt\" (UID: \"4691c9f4-bb0a-4d23-8115-8bd51f5a8f86\") " pod="calico-system/calico-node-gt6tt"
Sep 6 00:10:11.378199 kubelet[3215]: I0906 00:10:11.377439 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/4691c9f4-bb0a-4d23-8115-8bd51f5a8f86-cni-log-dir\") pod \"calico-node-gt6tt\" (UID: \"4691c9f4-bb0a-4d23-8115-8bd51f5a8f86\") " pod="calico-system/calico-node-gt6tt"
Sep 6 00:10:11.378199 kubelet[3215]: I0906 00:10:11.377623 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/4691c9f4-bb0a-4d23-8115-8bd51f5a8f86-cni-net-dir\") pod \"calico-node-gt6tt\" (UID: \"4691c9f4-bb0a-4d23-8115-8bd51f5a8f86\") " pod="calico-system/calico-node-gt6tt"
Sep 6 00:10:11.378199 kubelet[3215]: I0906 00:10:11.377683 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/4691c9f4-bb0a-4d23-8115-8bd51f5a8f86-flexvol-driver-host\") pod \"calico-node-gt6tt\" (UID: \"4691c9f4-bb0a-4d23-8115-8bd51f5a8f86\") " pod="calico-system/calico-node-gt6tt"
Sep 6 00:10:11.378199 kubelet[3215]: I0906 00:10:11.377725 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4691c9f4-bb0a-4d23-8115-8bd51f5a8f86-lib-modules\") pod \"calico-node-gt6tt\" (UID: \"4691c9f4-bb0a-4d23-8115-8bd51f5a8f86\") " pod="calico-system/calico-node-gt6tt"
Sep 6 00:10:11.378199 kubelet[3215]: I0906 00:10:11.377765 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/4691c9f4-bb0a-4d23-8115-8bd51f5a8f86-node-certs\") pod \"calico-node-gt6tt\" (UID: \"4691c9f4-bb0a-4d23-8115-8bd51f5a8f86\") " pod="calico-system/calico-node-gt6tt"
Sep 6 00:10:11.380857 kubelet[3215]: I0906 00:10:11.377812 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4691c9f4-bb0a-4d23-8115-8bd51f5a8f86-tigera-ca-bundle\") pod \"calico-node-gt6tt\" (UID: \"4691c9f4-bb0a-4d23-8115-8bd51f5a8f86\") " pod="calico-system/calico-node-gt6tt"
Sep 6 00:10:11.380857 kubelet[3215]: I0906 00:10:11.380472 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/4691c9f4-bb0a-4d23-8115-8bd51f5a8f86-var-lib-calico\") pod \"calico-node-gt6tt\" (UID: \"4691c9f4-bb0a-4d23-8115-8bd51f5a8f86\") " pod="calico-system/calico-node-gt6tt"
Sep 6 00:10:11.380857 kubelet[3215]: I0906 00:10:11.380529 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/4691c9f4-bb0a-4d23-8115-8bd51f5a8f86-var-run-calico\") pod \"calico-node-gt6tt\" (UID: \"4691c9f4-bb0a-4d23-8115-8bd51f5a8f86\") " pod="calico-system/calico-node-gt6tt"
Sep 6 00:10:11.380857 kubelet[3215]: I0906 00:10:11.380564 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/4691c9f4-bb0a-4d23-8115-8bd51f5a8f86-xtables-lock\") pod \"calico-node-gt6tt\" (UID: \"4691c9f4-bb0a-4d23-8115-8bd51f5a8f86\") " pod="calico-system/calico-node-gt6tt"
Sep 6 00:10:11.380857 kubelet[3215]: I0906 00:10:11.380610 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/4691c9f4-bb0a-4d23-8115-8bd51f5a8f86-policysync\") pod \"calico-node-gt6tt\" (UID: \"4691c9f4-bb0a-4d23-8115-8bd51f5a8f86\") " pod="calico-system/calico-node-gt6tt"
Sep 6 00:10:11.503383 kubelet[3215]: E0906 00:10:11.503327 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.505632 kubelet[3215]: W0906 00:10:11.503370 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.505632 kubelet[3215]: E0906 00:10:11.505283 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.516185 kubelet[3215]: E0906 00:10:11.514260 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.516185 kubelet[3215]: W0906 00:10:11.514302 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.516185 kubelet[3215]: E0906 00:10:11.514335 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.523515 kubelet[3215]: E0906 00:10:11.523353 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.525874 kubelet[3215]: W0906 00:10:11.525829 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.526372 kubelet[3215]: E0906 00:10:11.526249 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.560795 containerd[2004]: time="2025-09-06T00:10:11.560431819Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-f8887c4f5-n6rbx,Uid:38baa2eb-6b47-4b8a-860c-559bbd4f6a3c,Namespace:calico-system,Attempt:0,} returns sandbox id \"f2dc6c2f5b0dd0116eec7f2ea85e6d8a174979e68cf3cbbfb4a2eb5bb10d9946\""
Sep 6 00:10:11.566811 containerd[2004]: time="2025-09-06T00:10:11.566731999Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\""
Sep 6 00:10:11.612393 kubelet[3215]: E0906 00:10:11.612317 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-47vdk" podUID="44f64470-7851-42aa-8ed3-eb71e8151f7c"
Sep 6 00:10:11.619121 kubelet[3215]: E0906 00:10:11.618705 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.619121 kubelet[3215]: W0906 00:10:11.618742 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.619121 kubelet[3215]: E0906 00:10:11.618777 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.620225 kubelet[3215]: E0906 00:10:11.620189 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.621211 kubelet[3215]: W0906 00:10:11.620414 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.621211 kubelet[3215]: E0906 00:10:11.620750 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.622521 kubelet[3215]: E0906 00:10:11.622230 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.622521 kubelet[3215]: W0906 00:10:11.622265 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.622521 kubelet[3215]: E0906 00:10:11.622313 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.624948 kubelet[3215]: E0906 00:10:11.623763 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.624948 kubelet[3215]: W0906 00:10:11.623799 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.624948 kubelet[3215]: E0906 00:10:11.623831 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.626199 kubelet[3215]: E0906 00:10:11.625775 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.626199 kubelet[3215]: W0906 00:10:11.625809 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.626199 kubelet[3215]: E0906 00:10:11.625855 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.626878 kubelet[3215]: E0906 00:10:11.626635 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.626878 kubelet[3215]: W0906 00:10:11.626666 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.626878 kubelet[3215]: E0906 00:10:11.626694 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.628396 kubelet[3215]: E0906 00:10:11.628117 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.628396 kubelet[3215]: W0906 00:10:11.628196 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.628396 kubelet[3215]: E0906 00:10:11.628229 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.629581 kubelet[3215]: E0906 00:10:11.629077 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.629581 kubelet[3215]: W0906 00:10:11.629109 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.629581 kubelet[3215]: E0906 00:10:11.629199 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.630953 kubelet[3215]: E0906 00:10:11.630534 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.630953 kubelet[3215]: W0906 00:10:11.630567 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.630953 kubelet[3215]: E0906 00:10:11.630599 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.632506 kubelet[3215]: E0906 00:10:11.632262 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.632506 kubelet[3215]: W0906 00:10:11.632295 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.632506 kubelet[3215]: E0906 00:10:11.632324 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.632837 kubelet[3215]: E0906 00:10:11.632813 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.632961 kubelet[3215]: W0906 00:10:11.632935 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.633076 kubelet[3215]: E0906 00:10:11.633051 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Sep 6 00:10:11.635216 kubelet[3215]: E0906 00:10:11.635097 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.635216 kubelet[3215]: W0906 00:10:11.635132 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.635654 kubelet[3215]: E0906 00:10:11.635438 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 00:10:11.635997 kubelet[3215]: E0906 00:10:11.635972 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.636419 kubelet[3215]: W0906 00:10:11.636211 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.636419 kubelet[3215]: E0906 00:10:11.636248 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 00:10:11.638417 kubelet[3215]: E0906 00:10:11.637981 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.638417 kubelet[3215]: W0906 00:10:11.638013 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.638417 kubelet[3215]: E0906 00:10:11.638044 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 00:10:11.639350 kubelet[3215]: E0906 00:10:11.639319 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.639537 kubelet[3215]: W0906 00:10:11.639507 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.639653 kubelet[3215]: E0906 00:10:11.639630 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 00:10:11.642034 kubelet[3215]: E0906 00:10:11.640309 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.642034 kubelet[3215]: W0906 00:10:11.640340 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.642034 kubelet[3215]: E0906 00:10:11.640386 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 00:10:11.642330 kubelet[3215]: E0906 00:10:11.642042 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.642330 kubelet[3215]: W0906 00:10:11.642072 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.642330 kubelet[3215]: E0906 00:10:11.642107 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 00:10:11.643209 containerd[2004]: time="2025-09-06T00:10:11.642818827Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-gt6tt,Uid:4691c9f4-bb0a-4d23-8115-8bd51f5a8f86,Namespace:calico-system,Attempt:0,}" Sep 6 00:10:11.644878 kubelet[3215]: E0906 00:10:11.644827 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.644878 kubelet[3215]: W0906 00:10:11.644865 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.645693 kubelet[3215]: E0906 00:10:11.644898 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 00:10:11.647424 kubelet[3215]: E0906 00:10:11.646759 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.647424 kubelet[3215]: W0906 00:10:11.647004 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.647424 kubelet[3215]: E0906 00:10:11.647039 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 00:10:11.650363 kubelet[3215]: E0906 00:10:11.648336 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.650363 kubelet[3215]: W0906 00:10:11.648375 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.650363 kubelet[3215]: E0906 00:10:11.648406 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 00:10:11.685171 kubelet[3215]: E0906 00:10:11.684107 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.685171 kubelet[3215]: W0906 00:10:11.684163 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.685171 kubelet[3215]: E0906 00:10:11.684199 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 00:10:11.685171 kubelet[3215]: I0906 00:10:11.684255 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/44f64470-7851-42aa-8ed3-eb71e8151f7c-registration-dir\") pod \"csi-node-driver-47vdk\" (UID: \"44f64470-7851-42aa-8ed3-eb71e8151f7c\") " pod="calico-system/csi-node-driver-47vdk" Sep 6 00:10:11.686584 kubelet[3215]: E0906 00:10:11.686529 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.687298 kubelet[3215]: W0906 00:10:11.687253 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.688202 kubelet[3215]: E0906 00:10:11.687493 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 00:10:11.688202 kubelet[3215]: I0906 00:10:11.687546 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znfqm\" (UniqueName: \"kubernetes.io/projected/44f64470-7851-42aa-8ed3-eb71e8151f7c-kube-api-access-znfqm\") pod \"csi-node-driver-47vdk\" (UID: \"44f64470-7851-42aa-8ed3-eb71e8151f7c\") " pod="calico-system/csi-node-driver-47vdk" Sep 6 00:10:11.688532 kubelet[3215]: E0906 00:10:11.688490 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.688612 kubelet[3215]: W0906 00:10:11.688537 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.688612 kubelet[3215]: E0906 00:10:11.688581 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 00:10:11.691603 kubelet[3215]: E0906 00:10:11.691551 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.691603 kubelet[3215]: W0906 00:10:11.691594 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.691911 kubelet[3215]: E0906 00:10:11.691851 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 00:10:11.693857 kubelet[3215]: E0906 00:10:11.693741 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.693857 kubelet[3215]: W0906 00:10:11.693775 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.694048 kubelet[3215]: E0906 00:10:11.693898 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 00:10:11.694048 kubelet[3215]: I0906 00:10:11.693947 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44f64470-7851-42aa-8ed3-eb71e8151f7c-kubelet-dir\") pod \"csi-node-driver-47vdk\" (UID: \"44f64470-7851-42aa-8ed3-eb71e8151f7c\") " pod="calico-system/csi-node-driver-47vdk" Sep 6 00:10:11.696823 kubelet[3215]: E0906 00:10:11.696336 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.696823 kubelet[3215]: W0906 00:10:11.696376 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.696823 kubelet[3215]: E0906 00:10:11.696424 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 00:10:11.698713 kubelet[3215]: E0906 00:10:11.698367 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.698713 kubelet[3215]: W0906 00:10:11.698409 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.699746 kubelet[3215]: E0906 00:10:11.698638 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 00:10:11.701836 kubelet[3215]: E0906 00:10:11.701496 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.701836 kubelet[3215]: W0906 00:10:11.701530 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.701836 kubelet[3215]: E0906 00:10:11.701673 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 00:10:11.701836 kubelet[3215]: I0906 00:10:11.701722 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/44f64470-7851-42aa-8ed3-eb71e8151f7c-socket-dir\") pod \"csi-node-driver-47vdk\" (UID: \"44f64470-7851-42aa-8ed3-eb71e8151f7c\") " pod="calico-system/csi-node-driver-47vdk" Sep 6 00:10:11.705211 kubelet[3215]: E0906 00:10:11.704801 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.705211 kubelet[3215]: W0906 00:10:11.704837 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.705211 kubelet[3215]: E0906 00:10:11.704884 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 00:10:11.705947 kubelet[3215]: E0906 00:10:11.705685 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.706304 kubelet[3215]: W0906 00:10:11.705718 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.707579 kubelet[3215]: E0906 00:10:11.707239 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 00:10:11.707897 kubelet[3215]: E0906 00:10:11.707855 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.708005 kubelet[3215]: W0906 00:10:11.707893 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.708181 kubelet[3215]: E0906 00:10:11.708069 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 00:10:11.708181 kubelet[3215]: I0906 00:10:11.708127 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/44f64470-7851-42aa-8ed3-eb71e8151f7c-varrun\") pod \"csi-node-driver-47vdk\" (UID: \"44f64470-7851-42aa-8ed3-eb71e8151f7c\") " pod="calico-system/csi-node-driver-47vdk" Sep 6 00:10:11.711650 kubelet[3215]: E0906 00:10:11.711421 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.711650 kubelet[3215]: W0906 00:10:11.711458 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.711650 kubelet[3215]: E0906 00:10:11.711505 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 00:10:11.714281 containerd[2004]: time="2025-09-06T00:10:11.713771492Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 6 00:10:11.714281 containerd[2004]: time="2025-09-06T00:10:11.713876780Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 6 00:10:11.714281 containerd[2004]: time="2025-09-06T00:10:11.713973536Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:10:11.714682 kubelet[3215]: E0906 00:10:11.714188 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.714682 kubelet[3215]: W0906 00:10:11.714215 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.714682 kubelet[3215]: E0906 00:10:11.714337 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 00:10:11.714865 kubelet[3215]: E0906 00:10:11.714727 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.714865 kubelet[3215]: W0906 00:10:11.714746 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.714865 kubelet[3215]: E0906 00:10:11.714771 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 00:10:11.716722 kubelet[3215]: E0906 00:10:11.716091 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.716722 kubelet[3215]: W0906 00:10:11.716129 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.716722 kubelet[3215]: E0906 00:10:11.716189 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 00:10:11.717400 containerd[2004]: time="2025-09-06T00:10:11.715899620Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:10:11.766233 systemd[1]: Started cri-containerd-1b3d60023b7563f7cf382b011938c1bb2573cac4dbcaba405be6c6d6dd0bbd85.scope - libcontainer container 1b3d60023b7563f7cf382b011938c1bb2573cac4dbcaba405be6c6d6dd0bbd85. Sep 6 00:10:11.809566 kubelet[3215]: E0906 00:10:11.809510 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.809712 kubelet[3215]: W0906 00:10:11.809550 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.809712 kubelet[3215]: E0906 00:10:11.809606 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 00:10:11.811181 kubelet[3215]: E0906 00:10:11.810188 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.811181 kubelet[3215]: W0906 00:10:11.810222 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.811181 kubelet[3215]: E0906 00:10:11.810376 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 00:10:11.811856 kubelet[3215]: E0906 00:10:11.811583 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.811856 kubelet[3215]: W0906 00:10:11.811616 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.811856 kubelet[3215]: E0906 00:10:11.811663 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 00:10:11.813297 kubelet[3215]: E0906 00:10:11.813241 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.813684 kubelet[3215]: W0906 00:10:11.813454 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.813684 kubelet[3215]: E0906 00:10:11.813542 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 00:10:11.815193 kubelet[3215]: E0906 00:10:11.814294 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.815193 kubelet[3215]: W0906 00:10:11.814324 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.815193 kubelet[3215]: E0906 00:10:11.814383 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 00:10:11.815561 kubelet[3215]: E0906 00:10:11.815535 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.816081 kubelet[3215]: W0906 00:10:11.816051 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.817199 kubelet[3215]: E0906 00:10:11.816401 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 00:10:11.820190 kubelet[3215]: E0906 00:10:11.818288 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.820190 kubelet[3215]: W0906 00:10:11.818323 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.820190 kubelet[3215]: E0906 00:10:11.818387 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 00:10:11.820823 kubelet[3215]: E0906 00:10:11.820590 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.820823 kubelet[3215]: W0906 00:10:11.820620 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.820823 kubelet[3215]: E0906 00:10:11.820682 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 00:10:11.821316 kubelet[3215]: E0906 00:10:11.821287 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.821463 kubelet[3215]: W0906 00:10:11.821436 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.821617 kubelet[3215]: E0906 00:10:11.821580 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 00:10:11.822921 kubelet[3215]: E0906 00:10:11.822577 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.823248 kubelet[3215]: W0906 00:10:11.823067 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.823319 kubelet[3215]: E0906 00:10:11.823248 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 00:10:11.824660 kubelet[3215]: E0906 00:10:11.824622 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 00:10:11.825039 kubelet[3215]: W0906 00:10:11.824836 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 00:10:11.825039 kubelet[3215]: E0906 00:10:11.824912 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 6 00:10:11.825875 kubelet[3215]: E0906 00:10:11.825846 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.826015 kubelet[3215]: W0906 00:10:11.825988 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.826874 kubelet[3215]: E0906 00:10:11.826825 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.827698 kubelet[3215]: E0906 00:10:11.827381 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.827698 kubelet[3215]: W0906 00:10:11.827408 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.827698 kubelet[3215]: E0906 00:10:11.827462 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.829203 kubelet[3215]: E0906 00:10:11.828279 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.829203 kubelet[3215]: W0906 00:10:11.828307 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.829203 kubelet[3215]: E0906 00:10:11.828365 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.829565 kubelet[3215]: E0906 00:10:11.829536 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.830311 kubelet[3215]: W0906 00:10:11.829658 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.830311 kubelet[3215]: E0906 00:10:11.829727 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.831209 kubelet[3215]: E0906 00:10:11.830632 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.831394 kubelet[3215]: W0906 00:10:11.831356 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.831877 kubelet[3215]: E0906 00:10:11.831559 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.832571 kubelet[3215]: E0906 00:10:11.832066 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.833307 kubelet[3215]: W0906 00:10:11.832742 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.833307 kubelet[3215]: E0906 00:10:11.832827 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.834522 kubelet[3215]: E0906 00:10:11.833652 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.834735 kubelet[3215]: W0906 00:10:11.834697 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.835134 kubelet[3215]: E0906 00:10:11.834864 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.837207 kubelet[3215]: E0906 00:10:11.835597 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.837207 kubelet[3215]: W0906 00:10:11.835629 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.837207 kubelet[3215]: E0906 00:10:11.835708 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.837646 kubelet[3215]: E0906 00:10:11.837613 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.837780 kubelet[3215]: W0906 00:10:11.837753 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.837964 kubelet[3215]: E0906 00:10:11.837922 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.839200 kubelet[3215]: E0906 00:10:11.838458 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.839200 kubelet[3215]: W0906 00:10:11.838486 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.839200 kubelet[3215]: E0906 00:10:11.838546 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.840391 kubelet[3215]: E0906 00:10:11.840353 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.840645 kubelet[3215]: W0906 00:10:11.840549 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.841310 kubelet[3215]: E0906 00:10:11.840942 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.843210 kubelet[3215]: E0906 00:10:11.842174 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.843210 kubelet[3215]: W0906 00:10:11.842209 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.843210 kubelet[3215]: E0906 00:10:11.842274 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.844249 kubelet[3215]: E0906 00:10:11.844213 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.845098 kubelet[3215]: W0906 00:10:11.844415 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.845098 kubelet[3215]: E0906 00:10:11.844472 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.846349 kubelet[3215]: E0906 00:10:11.846308 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.846534 kubelet[3215]: W0906 00:10:11.846506 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.846724 kubelet[3215]: E0906 00:10:11.846662 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:11.901874 containerd[2004]: time="2025-09-06T00:10:11.901574397Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-gt6tt,Uid:4691c9f4-bb0a-4d23-8115-8bd51f5a8f86,Namespace:calico-system,Attempt:0,} returns sandbox id \"1b3d60023b7563f7cf382b011938c1bb2573cac4dbcaba405be6c6d6dd0bbd85\""
Sep 6 00:10:11.905862 kubelet[3215]: E0906 00:10:11.905526 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:11.905862 kubelet[3215]: W0906 00:10:11.905749 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:11.905862 kubelet[3215]: E0906 00:10:11.905784 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:13.245050 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2565519126.mount: Deactivated successfully.
Sep 6 00:10:13.409897 kubelet[3215]: E0906 00:10:13.409410 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-47vdk" podUID="44f64470-7851-42aa-8ed3-eb71e8151f7c"
Sep 6 00:10:14.750871 containerd[2004]: time="2025-09-06T00:10:14.750791111Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:10:14.753593 containerd[2004]: time="2025-09-06T00:10:14.753530423Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775"
Sep 6 00:10:14.756366 containerd[2004]: time="2025-09-06T00:10:14.756265163Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:10:14.762278 containerd[2004]: time="2025-09-06T00:10:14.762215627Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:10:14.763582 containerd[2004]: time="2025-09-06T00:10:14.763518179Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 3.196708036s"
Sep 6 00:10:14.763671 containerd[2004]: time="2025-09-06T00:10:14.763579811Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\""
Sep 6 00:10:14.767231 containerd[2004]: time="2025-09-06T00:10:14.767063231Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 6 00:10:14.792982 containerd[2004]: time="2025-09-06T00:10:14.792925319Z" level=info msg="CreateContainer within sandbox \"f2dc6c2f5b0dd0116eec7f2ea85e6d8a174979e68cf3cbbfb4a2eb5bb10d9946\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 6 00:10:14.825183 containerd[2004]: time="2025-09-06T00:10:14.824967179Z" level=info msg="CreateContainer within sandbox \"f2dc6c2f5b0dd0116eec7f2ea85e6d8a174979e68cf3cbbfb4a2eb5bb10d9946\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"871b79d83317cc503580733b99784d1c8a662d69f9c4e183a08f35a505b107d9\""
Sep 6 00:10:14.827530 containerd[2004]: time="2025-09-06T00:10:14.826336967Z" level=info msg="StartContainer for \"871b79d83317cc503580733b99784d1c8a662d69f9c4e183a08f35a505b107d9\""
Sep 6 00:10:14.883918 systemd[1]: Started cri-containerd-871b79d83317cc503580733b99784d1c8a662d69f9c4e183a08f35a505b107d9.scope - libcontainer container 871b79d83317cc503580733b99784d1c8a662d69f9c4e183a08f35a505b107d9.
Sep 6 00:10:14.960110 containerd[2004]: time="2025-09-06T00:10:14.959922132Z" level=info msg="StartContainer for \"871b79d83317cc503580733b99784d1c8a662d69f9c4e183a08f35a505b107d9\" returns successfully"
Sep 6 00:10:15.417321 kubelet[3215]: E0906 00:10:15.415594 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-47vdk" podUID="44f64470-7851-42aa-8ed3-eb71e8151f7c"
Sep 6 00:10:15.782191 kubelet[3215]: I0906 00:10:15.780745 3215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-f8887c4f5-n6rbx" podStartSLOduration=2.581198384 podStartE2EDuration="5.780720432s" podCreationTimestamp="2025-09-06 00:10:10 +0000 UTC" firstStartedPulling="2025-09-06 00:10:11.565877647 +0000 UTC m=+32.649920311" lastFinishedPulling="2025-09-06 00:10:14.765399695 +0000 UTC m=+35.849442359" observedRunningTime="2025-09-06 00:10:15.751824864 +0000 UTC m=+36.835867540" watchObservedRunningTime="2025-09-06 00:10:15.780720432 +0000 UTC m=+36.864763108"
Sep 6 00:10:15.785846 kubelet[3215]: E0906 00:10:15.785484 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.785846 kubelet[3215]: W0906 00:10:15.785755 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.787551 kubelet[3215]: E0906 00:10:15.785792 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.789587 kubelet[3215]: E0906 00:10:15.789427 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.789587 kubelet[3215]: W0906 00:10:15.789463 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.789587 kubelet[3215]: E0906 00:10:15.789520 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.791179 kubelet[3215]: E0906 00:10:15.790910 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.791179 kubelet[3215]: W0906 00:10:15.790952 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.791179 kubelet[3215]: E0906 00:10:15.791029 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.793272 kubelet[3215]: E0906 00:10:15.793237 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.794304 kubelet[3215]: W0906 00:10:15.793626 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.794304 kubelet[3215]: E0906 00:10:15.794210 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.796569 kubelet[3215]: E0906 00:10:15.796417 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.796569 kubelet[3215]: W0906 00:10:15.796449 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.796569 kubelet[3215]: E0906 00:10:15.796481 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.797623 kubelet[3215]: E0906 00:10:15.797227 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.797623 kubelet[3215]: W0906 00:10:15.797256 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.797623 kubelet[3215]: E0906 00:10:15.797284 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.798752 kubelet[3215]: E0906 00:10:15.798381 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.798752 kubelet[3215]: W0906 00:10:15.798412 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.798752 kubelet[3215]: E0906 00:10:15.798449 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.800583 kubelet[3215]: E0906 00:10:15.800396 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.800583 kubelet[3215]: W0906 00:10:15.800431 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.800583 kubelet[3215]: E0906 00:10:15.800465 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.801344 kubelet[3215]: E0906 00:10:15.801211 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.801344 kubelet[3215]: W0906 00:10:15.801238 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.801344 kubelet[3215]: E0906 00:10:15.801266 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.802255 kubelet[3215]: E0906 00:10:15.802064 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.802255 kubelet[3215]: W0906 00:10:15.802094 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.802255 kubelet[3215]: E0906 00:10:15.802123 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.803562 kubelet[3215]: E0906 00:10:15.803295 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.803562 kubelet[3215]: W0906 00:10:15.803328 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.803562 kubelet[3215]: E0906 00:10:15.803358 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.804533 kubelet[3215]: E0906 00:10:15.804396 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.804859 kubelet[3215]: W0906 00:10:15.804689 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.804859 kubelet[3215]: E0906 00:10:15.804728 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.806202 kubelet[3215]: E0906 00:10:15.805847 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.806202 kubelet[3215]: W0906 00:10:15.805879 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.806202 kubelet[3215]: E0906 00:10:15.805910 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.806774 kubelet[3215]: E0906 00:10:15.806715 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.807015 kubelet[3215]: W0906 00:10:15.806987 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.807288 kubelet[3215]: E0906 00:10:15.807164 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.809163 kubelet[3215]: E0906 00:10:15.808957 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.809163 kubelet[3215]: W0906 00:10:15.808992 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.809163 kubelet[3215]: E0906 00:10:15.809024 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.863942 kubelet[3215]: E0906 00:10:15.863704 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.863942 kubelet[3215]: W0906 00:10:15.863762 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.863942 kubelet[3215]: E0906 00:10:15.863795 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.864481 kubelet[3215]: E0906 00:10:15.864317 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.864481 kubelet[3215]: W0906 00:10:15.864340 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.864481 kubelet[3215]: E0906 00:10:15.864379 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.864974 kubelet[3215]: E0906 00:10:15.864946 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.865049 kubelet[3215]: W0906 00:10:15.864973 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.865049 kubelet[3215]: E0906 00:10:15.865014 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.865570 kubelet[3215]: E0906 00:10:15.865536 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.865570 kubelet[3215]: W0906 00:10:15.865569 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.865812 kubelet[3215]: E0906 00:10:15.865694 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.865954 kubelet[3215]: E0906 00:10:15.865924 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.866029 kubelet[3215]: W0906 00:10:15.865952 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.866318 kubelet[3215]: E0906 00:10:15.866061 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.866607 kubelet[3215]: E0906 00:10:15.866576 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.866691 kubelet[3215]: W0906 00:10:15.866607 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.866835 kubelet[3215]: E0906 00:10:15.866738 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.866997 kubelet[3215]: E0906 00:10:15.866971 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.867071 kubelet[3215]: W0906 00:10:15.866997 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.867071 kubelet[3215]: E0906 00:10:15.867033 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.867512 kubelet[3215]: E0906 00:10:15.867442 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.867512 kubelet[3215]: W0906 00:10:15.867471 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.868761 kubelet[3215]: E0906 00:10:15.867621 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.868761 kubelet[3215]: E0906 00:10:15.867870 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.868761 kubelet[3215]: W0906 00:10:15.867886 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.868761 kubelet[3215]: E0906 00:10:15.867926 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.868761 kubelet[3215]: E0906 00:10:15.868256 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.868761 kubelet[3215]: W0906 00:10:15.868273 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.868761 kubelet[3215]: E0906 00:10:15.868316 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.868761 kubelet[3215]: E0906 00:10:15.868639 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.868761 kubelet[3215]: W0906 00:10:15.868656 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.868761 kubelet[3215]: E0906 00:10:15.868688 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.869736 kubelet[3215]: E0906 00:10:15.869332 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.869736 kubelet[3215]: W0906 00:10:15.869370 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.869736 kubelet[3215]: E0906 00:10:15.869424 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.870453 kubelet[3215]: E0906 00:10:15.870419 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.870628 kubelet[3215]: W0906 00:10:15.870453 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.870628 kubelet[3215]: E0906 00:10:15.870579 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.870910 kubelet[3215]: E0906 00:10:15.870884 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.870995 kubelet[3215]: W0906 00:10:15.870942 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.870995 kubelet[3215]: E0906 00:10:15.870981 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.871589 kubelet[3215]: E0906 00:10:15.871559 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.871589 kubelet[3215]: W0906 00:10:15.871589 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.871754 kubelet[3215]: E0906 00:10:15.871627 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.873210 kubelet[3215]: E0906 00:10:15.872701 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.873210 kubelet[3215]: W0906 00:10:15.872731 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.873210 kubelet[3215]: E0906 00:10:15.872778 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.873417 kubelet[3215]: E0906 00:10:15.873243 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.873417 kubelet[3215]: W0906 00:10:15.873266 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.873417 kubelet[3215]: E0906 00:10:15.873291 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:15.874341 kubelet[3215]: E0906 00:10:15.874312 3215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 00:10:15.874487 kubelet[3215]: W0906 00:10:15.874464 3215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 00:10:15.874643 kubelet[3215]: E0906 00:10:15.874592 3215 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 6 00:10:16.249011 containerd[2004]: time="2025-09-06T00:10:16.248947654Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:10:16.250874 containerd[2004]: time="2025-09-06T00:10:16.250794718Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814"
Sep 6 00:10:16.253188 containerd[2004]: time="2025-09-06T00:10:16.253061422Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:10:16.259112 containerd[2004]: time="2025-09-06T00:10:16.258548854Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:10:16.260224 containerd[2004]: time="2025-09-06T00:10:16.260109106Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.492977847s"
Sep 6 00:10:16.260339 containerd[2004]: time="2025-09-06T00:10:16.260236294Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\""
Sep 6 00:10:16.266795 containerd[2004]: time="2025-09-06T00:10:16.266735446Z" level=info msg="CreateContainer within sandbox \"1b3d60023b7563f7cf382b011938c1bb2573cac4dbcaba405be6c6d6dd0bbd85\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 6 00:10:16.309083 containerd[2004]: time="2025-09-06T00:10:16.309027035Z" level=info msg="CreateContainer within sandbox \"1b3d60023b7563f7cf382b011938c1bb2573cac4dbcaba405be6c6d6dd0bbd85\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"972ab47352b7ba4468db7c55a9ee3c0bde6e9765f0731e6d0b69df267a1ae824\"" Sep 6 00:10:16.311231 containerd[2004]: time="2025-09-06T00:10:16.310380863Z" level=info msg="StartContainer for \"972ab47352b7ba4468db7c55a9ee3c0bde6e9765f0731e6d0b69df267a1ae824\"" Sep 6 00:10:16.374487 systemd[1]: Started cri-containerd-972ab47352b7ba4468db7c55a9ee3c0bde6e9765f0731e6d0b69df267a1ae824.scope - libcontainer container 972ab47352b7ba4468db7c55a9ee3c0bde6e9765f0731e6d0b69df267a1ae824. Sep 6 00:10:16.433219 containerd[2004]: time="2025-09-06T00:10:16.433123019Z" level=info msg="StartContainer for \"972ab47352b7ba4468db7c55a9ee3c0bde6e9765f0731e6d0b69df267a1ae824\" returns successfully" Sep 6 00:10:16.481219 systemd[1]: cri-containerd-972ab47352b7ba4468db7c55a9ee3c0bde6e9765f0731e6d0b69df267a1ae824.scope: Deactivated successfully. 
Sep 6 00:10:16.659223 containerd[2004]: time="2025-09-06T00:10:16.659018436Z" level=info msg="shim disconnected" id=972ab47352b7ba4468db7c55a9ee3c0bde6e9765f0731e6d0b69df267a1ae824 namespace=k8s.io Sep 6 00:10:16.660313 containerd[2004]: time="2025-09-06T00:10:16.660221196Z" level=warning msg="cleaning up after shim disconnected" id=972ab47352b7ba4468db7c55a9ee3c0bde6e9765f0731e6d0b69df267a1ae824 namespace=k8s.io Sep 6 00:10:16.660547 containerd[2004]: time="2025-09-06T00:10:16.660366060Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 6 00:10:16.736911 containerd[2004]: time="2025-09-06T00:10:16.736444933Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 6 00:10:16.775954 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-972ab47352b7ba4468db7c55a9ee3c0bde6e9765f0731e6d0b69df267a1ae824-rootfs.mount: Deactivated successfully. Sep 6 00:10:17.411369 kubelet[3215]: E0906 00:10:17.411281 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-47vdk" podUID="44f64470-7851-42aa-8ed3-eb71e8151f7c" Sep 6 00:10:19.410108 kubelet[3215]: E0906 00:10:19.409623 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-47vdk" podUID="44f64470-7851-42aa-8ed3-eb71e8151f7c" Sep 6 00:10:20.513313 containerd[2004]: time="2025-09-06T00:10:20.513181851Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:20.515724 containerd[2004]: time="2025-09-06T00:10:20.515032791Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 6 00:10:20.515724 containerd[2004]: time="2025-09-06T00:10:20.515660547Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:20.520241 containerd[2004]: time="2025-09-06T00:10:20.519653392Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:20.522486 containerd[2004]: time="2025-09-06T00:10:20.521453128Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 3.784941667s" Sep 6 00:10:20.522486 containerd[2004]: time="2025-09-06T00:10:20.521512708Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 6 00:10:20.528205 containerd[2004]: time="2025-09-06T00:10:20.528101548Z" level=info msg="CreateContainer within sandbox \"1b3d60023b7563f7cf382b011938c1bb2573cac4dbcaba405be6c6d6dd0bbd85\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 6 00:10:20.551174 containerd[2004]: time="2025-09-06T00:10:20.549419812Z" level=info msg="CreateContainer within sandbox \"1b3d60023b7563f7cf382b011938c1bb2573cac4dbcaba405be6c6d6dd0bbd85\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"897b83fa6006d17c0f4535841ba26631d6a0cbf08ee9b51ce75148bc25779eb3\"" Sep 6 00:10:20.553344 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4115108336.mount: Deactivated 
successfully. Sep 6 00:10:20.555389 containerd[2004]: time="2025-09-06T00:10:20.553316824Z" level=info msg="StartContainer for \"897b83fa6006d17c0f4535841ba26631d6a0cbf08ee9b51ce75148bc25779eb3\"" Sep 6 00:10:20.614894 systemd[1]: Started cri-containerd-897b83fa6006d17c0f4535841ba26631d6a0cbf08ee9b51ce75148bc25779eb3.scope - libcontainer container 897b83fa6006d17c0f4535841ba26631d6a0cbf08ee9b51ce75148bc25779eb3. Sep 6 00:10:20.670336 containerd[2004]: time="2025-09-06T00:10:20.670263856Z" level=info msg="StartContainer for \"897b83fa6006d17c0f4535841ba26631d6a0cbf08ee9b51ce75148bc25779eb3\" returns successfully" Sep 6 00:10:21.408458 kubelet[3215]: E0906 00:10:21.408389 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-47vdk" podUID="44f64470-7851-42aa-8ed3-eb71e8151f7c" Sep 6 00:10:21.776133 containerd[2004]: time="2025-09-06T00:10:21.776068794Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 6 00:10:21.781741 systemd[1]: cri-containerd-897b83fa6006d17c0f4535841ba26631d6a0cbf08ee9b51ce75148bc25779eb3.scope: Deactivated successfully. Sep 6 00:10:21.829030 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-897b83fa6006d17c0f4535841ba26631d6a0cbf08ee9b51ce75148bc25779eb3-rootfs.mount: Deactivated successfully. 
Sep 6 00:10:21.881570 kubelet[3215]: I0906 00:10:21.880440 3215 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 6 00:10:21.956048 systemd[1]: Created slice kubepods-burstable-pode1a579ff_cf0a_431d_b0c4_8c9feaa7cd01.slice - libcontainer container kubepods-burstable-pode1a579ff_cf0a_431d_b0c4_8c9feaa7cd01.slice. Sep 6 00:10:21.988965 systemd[1]: Created slice kubepods-besteffort-podaeae33f4_c2c1_4f17_a617_26feec2636ec.slice - libcontainer container kubepods-besteffort-podaeae33f4_c2c1_4f17_a617_26feec2636ec.slice. Sep 6 00:10:22.013905 systemd[1]: Created slice kubepods-burstable-podb8081eb5_874c_4704_87a1_8d99ed0c3d28.slice - libcontainer container kubepods-burstable-podb8081eb5_874c_4704_87a1_8d99ed0c3d28.slice. Sep 6 00:10:22.025718 kubelet[3215]: I0906 00:10:22.024606 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1a579ff-cf0a-431d-b0c4-8c9feaa7cd01-config-volume\") pod \"coredns-668d6bf9bc-wrl6g\" (UID: \"e1a579ff-cf0a-431d-b0c4-8c9feaa7cd01\") " pod="kube-system/coredns-668d6bf9bc-wrl6g" Sep 6 00:10:22.028571 kubelet[3215]: I0906 00:10:22.028398 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aeae33f4-c2c1-4f17-a617-26feec2636ec-tigera-ca-bundle\") pod \"calico-kube-controllers-5cf8b65958-8qctx\" (UID: \"aeae33f4-c2c1-4f17-a617-26feec2636ec\") " pod="calico-system/calico-kube-controllers-5cf8b65958-8qctx" Sep 6 00:10:22.030160 kubelet[3215]: I0906 00:10:22.029290 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whnfn\" (UniqueName: \"kubernetes.io/projected/e1a579ff-cf0a-431d-b0c4-8c9feaa7cd01-kube-api-access-whnfn\") pod \"coredns-668d6bf9bc-wrl6g\" (UID: \"e1a579ff-cf0a-431d-b0c4-8c9feaa7cd01\") " 
pod="kube-system/coredns-668d6bf9bc-wrl6g" Sep 6 00:10:22.031534 kubelet[3215]: I0906 00:10:22.031284 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bv7v\" (UniqueName: \"kubernetes.io/projected/aeae33f4-c2c1-4f17-a617-26feec2636ec-kube-api-access-9bv7v\") pod \"calico-kube-controllers-5cf8b65958-8qctx\" (UID: \"aeae33f4-c2c1-4f17-a617-26feec2636ec\") " pod="calico-system/calico-kube-controllers-5cf8b65958-8qctx" Sep 6 00:10:22.050626 systemd[1]: Created slice kubepods-besteffort-pod6141a3f9_cce3_4c89_a723_17bc0b75f562.slice - libcontainer container kubepods-besteffort-pod6141a3f9_cce3_4c89_a723_17bc0b75f562.slice. Sep 6 00:10:22.079270 systemd[1]: Created slice kubepods-besteffort-podb640f503_9253_4063_8780_3c12db4c2e29.slice - libcontainer container kubepods-besteffort-podb640f503_9253_4063_8780_3c12db4c2e29.slice. Sep 6 00:10:22.100282 systemd[1]: Created slice kubepods-besteffort-pod9733554c_54e5_4004_bf21_b3d55e01bf9c.slice - libcontainer container kubepods-besteffort-pod9733554c_54e5_4004_bf21_b3d55e01bf9c.slice. Sep 6 00:10:22.120509 systemd[1]: Created slice kubepods-besteffort-pod769bd715_8e93_4657_b3c2_d5c87457529b.slice - libcontainer container kubepods-besteffort-pod769bd715_8e93_4657_b3c2_d5c87457529b.slice. 
Sep 6 00:10:22.131956 kubelet[3215]: I0906 00:10:22.131886 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b640f503-9253-4063-8780-3c12db4c2e29-config\") pod \"goldmane-54d579b49d-fdn4p\" (UID: \"b640f503-9253-4063-8780-3c12db4c2e29\") " pod="calico-system/goldmane-54d579b49d-fdn4p" Sep 6 00:10:22.132123 kubelet[3215]: I0906 00:10:22.131987 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/769bd715-8e93-4657-b3c2-d5c87457529b-whisker-ca-bundle\") pod \"whisker-7c8bff6bcd-nk2q2\" (UID: \"769bd715-8e93-4657-b3c2-d5c87457529b\") " pod="calico-system/whisker-7c8bff6bcd-nk2q2" Sep 6 00:10:22.132123 kubelet[3215]: I0906 00:10:22.132060 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8081eb5-874c-4704-87a1-8d99ed0c3d28-config-volume\") pod \"coredns-668d6bf9bc-cfg8t\" (UID: \"b8081eb5-874c-4704-87a1-8d99ed0c3d28\") " pod="kube-system/coredns-668d6bf9bc-cfg8t" Sep 6 00:10:22.132123 kubelet[3215]: I0906 00:10:22.132107 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b640f503-9253-4063-8780-3c12db4c2e29-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-fdn4p\" (UID: \"b640f503-9253-4063-8780-3c12db4c2e29\") " pod="calico-system/goldmane-54d579b49d-fdn4p" Sep 6 00:10:22.132355 kubelet[3215]: I0906 00:10:22.132179 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/769bd715-8e93-4657-b3c2-d5c87457529b-whisker-backend-key-pair\") pod \"whisker-7c8bff6bcd-nk2q2\" (UID: \"769bd715-8e93-4657-b3c2-d5c87457529b\") " 
pod="calico-system/whisker-7c8bff6bcd-nk2q2" Sep 6 00:10:22.132355 kubelet[3215]: I0906 00:10:22.132229 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljt8l\" (UniqueName: \"kubernetes.io/projected/6141a3f9-cce3-4c89-a723-17bc0b75f562-kube-api-access-ljt8l\") pod \"calico-apiserver-8966f54db-8hj4r\" (UID: \"6141a3f9-cce3-4c89-a723-17bc0b75f562\") " pod="calico-apiserver/calico-apiserver-8966f54db-8hj4r" Sep 6 00:10:22.132355 kubelet[3215]: I0906 00:10:22.132268 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/b640f503-9253-4063-8780-3c12db4c2e29-goldmane-key-pair\") pod \"goldmane-54d579b49d-fdn4p\" (UID: \"b640f503-9253-4063-8780-3c12db4c2e29\") " pod="calico-system/goldmane-54d579b49d-fdn4p" Sep 6 00:10:22.132355 kubelet[3215]: I0906 00:10:22.132352 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsn7q\" (UniqueName: \"kubernetes.io/projected/b640f503-9253-4063-8780-3c12db4c2e29-kube-api-access-gsn7q\") pod \"goldmane-54d579b49d-fdn4p\" (UID: \"b640f503-9253-4063-8780-3c12db4c2e29\") " pod="calico-system/goldmane-54d579b49d-fdn4p" Sep 6 00:10:22.132579 kubelet[3215]: I0906 00:10:22.132399 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp4dm\" (UniqueName: \"kubernetes.io/projected/b8081eb5-874c-4704-87a1-8d99ed0c3d28-kube-api-access-rp4dm\") pod \"coredns-668d6bf9bc-cfg8t\" (UID: \"b8081eb5-874c-4704-87a1-8d99ed0c3d28\") " pod="kube-system/coredns-668d6bf9bc-cfg8t" Sep 6 00:10:22.132579 kubelet[3215]: I0906 00:10:22.132446 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/9733554c-54e5-4004-bf21-b3d55e01bf9c-calico-apiserver-certs\") pod 
\"calico-apiserver-8966f54db-5528p\" (UID: \"9733554c-54e5-4004-bf21-b3d55e01bf9c\") " pod="calico-apiserver/calico-apiserver-8966f54db-5528p" Sep 6 00:10:22.132579 kubelet[3215]: I0906 00:10:22.132482 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6141a3f9-cce3-4c89-a723-17bc0b75f562-calico-apiserver-certs\") pod \"calico-apiserver-8966f54db-8hj4r\" (UID: \"6141a3f9-cce3-4c89-a723-17bc0b75f562\") " pod="calico-apiserver/calico-apiserver-8966f54db-8hj4r" Sep 6 00:10:22.132579 kubelet[3215]: I0906 00:10:22.132520 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxr8l\" (UniqueName: \"kubernetes.io/projected/769bd715-8e93-4657-b3c2-d5c87457529b-kube-api-access-jxr8l\") pod \"whisker-7c8bff6bcd-nk2q2\" (UID: \"769bd715-8e93-4657-b3c2-d5c87457529b\") " pod="calico-system/whisker-7c8bff6bcd-nk2q2" Sep 6 00:10:22.132891 kubelet[3215]: I0906 00:10:22.132630 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cphvk\" (UniqueName: \"kubernetes.io/projected/9733554c-54e5-4004-bf21-b3d55e01bf9c-kube-api-access-cphvk\") pod \"calico-apiserver-8966f54db-5528p\" (UID: \"9733554c-54e5-4004-bf21-b3d55e01bf9c\") " pod="calico-apiserver/calico-apiserver-8966f54db-5528p" Sep 6 00:10:22.298446 containerd[2004]: time="2025-09-06T00:10:22.297490684Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-wrl6g,Uid:e1a579ff-cf0a-431d-b0c4-8c9feaa7cd01,Namespace:kube-system,Attempt:0,}" Sep 6 00:10:22.327502 containerd[2004]: time="2025-09-06T00:10:22.327416800Z" level=info msg="shim disconnected" id=897b83fa6006d17c0f4535841ba26631d6a0cbf08ee9b51ce75148bc25779eb3 namespace=k8s.io Sep 6 00:10:22.327683 containerd[2004]: time="2025-09-06T00:10:22.327533812Z" level=warning msg="cleaning up after shim disconnected" 
id=897b83fa6006d17c0f4535841ba26631d6a0cbf08ee9b51ce75148bc25779eb3 namespace=k8s.io Sep 6 00:10:22.327683 containerd[2004]: time="2025-09-06T00:10:22.327603880Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 6 00:10:22.341598 containerd[2004]: time="2025-09-06T00:10:22.338744093Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5cf8b65958-8qctx,Uid:aeae33f4-c2c1-4f17-a617-26feec2636ec,Namespace:calico-system,Attempt:0,}" Sep 6 00:10:22.395752 containerd[2004]: time="2025-09-06T00:10:22.395495957Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-fdn4p,Uid:b640f503-9253-4063-8780-3c12db4c2e29,Namespace:calico-system,Attempt:0,}" Sep 6 00:10:22.411274 containerd[2004]: time="2025-09-06T00:10:22.411193301Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8966f54db-5528p,Uid:9733554c-54e5-4004-bf21-b3d55e01bf9c,Namespace:calico-apiserver,Attempt:0,}" Sep 6 00:10:22.427991 containerd[2004]: time="2025-09-06T00:10:22.427933373Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7c8bff6bcd-nk2q2,Uid:769bd715-8e93-4657-b3c2-d5c87457529b,Namespace:calico-system,Attempt:0,}" Sep 6 00:10:22.640355 containerd[2004]: time="2025-09-06T00:10:22.639595578Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-cfg8t,Uid:b8081eb5-874c-4704-87a1-8d99ed0c3d28,Namespace:kube-system,Attempt:0,}" Sep 6 00:10:22.666660 containerd[2004]: time="2025-09-06T00:10:22.666289914Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8966f54db-8hj4r,Uid:6141a3f9-cce3-4c89-a723-17bc0b75f562,Namespace:calico-apiserver,Attempt:0,}" Sep 6 00:10:22.689384 containerd[2004]: time="2025-09-06T00:10:22.689306358Z" level=error msg="Failed to destroy network for sandbox \"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:22.695682 containerd[2004]: time="2025-09-06T00:10:22.695605194Z" level=error msg="encountered an error cleaning up failed sandbox \"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:22.695970 containerd[2004]: time="2025-09-06T00:10:22.695923842Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-wrl6g,Uid:e1a579ff-cf0a-431d-b0c4-8c9feaa7cd01,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:22.696475 kubelet[3215]: E0906 00:10:22.696392 3215 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:22.697440 kubelet[3215]: E0906 00:10:22.697249 3215 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-668d6bf9bc-wrl6g" Sep 6 00:10:22.697670 kubelet[3215]: E0906 00:10:22.697630 3215 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-wrl6g" Sep 6 00:10:22.700697 kubelet[3215]: E0906 00:10:22.697993 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-wrl6g_kube-system(e1a579ff-cf0a-431d-b0c4-8c9feaa7cd01)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-wrl6g_kube-system(e1a579ff-cf0a-431d-b0c4-8c9feaa7cd01)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-wrl6g" podUID="e1a579ff-cf0a-431d-b0c4-8c9feaa7cd01" Sep 6 00:10:22.738641 containerd[2004]: time="2025-09-06T00:10:22.738548407Z" level=error msg="Failed to destroy network for sandbox \"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:22.741651 containerd[2004]: time="2025-09-06T00:10:22.741585715Z" level=error msg="encountered an error cleaning up failed sandbox \"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:22.741883 containerd[2004]: time="2025-09-06T00:10:22.741841363Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5cf8b65958-8qctx,Uid:aeae33f4-c2c1-4f17-a617-26feec2636ec,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:22.745294 kubelet[3215]: E0906 00:10:22.743564 3215 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:22.745294 kubelet[3215]: E0906 00:10:22.743672 3215 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5cf8b65958-8qctx" Sep 6 00:10:22.745294 kubelet[3215]: E0906 00:10:22.743731 3215 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5cf8b65958-8qctx" Sep 6 00:10:22.745598 kubelet[3215]: E0906 00:10:22.743927 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5cf8b65958-8qctx_calico-system(aeae33f4-c2c1-4f17-a617-26feec2636ec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5cf8b65958-8qctx_calico-system(aeae33f4-c2c1-4f17-a617-26feec2636ec)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5cf8b65958-8qctx" podUID="aeae33f4-c2c1-4f17-a617-26feec2636ec" Sep 6 00:10:22.763109 kubelet[3215]: I0906 00:10:22.763058 3215 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Sep 6 00:10:22.770811 containerd[2004]: time="2025-09-06T00:10:22.770747875Z" level=info msg="StopPodSandbox for \"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\"" Sep 6 00:10:22.771586 containerd[2004]: time="2025-09-06T00:10:22.771544003Z" level=info msg="Ensure that sandbox 43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0 in task-service has been cleanup successfully" Sep 6 00:10:22.777253 containerd[2004]: time="2025-09-06T00:10:22.775704223Z" level=error msg="Failed to destroy network for sandbox \"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:22.789575 containerd[2004]: time="2025-09-06T00:10:22.789455791Z" level=error msg="encountered an error cleaning up failed sandbox \"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:22.789575 containerd[2004]: time="2025-09-06T00:10:22.789558883Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-fdn4p,Uid:b640f503-9253-4063-8780-3c12db4c2e29,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:22.790457 kubelet[3215]: E0906 00:10:22.790367 3215 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:22.790603 kubelet[3215]: E0906 00:10:22.790465 3215 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-fdn4p" Sep 6 
00:10:22.790603 kubelet[3215]: E0906 00:10:22.790501 3215 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-fdn4p" Sep 6 00:10:22.790603 kubelet[3215]: E0906 00:10:22.790560 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-fdn4p_calico-system(b640f503-9253-4063-8780-3c12db4c2e29)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-fdn4p_calico-system(b640f503-9253-4063-8780-3c12db4c2e29)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-fdn4p" podUID="b640f503-9253-4063-8780-3c12db4c2e29" Sep 6 00:10:22.795693 containerd[2004]: time="2025-09-06T00:10:22.795261487Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 6 00:10:22.801641 kubelet[3215]: I0906 00:10:22.801556 3215 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Sep 6 00:10:22.804959 containerd[2004]: time="2025-09-06T00:10:22.804578119Z" level=info msg="StopPodSandbox for \"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\"" Sep 6 00:10:22.811034 containerd[2004]: time="2025-09-06T00:10:22.810472051Z" level=info msg="Ensure that sandbox 
f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf in task-service has been cleanup successfully" Sep 6 00:10:22.899013 containerd[2004]: time="2025-09-06T00:10:22.898865287Z" level=error msg="Failed to destroy network for sandbox \"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:22.902761 containerd[2004]: time="2025-09-06T00:10:22.901342015Z" level=error msg="Failed to destroy network for sandbox \"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:22.907853 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57-shm.mount: Deactivated successfully. 
Sep 6 00:10:22.913115 containerd[2004]: time="2025-09-06T00:10:22.913012411Z" level=error msg="encountered an error cleaning up failed sandbox \"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:22.916185 containerd[2004]: time="2025-09-06T00:10:22.915126739Z" level=error msg="encountered an error cleaning up failed sandbox \"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:22.916185 containerd[2004]: time="2025-09-06T00:10:22.915273091Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7c8bff6bcd-nk2q2,Uid:769bd715-8e93-4657-b3c2-d5c87457529b,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:22.916957 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f-shm.mount: Deactivated successfully. 
Sep 6 00:10:22.917821 kubelet[3215]: E0906 00:10:22.917772 3215 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:22.918195 kubelet[3215]: E0906 00:10:22.918091 3215 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7c8bff6bcd-nk2q2" Sep 6 00:10:22.918681 kubelet[3215]: E0906 00:10:22.918337 3215 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7c8bff6bcd-nk2q2" Sep 6 00:10:22.918681 kubelet[3215]: E0906 00:10:22.918448 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7c8bff6bcd-nk2q2_calico-system(769bd715-8e93-4657-b3c2-d5c87457529b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7c8bff6bcd-nk2q2_calico-system(769bd715-8e93-4657-b3c2-d5c87457529b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7c8bff6bcd-nk2q2" podUID="769bd715-8e93-4657-b3c2-d5c87457529b" Sep 6 00:10:22.922021 containerd[2004]: time="2025-09-06T00:10:22.921215047Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8966f54db-5528p,Uid:9733554c-54e5-4004-bf21-b3d55e01bf9c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:22.922259 kubelet[3215]: E0906 00:10:22.921507 3215 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:22.922349 kubelet[3215]: E0906 00:10:22.922235 3215 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8966f54db-5528p" Sep 6 00:10:22.922349 kubelet[3215]: E0906 00:10:22.922313 3215 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8966f54db-5528p" Sep 6 00:10:22.922461 kubelet[3215]: E0906 00:10:22.922377 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8966f54db-5528p_calico-apiserver(9733554c-54e5-4004-bf21-b3d55e01bf9c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8966f54db-5528p_calico-apiserver(9733554c-54e5-4004-bf21-b3d55e01bf9c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8966f54db-5528p" podUID="9733554c-54e5-4004-bf21-b3d55e01bf9c" Sep 6 00:10:23.004893 containerd[2004]: time="2025-09-06T00:10:23.004568704Z" level=error msg="StopPodSandbox for \"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\" failed" error="failed to destroy network for sandbox \"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:23.005589 kubelet[3215]: E0906 00:10:23.005232 3215 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Sep 6 00:10:23.005589 kubelet[3215]: E0906 00:10:23.005319 3215 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0"} Sep 6 00:10:23.005589 kubelet[3215]: E0906 00:10:23.005408 3215 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"aeae33f4-c2c1-4f17-a617-26feec2636ec\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 6 00:10:23.005589 kubelet[3215]: E0906 00:10:23.005448 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"aeae33f4-c2c1-4f17-a617-26feec2636ec\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5cf8b65958-8qctx" podUID="aeae33f4-c2c1-4f17-a617-26feec2636ec" Sep 6 00:10:23.024921 containerd[2004]: time="2025-09-06T00:10:23.024723304Z" level=error msg="StopPodSandbox for \"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\" failed" error="failed to destroy network for sandbox \"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Sep 6 00:10:23.025388 kubelet[3215]: E0906 00:10:23.025325 3215 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Sep 6 00:10:23.025388 kubelet[3215]: E0906 00:10:23.025399 3215 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf"} Sep 6 00:10:23.025638 kubelet[3215]: E0906 00:10:23.025465 3215 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"e1a579ff-cf0a-431d-b0c4-8c9feaa7cd01\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 6 00:10:23.025638 kubelet[3215]: E0906 00:10:23.025505 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"e1a579ff-cf0a-431d-b0c4-8c9feaa7cd01\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-wrl6g" podUID="e1a579ff-cf0a-431d-b0c4-8c9feaa7cd01" Sep 6 00:10:23.043867 
containerd[2004]: time="2025-09-06T00:10:23.042304060Z" level=error msg="Failed to destroy network for sandbox \"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:23.047310 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769-shm.mount: Deactivated successfully. Sep 6 00:10:23.048865 containerd[2004]: time="2025-09-06T00:10:23.047225656Z" level=error msg="encountered an error cleaning up failed sandbox \"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:23.050761 containerd[2004]: time="2025-09-06T00:10:23.049355152Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-cfg8t,Uid:b8081eb5-874c-4704-87a1-8d99ed0c3d28,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:23.051579 kubelet[3215]: E0906 00:10:23.051413 3215 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 
00:10:23.051579 kubelet[3215]: E0906 00:10:23.051496 3215 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-cfg8t" Sep 6 00:10:23.051579 kubelet[3215]: E0906 00:10:23.051538 3215 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-cfg8t" Sep 6 00:10:23.054053 kubelet[3215]: E0906 00:10:23.051609 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-cfg8t_kube-system(b8081eb5-874c-4704-87a1-8d99ed0c3d28)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-cfg8t_kube-system(b8081eb5-874c-4704-87a1-8d99ed0c3d28)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-cfg8t" podUID="b8081eb5-874c-4704-87a1-8d99ed0c3d28" Sep 6 00:10:23.060302 containerd[2004]: time="2025-09-06T00:10:23.060211240Z" level=error msg="Failed to destroy network for sandbox \"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\"" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:23.062695 containerd[2004]: time="2025-09-06T00:10:23.062589748Z" level=error msg="encountered an error cleaning up failed sandbox \"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:23.062798 containerd[2004]: time="2025-09-06T00:10:23.062715028Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8966f54db-8hj4r,Uid:6141a3f9-cce3-4c89-a723-17bc0b75f562,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:23.065018 kubelet[3215]: E0906 00:10:23.064324 3215 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:23.065018 kubelet[3215]: E0906 00:10:23.064399 3215 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8966f54db-8hj4r" Sep 6 00:10:23.065018 kubelet[3215]: E0906 00:10:23.064432 3215 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8966f54db-8hj4r" Sep 6 00:10:23.065414 kubelet[3215]: E0906 00:10:23.064499 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8966f54db-8hj4r_calico-apiserver(6141a3f9-cce3-4c89-a723-17bc0b75f562)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8966f54db-8hj4r_calico-apiserver(6141a3f9-cce3-4c89-a723-17bc0b75f562)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8966f54db-8hj4r" podUID="6141a3f9-cce3-4c89-a723-17bc0b75f562" Sep 6 00:10:23.066599 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4-shm.mount: Deactivated successfully. Sep 6 00:10:23.420786 systemd[1]: Created slice kubepods-besteffort-pod44f64470_7851_42aa_8ed3_eb71e8151f7c.slice - libcontainer container kubepods-besteffort-pod44f64470_7851_42aa_8ed3_eb71e8151f7c.slice. 
Sep 6 00:10:23.425984 containerd[2004]: time="2025-09-06T00:10:23.425922630Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-47vdk,Uid:44f64470-7851-42aa-8ed3-eb71e8151f7c,Namespace:calico-system,Attempt:0,}" Sep 6 00:10:23.532579 containerd[2004]: time="2025-09-06T00:10:23.532407918Z" level=error msg="Failed to destroy network for sandbox \"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:23.533367 containerd[2004]: time="2025-09-06T00:10:23.533126370Z" level=error msg="encountered an error cleaning up failed sandbox \"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:23.533367 containerd[2004]: time="2025-09-06T00:10:23.533250006Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-47vdk,Uid:44f64470-7851-42aa-8ed3-eb71e8151f7c,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 00:10:23.533870 kubelet[3215]: E0906 00:10:23.533807 3215 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Sep 6 00:10:23.533971 kubelet[3215]: E0906 00:10:23.533902 3215 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-47vdk" Sep 6 00:10:23.533971 kubelet[3215]: E0906 00:10:23.533938 3215 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-47vdk" Sep 6 00:10:23.534092 kubelet[3215]: E0906 00:10:23.534008 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-47vdk_calico-system(44f64470-7851-42aa-8ed3-eb71e8151f7c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-47vdk_calico-system(44f64470-7851-42aa-8ed3-eb71e8151f7c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-47vdk" podUID="44f64470-7851-42aa-8ed3-eb71e8151f7c" Sep 6 00:10:23.805849 kubelet[3215]: I0906 00:10:23.805797 3215 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" Sep 6 00:10:23.809365 containerd[2004]: time="2025-09-06T00:10:23.809224172Z" level=info msg="StopPodSandbox for \"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\"" Sep 6 00:10:23.809961 containerd[2004]: time="2025-09-06T00:10:23.809531168Z" level=info msg="Ensure that sandbox 4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57 in task-service has been cleanup successfully" Sep 6 00:10:23.812231 kubelet[3215]: I0906 00:10:23.811127 3215 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Sep 6 00:10:23.815982 containerd[2004]: time="2025-09-06T00:10:23.814320176Z" level=info msg="StopPodSandbox for \"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\"" Sep 6 00:10:23.815982 containerd[2004]: time="2025-09-06T00:10:23.814620284Z" level=info msg="Ensure that sandbox 635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba in task-service has been cleanup successfully" Sep 6 00:10:23.819082 kubelet[3215]: I0906 00:10:23.818860 3215 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Sep 6 00:10:23.822927 containerd[2004]: time="2025-09-06T00:10:23.822781808Z" level=info msg="StopPodSandbox for \"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\"" Sep 6 00:10:23.824809 containerd[2004]: time="2025-09-06T00:10:23.824335760Z" level=info msg="Ensure that sandbox 18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55 in task-service has been cleanup successfully" Sep 6 00:10:23.827312 kubelet[3215]: I0906 00:10:23.825765 3215 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Sep 6 00:10:23.837408 containerd[2004]: 
time="2025-09-06T00:10:23.829704008Z" level=info msg="StopPodSandbox for \"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\"" Sep 6 00:10:23.838514 containerd[2004]: time="2025-09-06T00:10:23.838191752Z" level=info msg="Ensure that sandbox 4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4 in task-service has been cleanup successfully" Sep 6 00:10:23.843770 kubelet[3215]: I0906 00:10:23.842783 3215 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Sep 6 00:10:23.846835 containerd[2004]: time="2025-09-06T00:10:23.846732488Z" level=info msg="StopPodSandbox for \"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\"" Sep 6 00:10:23.847579 containerd[2004]: time="2025-09-06T00:10:23.847494488Z" level=info msg="Ensure that sandbox 20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769 in task-service has been cleanup successfully" Sep 6 00:10:23.857929 kubelet[3215]: I0906 00:10:23.857303 3215 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Sep 6 00:10:23.858278 containerd[2004]: time="2025-09-06T00:10:23.858212876Z" level=info msg="StopPodSandbox for \"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\"" Sep 6 00:10:23.858569 containerd[2004]: time="2025-09-06T00:10:23.858511916Z" level=info msg="Ensure that sandbox e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f in task-service has been cleanup successfully" Sep 6 00:10:23.966347 containerd[2004]: time="2025-09-06T00:10:23.966281745Z" level=error msg="StopPodSandbox for \"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\" failed" error="failed to destroy network for sandbox \"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 6 00:10:23.966938 kubelet[3215]: E0906 00:10:23.966869 3215 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769"
Sep 6 00:10:23.967096 kubelet[3215]: E0906 00:10:23.966944 3215 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769"}
Sep 6 00:10:23.967096 kubelet[3215]: E0906 00:10:23.967005 3215 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"b8081eb5-874c-4704-87a1-8d99ed0c3d28\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Sep 6 00:10:23.967096 kubelet[3215]: E0906 00:10:23.967044 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"b8081eb5-874c-4704-87a1-8d99ed0c3d28\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-cfg8t" podUID="b8081eb5-874c-4704-87a1-8d99ed0c3d28"
Sep 6 00:10:23.969858 containerd[2004]: time="2025-09-06T00:10:23.969786117Z" level=error msg="StopPodSandbox for \"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\" failed" error="failed to destroy network for sandbox \"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 6 00:10:23.970292 kubelet[3215]: E0906 00:10:23.970161 3215 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4"
Sep 6 00:10:23.970292 kubelet[3215]: E0906 00:10:23.970252 3215 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4"}
Sep 6 00:10:23.970292 kubelet[3215]: E0906 00:10:23.970309 3215 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"6141a3f9-cce3-4c89-a723-17bc0b75f562\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Sep 6 00:10:23.970292 kubelet[3215]: E0906 00:10:23.970356 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"6141a3f9-cce3-4c89-a723-17bc0b75f562\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8966f54db-8hj4r" podUID="6141a3f9-cce3-4c89-a723-17bc0b75f562"
Sep 6 00:10:24.010837 containerd[2004]: time="2025-09-06T00:10:24.009374885Z" level=error msg="StopPodSandbox for \"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\" failed" error="failed to destroy network for sandbox \"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 6 00:10:24.012194 kubelet[3215]: E0906 00:10:24.011826 3215 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba"
Sep 6 00:10:24.012194 kubelet[3215]: E0906 00:10:24.011919 3215 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba"}
Sep 6 00:10:24.012194 kubelet[3215]: E0906 00:10:24.011973 3215 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"b640f503-9253-4063-8780-3c12db4c2e29\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Sep 6 00:10:24.012194 kubelet[3215]: E0906 00:10:24.012014 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"b640f503-9253-4063-8780-3c12db4c2e29\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-fdn4p" podUID="b640f503-9253-4063-8780-3c12db4c2e29"
Sep 6 00:10:24.017982 containerd[2004]: time="2025-09-06T00:10:24.017666597Z" level=error msg="StopPodSandbox for \"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\" failed" error="failed to destroy network for sandbox \"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 6 00:10:24.018387 kubelet[3215]: E0906 00:10:24.018022 3215 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57"
Sep 6 00:10:24.018387 kubelet[3215]: E0906 00:10:24.018088 3215 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57"}
Sep 6 00:10:24.018387 kubelet[3215]: E0906 00:10:24.018165 3215 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"769bd715-8e93-4657-b3c2-d5c87457529b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Sep 6 00:10:24.018387 kubelet[3215]: E0906 00:10:24.018208 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"769bd715-8e93-4657-b3c2-d5c87457529b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7c8bff6bcd-nk2q2" podUID="769bd715-8e93-4657-b3c2-d5c87457529b"
Sep 6 00:10:24.018728 kubelet[3215]: E0906 00:10:24.018669 3215 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f"
Sep 6 00:10:24.018728 kubelet[3215]: E0906 00:10:24.018717 3215 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f"}
Sep 6 00:10:24.018895 containerd[2004]: time="2025-09-06T00:10:24.018438713Z" level=error msg="StopPodSandbox for \"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\" failed" error="failed to destroy network for sandbox \"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 6 00:10:24.018955 kubelet[3215]: E0906 00:10:24.018769 3215 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9733554c-54e5-4004-bf21-b3d55e01bf9c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Sep 6 00:10:24.018955 kubelet[3215]: E0906 00:10:24.018806 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9733554c-54e5-4004-bf21-b3d55e01bf9c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8966f54db-5528p" podUID="9733554c-54e5-4004-bf21-b3d55e01bf9c"
Sep 6 00:10:24.023592 containerd[2004]: time="2025-09-06T00:10:24.023502377Z" level=error msg="StopPodSandbox for \"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\" failed" error="failed to destroy network for sandbox \"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 6 00:10:24.023856 kubelet[3215]: E0906 00:10:24.023801 3215 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55"
Sep 6 00:10:24.023949 kubelet[3215]: E0906 00:10:24.023874 3215 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55"}
Sep 6 00:10:24.023949 kubelet[3215]: E0906 00:10:24.023929 3215 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"44f64470-7851-42aa-8ed3-eb71e8151f7c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Sep 6 00:10:24.024097 kubelet[3215]: E0906 00:10:24.023974 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"44f64470-7851-42aa-8ed3-eb71e8151f7c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-47vdk" podUID="44f64470-7851-42aa-8ed3-eb71e8151f7c"
Sep 6 00:10:31.129942 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1421638326.mount: Deactivated successfully.
Sep 6 00:10:31.180742 containerd[2004]: time="2025-09-06T00:10:31.180655104Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:10:31.183094 containerd[2004]: time="2025-09-06T00:10:31.182807160Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457"
Sep 6 00:10:31.184192 containerd[2004]: time="2025-09-06T00:10:31.184018260Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:10:31.188433 containerd[2004]: time="2025-09-06T00:10:31.188350489Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:10:31.190081 containerd[2004]: time="2025-09-06T00:10:31.189889753Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 8.394547878s"
Sep 6 00:10:31.190081 containerd[2004]: time="2025-09-06T00:10:31.189948109Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\""
Sep 6 00:10:31.231486 containerd[2004]: time="2025-09-06T00:10:31.229434385Z" level=info msg="CreateContainer within sandbox \"1b3d60023b7563f7cf382b011938c1bb2573cac4dbcaba405be6c6d6dd0bbd85\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Sep 6 00:10:31.257820 containerd[2004]: time="2025-09-06T00:10:31.257744557Z" level=info msg="CreateContainer within sandbox \"1b3d60023b7563f7cf382b011938c1bb2573cac4dbcaba405be6c6d6dd0bbd85\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"005ef3f0b61c054443cd364a988c45918cef116c5ba508158a3889d4d2a724de\""
Sep 6 00:10:31.260912 containerd[2004]: time="2025-09-06T00:10:31.260069977Z" level=info msg="StartContainer for \"005ef3f0b61c054443cd364a988c45918cef116c5ba508158a3889d4d2a724de\""
Sep 6 00:10:31.260550 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount44517880.mount: Deactivated successfully.
Sep 6 00:10:31.311466 systemd[1]: Started cri-containerd-005ef3f0b61c054443cd364a988c45918cef116c5ba508158a3889d4d2a724de.scope - libcontainer container 005ef3f0b61c054443cd364a988c45918cef116c5ba508158a3889d4d2a724de.
Sep 6 00:10:31.389348 containerd[2004]: time="2025-09-06T00:10:31.388682762Z" level=info msg="StartContainer for \"005ef3f0b61c054443cd364a988c45918cef116c5ba508158a3889d4d2a724de\" returns successfully"
Sep 6 00:10:31.655430 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information.
Sep 6 00:10:31.655675 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved.
Sep 6 00:10:31.851875 containerd[2004]: time="2025-09-06T00:10:31.851784028Z" level=info msg="StopPodSandbox for \"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\""
Sep 6 00:10:31.962962 kubelet[3215]: I0906 00:10:31.962011 3215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-gt6tt" podStartSLOduration=1.679703168 podStartE2EDuration="20.961986496s" podCreationTimestamp="2025-09-06 00:10:11 +0000 UTC" firstStartedPulling="2025-09-06 00:10:11.909448053 +0000 UTC m=+32.993490729" lastFinishedPulling="2025-09-06 00:10:31.191731381 +0000 UTC m=+52.275774057" observedRunningTime="2025-09-06 00:10:31.95800036 +0000 UTC m=+53.042043072" watchObservedRunningTime="2025-09-06 00:10:31.961986496 +0000 UTC m=+53.046029160"
Sep 6 00:10:32.320819 containerd[2004]: 2025-09-06 00:10:32.108 [INFO][4653] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57"
Sep 6 00:10:32.320819 containerd[2004]: 2025-09-06 00:10:32.108 [INFO][4653] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" iface="eth0" netns="/var/run/netns/cni-cc6fa285-fa63-b781-8156-e7b05a0a10af"
Sep 6 00:10:32.320819 containerd[2004]: 2025-09-06 00:10:32.111 [INFO][4653] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" iface="eth0" netns="/var/run/netns/cni-cc6fa285-fa63-b781-8156-e7b05a0a10af"
Sep 6 00:10:32.320819 containerd[2004]: 2025-09-06 00:10:32.113 [INFO][4653] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" iface="eth0" netns="/var/run/netns/cni-cc6fa285-fa63-b781-8156-e7b05a0a10af"
Sep 6 00:10:32.320819 containerd[2004]: 2025-09-06 00:10:32.113 [INFO][4653] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57"
Sep 6 00:10:32.320819 containerd[2004]: 2025-09-06 00:10:32.113 [INFO][4653] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57"
Sep 6 00:10:32.320819 containerd[2004]: 2025-09-06 00:10:32.284 [INFO][4683] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" HandleID="k8s-pod-network.4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" Workload="ip--172--31--26--146-k8s-whisker--7c8bff6bcd--nk2q2-eth0"
Sep 6 00:10:32.320819 containerd[2004]: 2025-09-06 00:10:32.284 [INFO][4683] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 6 00:10:32.320819 containerd[2004]: 2025-09-06 00:10:32.284 [INFO][4683] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 6 00:10:32.320819 containerd[2004]: 2025-09-06 00:10:32.299 [WARNING][4683] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" HandleID="k8s-pod-network.4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" Workload="ip--172--31--26--146-k8s-whisker--7c8bff6bcd--nk2q2-eth0"
Sep 6 00:10:32.320819 containerd[2004]: 2025-09-06 00:10:32.300 [INFO][4683] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" HandleID="k8s-pod-network.4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" Workload="ip--172--31--26--146-k8s-whisker--7c8bff6bcd--nk2q2-eth0"
Sep 6 00:10:32.320819 containerd[2004]: 2025-09-06 00:10:32.304 [INFO][4683] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 6 00:10:32.320819 containerd[2004]: 2025-09-06 00:10:32.315 [INFO][4653] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57"
Sep 6 00:10:32.324185 containerd[2004]: time="2025-09-06T00:10:32.322326794Z" level=info msg="TearDown network for sandbox \"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\" successfully"
Sep 6 00:10:32.324185 containerd[2004]: time="2025-09-06T00:10:32.322410398Z" level=info msg="StopPodSandbox for \"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\" returns successfully"
Sep 6 00:10:32.328218 systemd[1]: run-netns-cni\x2dcc6fa285\x2dfa63\x2db781\x2d8156\x2de7b05a0a10af.mount: Deactivated successfully.
Sep 6 00:10:32.417851 kubelet[3215]: I0906 00:10:32.417782 3215 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxr8l\" (UniqueName: \"kubernetes.io/projected/769bd715-8e93-4657-b3c2-d5c87457529b-kube-api-access-jxr8l\") pod \"769bd715-8e93-4657-b3c2-d5c87457529b\" (UID: \"769bd715-8e93-4657-b3c2-d5c87457529b\") "
Sep 6 00:10:32.418103 kubelet[3215]: I0906 00:10:32.417891 3215 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/769bd715-8e93-4657-b3c2-d5c87457529b-whisker-ca-bundle\") pod \"769bd715-8e93-4657-b3c2-d5c87457529b\" (UID: \"769bd715-8e93-4657-b3c2-d5c87457529b\") "
Sep 6 00:10:32.418103 kubelet[3215]: I0906 00:10:32.417940 3215 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/769bd715-8e93-4657-b3c2-d5c87457529b-whisker-backend-key-pair\") pod \"769bd715-8e93-4657-b3c2-d5c87457529b\" (UID: \"769bd715-8e93-4657-b3c2-d5c87457529b\") "
Sep 6 00:10:32.424203 kubelet[3215]: I0906 00:10:32.423052 3215 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/769bd715-8e93-4657-b3c2-d5c87457529b-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "769bd715-8e93-4657-b3c2-d5c87457529b" (UID: "769bd715-8e93-4657-b3c2-d5c87457529b"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Sep 6 00:10:32.430174 kubelet[3215]: I0906 00:10:32.428238 3215 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/769bd715-8e93-4657-b3c2-d5c87457529b-kube-api-access-jxr8l" (OuterVolumeSpecName: "kube-api-access-jxr8l") pod "769bd715-8e93-4657-b3c2-d5c87457529b" (UID: "769bd715-8e93-4657-b3c2-d5c87457529b"). InnerVolumeSpecName "kube-api-access-jxr8l". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Sep 6 00:10:32.430618 systemd[1]: var-lib-kubelet-pods-769bd715\x2d8e93\x2d4657\x2db3c2\x2dd5c87457529b-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2djxr8l.mount: Deactivated successfully.
Sep 6 00:10:32.436609 kubelet[3215]: I0906 00:10:32.436514 3215 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/769bd715-8e93-4657-b3c2-d5c87457529b-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "769bd715-8e93-4657-b3c2-d5c87457529b" (UID: "769bd715-8e93-4657-b3c2-d5c87457529b"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Sep 6 00:10:32.439079 systemd[1]: var-lib-kubelet-pods-769bd715\x2d8e93\x2d4657\x2db3c2\x2dd5c87457529b-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully.
Sep 6 00:10:32.519559 kubelet[3215]: I0906 00:10:32.519423 3215 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jxr8l\" (UniqueName: \"kubernetes.io/projected/769bd715-8e93-4657-b3c2-d5c87457529b-kube-api-access-jxr8l\") on node \"ip-172-31-26-146\" DevicePath \"\""
Sep 6 00:10:32.519559 kubelet[3215]: I0906 00:10:32.519475 3215 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/769bd715-8e93-4657-b3c2-d5c87457529b-whisker-ca-bundle\") on node \"ip-172-31-26-146\" DevicePath \"\""
Sep 6 00:10:32.519559 kubelet[3215]: I0906 00:10:32.519499 3215 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/769bd715-8e93-4657-b3c2-d5c87457529b-whisker-backend-key-pair\") on node \"ip-172-31-26-146\" DevicePath \"\""
Sep 6 00:10:32.923510 systemd[1]: Removed slice kubepods-besteffort-pod769bd715_8e93_4657_b3c2_d5c87457529b.slice - libcontainer container kubepods-besteffort-pod769bd715_8e93_4657_b3c2_d5c87457529b.slice.
Sep 6 00:10:33.051513 systemd[1]: Created slice kubepods-besteffort-pod50c13068_44d2_4d43_86c0_28af1e28a898.slice - libcontainer container kubepods-besteffort-pod50c13068_44d2_4d43_86c0_28af1e28a898.slice.
Sep 6 00:10:33.123424 kubelet[3215]: I0906 00:10:33.123365 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b7xq\" (UniqueName: \"kubernetes.io/projected/50c13068-44d2-4d43-86c0-28af1e28a898-kube-api-access-9b7xq\") pod \"whisker-b7dd89d5-zqd7v\" (UID: \"50c13068-44d2-4d43-86c0-28af1e28a898\") " pod="calico-system/whisker-b7dd89d5-zqd7v"
Sep 6 00:10:33.124639 kubelet[3215]: I0906 00:10:33.124328 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/50c13068-44d2-4d43-86c0-28af1e28a898-whisker-backend-key-pair\") pod \"whisker-b7dd89d5-zqd7v\" (UID: \"50c13068-44d2-4d43-86c0-28af1e28a898\") " pod="calico-system/whisker-b7dd89d5-zqd7v"
Sep 6 00:10:33.124639 kubelet[3215]: I0906 00:10:33.124486 3215 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50c13068-44d2-4d43-86c0-28af1e28a898-whisker-ca-bundle\") pod \"whisker-b7dd89d5-zqd7v\" (UID: \"50c13068-44d2-4d43-86c0-28af1e28a898\") " pod="calico-system/whisker-b7dd89d5-zqd7v"
Sep 6 00:10:33.362720 containerd[2004]: time="2025-09-06T00:10:33.362644755Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-b7dd89d5-zqd7v,Uid:50c13068-44d2-4d43-86c0-28af1e28a898,Namespace:calico-system,Attempt:0,}"
Sep 6 00:10:33.416007 kubelet[3215]: I0906 00:10:33.415650 3215 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="769bd715-8e93-4657-b3c2-d5c87457529b" path="/var/lib/kubelet/pods/769bd715-8e93-4657-b3c2-d5c87457529b/volumes"
Sep 6 00:10:33.586546 (udev-worker)[4638]: Network interface NamePolicy= disabled on kernel command line.
Sep 6 00:10:33.589972 systemd-networkd[1845]: calif0eb8781880: Link UP
Sep 6 00:10:33.591554 systemd-networkd[1845]: calif0eb8781880: Gained carrier
Sep 6 00:10:33.623900 containerd[2004]: 2025-09-06 00:10:33.431 [INFO][4729] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Sep 6 00:10:33.623900 containerd[2004]: 2025-09-06 00:10:33.458 [INFO][4729] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--26--146-k8s-whisker--b7dd89d5--zqd7v-eth0 whisker-b7dd89d5- calico-system 50c13068-44d2-4d43-86c0-28af1e28a898 917 0 2025-09-06 00:10:33 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:b7dd89d5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ip-172-31-26-146 whisker-b7dd89d5-zqd7v eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calif0eb8781880 [] [] }} ContainerID="89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291" Namespace="calico-system" Pod="whisker-b7dd89d5-zqd7v" WorkloadEndpoint="ip--172--31--26--146-k8s-whisker--b7dd89d5--zqd7v-"
Sep 6 00:10:33.623900 containerd[2004]: 2025-09-06 00:10:33.458 [INFO][4729] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291" Namespace="calico-system" Pod="whisker-b7dd89d5-zqd7v" WorkloadEndpoint="ip--172--31--26--146-k8s-whisker--b7dd89d5--zqd7v-eth0"
Sep 6 00:10:33.623900 containerd[2004]: 2025-09-06 00:10:33.512 [INFO][4742] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291" HandleID="k8s-pod-network.89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291" Workload="ip--172--31--26--146-k8s-whisker--b7dd89d5--zqd7v-eth0"
Sep 6 00:10:33.623900 containerd[2004]: 2025-09-06 00:10:33.513 [INFO][4742] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291" HandleID="k8s-pod-network.89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291" Workload="ip--172--31--26--146-k8s-whisker--b7dd89d5--zqd7v-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3aa0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-26-146", "pod":"whisker-b7dd89d5-zqd7v", "timestamp":"2025-09-06 00:10:33.512895676 +0000 UTC"}, Hostname:"ip-172-31-26-146", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Sep 6 00:10:33.623900 containerd[2004]: 2025-09-06 00:10:33.513 [INFO][4742] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 6 00:10:33.623900 containerd[2004]: 2025-09-06 00:10:33.513 [INFO][4742] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 6 00:10:33.623900 containerd[2004]: 2025-09-06 00:10:33.513 [INFO][4742] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-26-146'
Sep 6 00:10:33.623900 containerd[2004]: 2025-09-06 00:10:33.529 [INFO][4742] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291" host="ip-172-31-26-146"
Sep 6 00:10:33.623900 containerd[2004]: 2025-09-06 00:10:33.536 [INFO][4742] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-26-146"
Sep 6 00:10:33.623900 containerd[2004]: 2025-09-06 00:10:33.543 [INFO][4742] ipam/ipam.go 511: Trying affinity for 192.168.57.64/26 host="ip-172-31-26-146"
Sep 6 00:10:33.623900 containerd[2004]: 2025-09-06 00:10:33.546 [INFO][4742] ipam/ipam.go 158: Attempting to load block cidr=192.168.57.64/26 host="ip-172-31-26-146"
Sep 6 00:10:33.623900 containerd[2004]: 2025-09-06 00:10:33.550 [INFO][4742] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.57.64/26 host="ip-172-31-26-146"
Sep 6 00:10:33.623900 containerd[2004]: 2025-09-06 00:10:33.550 [INFO][4742] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.57.64/26 handle="k8s-pod-network.89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291" host="ip-172-31-26-146"
Sep 6 00:10:33.623900 containerd[2004]: 2025-09-06 00:10:33.552 [INFO][4742] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291
Sep 6 00:10:33.623900 containerd[2004]: 2025-09-06 00:10:33.559 [INFO][4742] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.57.64/26 handle="k8s-pod-network.89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291" host="ip-172-31-26-146"
Sep 6 00:10:33.623900 containerd[2004]: 2025-09-06 00:10:33.569 [INFO][4742] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.57.65/26] block=192.168.57.64/26 handle="k8s-pod-network.89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291" host="ip-172-31-26-146"
Sep 6 00:10:33.623900 containerd[2004]: 2025-09-06 00:10:33.570 [INFO][4742] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.57.65/26] handle="k8s-pod-network.89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291" host="ip-172-31-26-146"
Sep 6 00:10:33.623900 containerd[2004]: 2025-09-06 00:10:33.571 [INFO][4742] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 6 00:10:33.623900 containerd[2004]: 2025-09-06 00:10:33.571 [INFO][4742] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.57.65/26] IPv6=[] ContainerID="89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291" HandleID="k8s-pod-network.89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291" Workload="ip--172--31--26--146-k8s-whisker--b7dd89d5--zqd7v-eth0"
Sep 6 00:10:33.626543 containerd[2004]: 2025-09-06 00:10:33.574 [INFO][4729] cni-plugin/k8s.go 418: Populated endpoint ContainerID="89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291" Namespace="calico-system" Pod="whisker-b7dd89d5-zqd7v" WorkloadEndpoint="ip--172--31--26--146-k8s-whisker--b7dd89d5--zqd7v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-whisker--b7dd89d5--zqd7v-eth0", GenerateName:"whisker-b7dd89d5-", Namespace:"calico-system", SelfLink:"", UID:"50c13068-44d2-4d43-86c0-28af1e28a898", ResourceVersion:"917", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 33, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"b7dd89d5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"", Pod:"whisker-b7dd89d5-zqd7v", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.57.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calif0eb8781880", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 6 00:10:33.626543 containerd[2004]: 2025-09-06 00:10:33.574 [INFO][4729] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.57.65/32] ContainerID="89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291" Namespace="calico-system" Pod="whisker-b7dd89d5-zqd7v" WorkloadEndpoint="ip--172--31--26--146-k8s-whisker--b7dd89d5--zqd7v-eth0"
Sep 6 00:10:33.626543 containerd[2004]: 2025-09-06 00:10:33.574 [INFO][4729] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif0eb8781880 ContainerID="89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291" Namespace="calico-system" Pod="whisker-b7dd89d5-zqd7v" WorkloadEndpoint="ip--172--31--26--146-k8s-whisker--b7dd89d5--zqd7v-eth0"
Sep 6 00:10:33.626543 containerd[2004]: 2025-09-06 00:10:33.593 [INFO][4729] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291" Namespace="calico-system" Pod="whisker-b7dd89d5-zqd7v" WorkloadEndpoint="ip--172--31--26--146-k8s-whisker--b7dd89d5--zqd7v-eth0"
Sep 6 00:10:33.626543 containerd[2004]: 2025-09-06 00:10:33.593 [INFO][4729] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291" Namespace="calico-system" Pod="whisker-b7dd89d5-zqd7v" WorkloadEndpoint="ip--172--31--26--146-k8s-whisker--b7dd89d5--zqd7v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-whisker--b7dd89d5--zqd7v-eth0", GenerateName:"whisker-b7dd89d5-", Namespace:"calico-system", SelfLink:"", UID:"50c13068-44d2-4d43-86c0-28af1e28a898", ResourceVersion:"917", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 33, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"b7dd89d5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291", Pod:"whisker-b7dd89d5-zqd7v", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.57.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calif0eb8781880", MAC:"12:93:94:eb:9c:08", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 6 00:10:33.626543 containerd[2004]: 2025-09-06 00:10:33.619 [INFO][4729] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291" Namespace="calico-system" Pod="whisker-b7dd89d5-zqd7v" WorkloadEndpoint="ip--172--31--26--146-k8s-whisker--b7dd89d5--zqd7v-eth0"
Sep 6 00:10:33.665482 containerd[2004]: time="2025-09-06T00:10:33.665287685Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 6 00:10:33.666578 containerd[2004]: time="2025-09-06T00:10:33.665408357Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 6 00:10:33.668523 containerd[2004]: time="2025-09-06T00:10:33.668208833Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 6 00:10:33.668523 containerd[2004]: time="2025-09-06T00:10:33.668446013Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 6 00:10:33.734382 systemd[1]: Started cri-containerd-89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291.scope - libcontainer container 89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291.
Sep 6 00:10:33.967936 containerd[2004]: time="2025-09-06T00:10:33.967133874Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-b7dd89d5-zqd7v,Uid:50c13068-44d2-4d43-86c0-28af1e28a898,Namespace:calico-system,Attempt:0,} returns sandbox id \"89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291\""
Sep 6 00:10:33.977760 containerd[2004]: time="2025-09-06T00:10:33.977591550Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\""
Sep 6 00:10:34.765277 kernel: bpftool[4913]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set
Sep 6 00:10:34.865333 systemd-networkd[1845]: calif0eb8781880: Gained IPv6LL
Sep 6 00:10:35.097861 systemd-networkd[1845]: vxlan.calico: Link UP
Sep 6 00:10:35.097884 systemd-networkd[1845]: vxlan.calico: Gained carrier
Sep 6 00:10:35.160200 (udev-worker)[4636]: Network interface NamePolicy= disabled on kernel command line.
Sep 6 00:10:35.414690 containerd[2004]: time="2025-09-06T00:10:35.411644921Z" level=info msg="StopPodSandbox for \"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\"" Sep 6 00:10:35.707623 containerd[2004]: 2025-09-06 00:10:35.575 [INFO][4966] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Sep 6 00:10:35.707623 containerd[2004]: 2025-09-06 00:10:35.575 [INFO][4966] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" iface="eth0" netns="/var/run/netns/cni-f722aa30-e9aa-95d0-4d8c-d8eaeada4422" Sep 6 00:10:35.707623 containerd[2004]: 2025-09-06 00:10:35.576 [INFO][4966] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" iface="eth0" netns="/var/run/netns/cni-f722aa30-e9aa-95d0-4d8c-d8eaeada4422" Sep 6 00:10:35.707623 containerd[2004]: 2025-09-06 00:10:35.576 [INFO][4966] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" iface="eth0" netns="/var/run/netns/cni-f722aa30-e9aa-95d0-4d8c-d8eaeada4422" Sep 6 00:10:35.707623 containerd[2004]: 2025-09-06 00:10:35.577 [INFO][4966] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Sep 6 00:10:35.707623 containerd[2004]: 2025-09-06 00:10:35.577 [INFO][4966] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Sep 6 00:10:35.707623 containerd[2004]: 2025-09-06 00:10:35.662 [INFO][4980] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" HandleID="k8s-pod-network.f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0" Sep 6 00:10:35.707623 containerd[2004]: 2025-09-06 00:10:35.663 [INFO][4980] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:10:35.707623 containerd[2004]: 2025-09-06 00:10:35.663 [INFO][4980] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:10:35.707623 containerd[2004]: 2025-09-06 00:10:35.689 [WARNING][4980] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" HandleID="k8s-pod-network.f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0" Sep 6 00:10:35.707623 containerd[2004]: 2025-09-06 00:10:35.689 [INFO][4980] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" HandleID="k8s-pod-network.f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0" Sep 6 00:10:35.707623 containerd[2004]: 2025-09-06 00:10:35.694 [INFO][4980] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:10:35.707623 containerd[2004]: 2025-09-06 00:10:35.699 [INFO][4966] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Sep 6 00:10:35.716977 systemd[1]: run-netns-cni\x2df722aa30\x2de9aa\x2d95d0\x2d4d8c\x2dd8eaeada4422.mount: Deactivated successfully. 
Sep 6 00:10:35.763060 containerd[2004]: time="2025-09-06T00:10:35.762988831Z" level=info msg="TearDown network for sandbox \"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\" successfully" Sep 6 00:10:35.763060 containerd[2004]: time="2025-09-06T00:10:35.763047691Z" level=info msg="StopPodSandbox for \"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\" returns successfully" Sep 6 00:10:35.765369 containerd[2004]: time="2025-09-06T00:10:35.764456971Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-wrl6g,Uid:e1a579ff-cf0a-431d-b0c4-8c9feaa7cd01,Namespace:kube-system,Attempt:1,}" Sep 6 00:10:36.182862 systemd-networkd[1845]: cali0c2f9736f76: Link UP Sep 6 00:10:36.183262 systemd-networkd[1845]: cali0c2f9736f76: Gained carrier Sep 6 00:10:36.222739 containerd[2004]: 2025-09-06 00:10:35.915 [INFO][4991] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0 coredns-668d6bf9bc- kube-system e1a579ff-cf0a-431d-b0c4-8c9feaa7cd01 928 0 2025-09-06 00:09:45 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-26-146 coredns-668d6bf9bc-wrl6g eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali0c2f9736f76 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00" Namespace="kube-system" Pod="coredns-668d6bf9bc-wrl6g" WorkloadEndpoint="ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-" Sep 6 00:10:36.222739 containerd[2004]: 2025-09-06 00:10:35.916 [INFO][4991] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00" Namespace="kube-system" Pod="coredns-668d6bf9bc-wrl6g" 
WorkloadEndpoint="ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0" Sep 6 00:10:36.222739 containerd[2004]: 2025-09-06 00:10:35.995 [INFO][5013] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00" HandleID="k8s-pod-network.946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0" Sep 6 00:10:36.222739 containerd[2004]: 2025-09-06 00:10:35.995 [INFO][5013] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00" HandleID="k8s-pod-network.946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400038cd60), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-26-146", "pod":"coredns-668d6bf9bc-wrl6g", "timestamp":"2025-09-06 00:10:35.995520764 +0000 UTC"}, Hostname:"ip-172-31-26-146", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 00:10:36.222739 containerd[2004]: 2025-09-06 00:10:35.995 [INFO][5013] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:10:36.222739 containerd[2004]: 2025-09-06 00:10:35.995 [INFO][5013] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 6 00:10:36.222739 containerd[2004]: 2025-09-06 00:10:35.996 [INFO][5013] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-26-146' Sep 6 00:10:36.222739 containerd[2004]: 2025-09-06 00:10:36.030 [INFO][5013] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00" host="ip-172-31-26-146" Sep 6 00:10:36.222739 containerd[2004]: 2025-09-06 00:10:36.040 [INFO][5013] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-26-146" Sep 6 00:10:36.222739 containerd[2004]: 2025-09-06 00:10:36.129 [INFO][5013] ipam/ipam.go 511: Trying affinity for 192.168.57.64/26 host="ip-172-31-26-146" Sep 6 00:10:36.222739 containerd[2004]: 2025-09-06 00:10:36.135 [INFO][5013] ipam/ipam.go 158: Attempting to load block cidr=192.168.57.64/26 host="ip-172-31-26-146" Sep 6 00:10:36.222739 containerd[2004]: 2025-09-06 00:10:36.140 [INFO][5013] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.57.64/26 host="ip-172-31-26-146" Sep 6 00:10:36.222739 containerd[2004]: 2025-09-06 00:10:36.140 [INFO][5013] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.57.64/26 handle="k8s-pod-network.946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00" host="ip-172-31-26-146" Sep 6 00:10:36.222739 containerd[2004]: 2025-09-06 00:10:36.145 [INFO][5013] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00 Sep 6 00:10:36.222739 containerd[2004]: 2025-09-06 00:10:36.154 [INFO][5013] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.57.64/26 handle="k8s-pod-network.946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00" host="ip-172-31-26-146" Sep 6 00:10:36.222739 containerd[2004]: 2025-09-06 00:10:36.167 [INFO][5013] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.57.66/26] block=192.168.57.64/26 
handle="k8s-pod-network.946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00" host="ip-172-31-26-146" Sep 6 00:10:36.222739 containerd[2004]: 2025-09-06 00:10:36.167 [INFO][5013] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.57.66/26] handle="k8s-pod-network.946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00" host="ip-172-31-26-146" Sep 6 00:10:36.222739 containerd[2004]: 2025-09-06 00:10:36.168 [INFO][5013] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:10:36.222739 containerd[2004]: 2025-09-06 00:10:36.168 [INFO][5013] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.57.66/26] IPv6=[] ContainerID="946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00" HandleID="k8s-pod-network.946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0" Sep 6 00:10:36.227755 containerd[2004]: 2025-09-06 00:10:36.174 [INFO][4991] cni-plugin/k8s.go 418: Populated endpoint ContainerID="946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00" Namespace="kube-system" Pod="coredns-668d6bf9bc-wrl6g" WorkloadEndpoint="ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e1a579ff-cf0a-431d-b0c4-8c9feaa7cd01", ResourceVersion:"928", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 9, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"", Pod:"coredns-668d6bf9bc-wrl6g", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.57.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0c2f9736f76", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:10:36.227755 containerd[2004]: 2025-09-06 00:10:36.175 [INFO][4991] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.57.66/32] ContainerID="946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00" Namespace="kube-system" Pod="coredns-668d6bf9bc-wrl6g" WorkloadEndpoint="ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0" Sep 6 00:10:36.227755 containerd[2004]: 2025-09-06 00:10:36.175 [INFO][4991] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0c2f9736f76 ContainerID="946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00" Namespace="kube-system" Pod="coredns-668d6bf9bc-wrl6g" WorkloadEndpoint="ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0" Sep 6 00:10:36.227755 containerd[2004]: 2025-09-06 00:10:36.184 [INFO][4991] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-wrl6g" WorkloadEndpoint="ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0" Sep 6 00:10:36.227755 containerd[2004]: 2025-09-06 00:10:36.185 [INFO][4991] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00" Namespace="kube-system" Pod="coredns-668d6bf9bc-wrl6g" WorkloadEndpoint="ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e1a579ff-cf0a-431d-b0c4-8c9feaa7cd01", ResourceVersion:"928", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 9, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00", Pod:"coredns-668d6bf9bc-wrl6g", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.57.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0c2f9736f76", MAC:"d2:a0:76:47:fc:86", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:10:36.227755 containerd[2004]: 2025-09-06 00:10:36.209 [INFO][4991] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00" Namespace="kube-system" Pod="coredns-668d6bf9bc-wrl6g" WorkloadEndpoint="ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0" Sep 6 00:10:36.289692 containerd[2004]: time="2025-09-06T00:10:36.285579162Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 6 00:10:36.289692 containerd[2004]: time="2025-09-06T00:10:36.285678570Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 6 00:10:36.289692 containerd[2004]: time="2025-09-06T00:10:36.285723390Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:10:36.289692 containerd[2004]: time="2025-09-06T00:10:36.285889158Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:10:36.346639 systemd[1]: Started cri-containerd-946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00.scope - libcontainer container 946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00. 
Sep 6 00:10:36.412252 containerd[2004]: time="2025-09-06T00:10:36.411589386Z" level=info msg="StopPodSandbox for \"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\"" Sep 6 00:10:36.412252 containerd[2004]: time="2025-09-06T00:10:36.411817494Z" level=info msg="StopPodSandbox for \"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\"" Sep 6 00:10:36.414257 containerd[2004]: time="2025-09-06T00:10:36.413639814Z" level=info msg="StopPodSandbox for \"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\"" Sep 6 00:10:36.421785 containerd[2004]: time="2025-09-06T00:10:36.414815778Z" level=info msg="StopPodSandbox for \"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\"" Sep 6 00:10:36.466978 systemd-networkd[1845]: vxlan.calico: Gained IPv6LL Sep 6 00:10:36.527131 containerd[2004]: time="2025-09-06T00:10:36.527052499Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-wrl6g,Uid:e1a579ff-cf0a-431d-b0c4-8c9feaa7cd01,Namespace:kube-system,Attempt:1,} returns sandbox id \"946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00\"" Sep 6 00:10:36.566180 containerd[2004]: time="2025-09-06T00:10:36.564611947Z" level=info msg="CreateContainer within sandbox \"946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 6 00:10:36.614467 containerd[2004]: time="2025-09-06T00:10:36.614408023Z" level=info msg="CreateContainer within sandbox \"946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9bcf26afda0698a0a8070dfd70c6f7a7e2b48e6aa9f810fec5fbb58157141bb1\"" Sep 6 00:10:36.617185 containerd[2004]: time="2025-09-06T00:10:36.615823411Z" level=info msg="StartContainer for \"9bcf26afda0698a0a8070dfd70c6f7a7e2b48e6aa9f810fec5fbb58157141bb1\"" Sep 6 00:10:36.876087 systemd[1]: Started 
cri-containerd-9bcf26afda0698a0a8070dfd70c6f7a7e2b48e6aa9f810fec5fbb58157141bb1.scope - libcontainer container 9bcf26afda0698a0a8070dfd70c6f7a7e2b48e6aa9f810fec5fbb58157141bb1. Sep 6 00:10:37.052438 containerd[2004]: 2025-09-06 00:10:36.716 [INFO][5119] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Sep 6 00:10:37.052438 containerd[2004]: 2025-09-06 00:10:36.718 [INFO][5119] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" iface="eth0" netns="/var/run/netns/cni-de261a9d-6fcb-184c-1ade-4e8a83c467c7" Sep 6 00:10:37.052438 containerd[2004]: 2025-09-06 00:10:36.718 [INFO][5119] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" iface="eth0" netns="/var/run/netns/cni-de261a9d-6fcb-184c-1ade-4e8a83c467c7" Sep 6 00:10:37.052438 containerd[2004]: 2025-09-06 00:10:36.718 [INFO][5119] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" iface="eth0" netns="/var/run/netns/cni-de261a9d-6fcb-184c-1ade-4e8a83c467c7" Sep 6 00:10:37.052438 containerd[2004]: 2025-09-06 00:10:36.718 [INFO][5119] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Sep 6 00:10:37.052438 containerd[2004]: 2025-09-06 00:10:36.719 [INFO][5119] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Sep 6 00:10:37.052438 containerd[2004]: 2025-09-06 00:10:36.982 [INFO][5151] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" HandleID="k8s-pod-network.635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Workload="ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0" Sep 6 00:10:37.052438 containerd[2004]: 2025-09-06 00:10:36.988 [INFO][5151] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:10:37.052438 containerd[2004]: 2025-09-06 00:10:36.988 [INFO][5151] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:10:37.052438 containerd[2004]: 2025-09-06 00:10:37.029 [WARNING][5151] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" HandleID="k8s-pod-network.635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Workload="ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0" Sep 6 00:10:37.052438 containerd[2004]: 2025-09-06 00:10:37.030 [INFO][5151] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" HandleID="k8s-pod-network.635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Workload="ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0" Sep 6 00:10:37.052438 containerd[2004]: 2025-09-06 00:10:37.033 [INFO][5151] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:10:37.052438 containerd[2004]: 2025-09-06 00:10:37.047 [INFO][5119] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Sep 6 00:10:37.054893 containerd[2004]: time="2025-09-06T00:10:37.054346398Z" level=info msg="TearDown network for sandbox \"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\" successfully" Sep 6 00:10:37.054893 containerd[2004]: time="2025-09-06T00:10:37.054392610Z" level=info msg="StopPodSandbox for \"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\" returns successfully" Sep 6 00:10:37.061490 systemd[1]: run-netns-cni\x2dde261a9d\x2d6fcb\x2d184c\x2d1ade\x2d4e8a83c467c7.mount: Deactivated successfully. 
Sep 6 00:10:37.087235 containerd[2004]: time="2025-09-06T00:10:37.086722614Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-fdn4p,Uid:b640f503-9253-4063-8780-3c12db4c2e29,Namespace:calico-system,Attempt:1,}" Sep 6 00:10:37.097990 containerd[2004]: time="2025-09-06T00:10:37.097854510Z" level=info msg="StartContainer for \"9bcf26afda0698a0a8070dfd70c6f7a7e2b48e6aa9f810fec5fbb58157141bb1\" returns successfully" Sep 6 00:10:37.199472 containerd[2004]: 2025-09-06 00:10:36.909 [INFO][5129] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Sep 6 00:10:37.199472 containerd[2004]: 2025-09-06 00:10:36.912 [INFO][5129] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" iface="eth0" netns="/var/run/netns/cni-2739b9d8-ab38-4d32-6e99-88a2f91ef30f" Sep 6 00:10:37.199472 containerd[2004]: 2025-09-06 00:10:36.914 [INFO][5129] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" iface="eth0" netns="/var/run/netns/cni-2739b9d8-ab38-4d32-6e99-88a2f91ef30f" Sep 6 00:10:37.199472 containerd[2004]: 2025-09-06 00:10:36.915 [INFO][5129] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" iface="eth0" netns="/var/run/netns/cni-2739b9d8-ab38-4d32-6e99-88a2f91ef30f" Sep 6 00:10:37.199472 containerd[2004]: 2025-09-06 00:10:36.916 [INFO][5129] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Sep 6 00:10:37.199472 containerd[2004]: 2025-09-06 00:10:36.916 [INFO][5129] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Sep 6 00:10:37.199472 containerd[2004]: 2025-09-06 00:10:37.143 [INFO][5183] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" HandleID="k8s-pod-network.4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0" Sep 6 00:10:37.199472 containerd[2004]: 2025-09-06 00:10:37.147 [INFO][5183] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:10:37.199472 containerd[2004]: 2025-09-06 00:10:37.147 [INFO][5183] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:10:37.199472 containerd[2004]: 2025-09-06 00:10:37.178 [WARNING][5183] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" HandleID="k8s-pod-network.4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0" Sep 6 00:10:37.199472 containerd[2004]: 2025-09-06 00:10:37.179 [INFO][5183] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" HandleID="k8s-pod-network.4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0" Sep 6 00:10:37.199472 containerd[2004]: 2025-09-06 00:10:37.187 [INFO][5183] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:10:37.199472 containerd[2004]: 2025-09-06 00:10:37.195 [INFO][5129] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Sep 6 00:10:37.206210 containerd[2004]: time="2025-09-06T00:10:37.203391054Z" level=info msg="TearDown network for sandbox \"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\" successfully" Sep 6 00:10:37.206210 containerd[2004]: time="2025-09-06T00:10:37.203442762Z" level=info msg="StopPodSandbox for \"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\" returns successfully" Sep 6 00:10:37.206210 containerd[2004]: time="2025-09-06T00:10:37.204269250Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8966f54db-8hj4r,Uid:6141a3f9-cce3-4c89-a723-17bc0b75f562,Namespace:calico-apiserver,Attempt:1,}" Sep 6 00:10:37.234077 containerd[2004]: 2025-09-06 00:10:36.872 [INFO][5117] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Sep 6 00:10:37.234077 containerd[2004]: 2025-09-06 00:10:36.891 [INFO][5117] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" iface="eth0" netns="/var/run/netns/cni-cf10f127-3f5b-7d7d-a81b-01291a08a83c" Sep 6 00:10:37.234077 containerd[2004]: 2025-09-06 00:10:36.892 [INFO][5117] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" iface="eth0" netns="/var/run/netns/cni-cf10f127-3f5b-7d7d-a81b-01291a08a83c" Sep 6 00:10:37.234077 containerd[2004]: 2025-09-06 00:10:36.900 [INFO][5117] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" iface="eth0" netns="/var/run/netns/cni-cf10f127-3f5b-7d7d-a81b-01291a08a83c" Sep 6 00:10:37.234077 containerd[2004]: 2025-09-06 00:10:36.901 [INFO][5117] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Sep 6 00:10:37.234077 containerd[2004]: 2025-09-06 00:10:36.901 [INFO][5117] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Sep 6 00:10:37.234077 containerd[2004]: 2025-09-06 00:10:37.165 [INFO][5175] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" HandleID="k8s-pod-network.e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0" Sep 6 00:10:37.234077 containerd[2004]: 2025-09-06 00:10:37.166 [INFO][5175] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:10:37.234077 containerd[2004]: 2025-09-06 00:10:37.187 [INFO][5175] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:10:37.234077 containerd[2004]: 2025-09-06 00:10:37.213 [WARNING][5175] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" HandleID="k8s-pod-network.e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0" Sep 6 00:10:37.234077 containerd[2004]: 2025-09-06 00:10:37.213 [INFO][5175] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" HandleID="k8s-pod-network.e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0" Sep 6 00:10:37.234077 containerd[2004]: 2025-09-06 00:10:37.217 [INFO][5175] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:10:37.234077 containerd[2004]: 2025-09-06 00:10:37.226 [INFO][5117] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Sep 6 00:10:37.240994 containerd[2004]: time="2025-09-06T00:10:37.240841867Z" level=info msg="TearDown network for sandbox \"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\" successfully" Sep 6 00:10:37.241236 containerd[2004]: time="2025-09-06T00:10:37.241201111Z" level=info msg="StopPodSandbox for \"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\" returns successfully" Sep 6 00:10:37.242543 containerd[2004]: time="2025-09-06T00:10:37.242456743Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:37.244960 containerd[2004]: time="2025-09-06T00:10:37.244889059Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 6 00:10:37.248568 containerd[2004]: time="2025-09-06T00:10:37.248232439Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-8966f54db-5528p,Uid:9733554c-54e5-4004-bf21-b3d55e01bf9c,Namespace:calico-apiserver,Attempt:1,}" Sep 6 00:10:37.262295 containerd[2004]: time="2025-09-06T00:10:37.262231003Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:37.292086 containerd[2004]: 2025-09-06 00:10:36.928 [INFO][5128] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Sep 6 00:10:37.292086 containerd[2004]: 2025-09-06 00:10:36.930 [INFO][5128] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" iface="eth0" netns="/var/run/netns/cni-60dd3f23-a414-b05b-e3ca-3ac593bdc6f6" Sep 6 00:10:37.292086 containerd[2004]: 2025-09-06 00:10:36.931 [INFO][5128] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" iface="eth0" netns="/var/run/netns/cni-60dd3f23-a414-b05b-e3ca-3ac593bdc6f6" Sep 6 00:10:37.292086 containerd[2004]: 2025-09-06 00:10:36.932 [INFO][5128] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" iface="eth0" netns="/var/run/netns/cni-60dd3f23-a414-b05b-e3ca-3ac593bdc6f6" Sep 6 00:10:37.292086 containerd[2004]: 2025-09-06 00:10:36.933 [INFO][5128] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Sep 6 00:10:37.292086 containerd[2004]: 2025-09-06 00:10:36.933 [INFO][5128] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Sep 6 00:10:37.292086 containerd[2004]: 2025-09-06 00:10:37.194 [INFO][5188] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" HandleID="k8s-pod-network.43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Workload="ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0" Sep 6 00:10:37.292086 containerd[2004]: 2025-09-06 00:10:37.194 [INFO][5188] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:10:37.292086 containerd[2004]: 2025-09-06 00:10:37.217 [INFO][5188] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:10:37.292086 containerd[2004]: 2025-09-06 00:10:37.254 [WARNING][5188] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" HandleID="k8s-pod-network.43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Workload="ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0" Sep 6 00:10:37.292086 containerd[2004]: 2025-09-06 00:10:37.254 [INFO][5188] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" HandleID="k8s-pod-network.43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Workload="ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0" Sep 6 00:10:37.292086 containerd[2004]: 2025-09-06 00:10:37.263 [INFO][5188] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:10:37.292086 containerd[2004]: 2025-09-06 00:10:37.275 [INFO][5128] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Sep 6 00:10:37.294051 containerd[2004]: time="2025-09-06T00:10:37.293333071Z" level=info msg="TearDown network for sandbox \"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\" successfully" Sep 6 00:10:37.294051 containerd[2004]: time="2025-09-06T00:10:37.293396887Z" level=info msg="StopPodSandbox for \"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\" returns successfully" Sep 6 00:10:37.297025 containerd[2004]: time="2025-09-06T00:10:37.296604895Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5cf8b65958-8qctx,Uid:aeae33f4-c2c1-4f17-a617-26feec2636ec,Namespace:calico-system,Attempt:1,}" Sep 6 00:10:37.299220 systemd-networkd[1845]: cali0c2f9736f76: Gained IPv6LL Sep 6 00:10:37.321859 containerd[2004]: time="2025-09-06T00:10:37.321781987Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:37.347373 containerd[2004]: time="2025-09-06T00:10:37.345108355Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 3.367451297s" Sep 6 00:10:37.349119 containerd[2004]: time="2025-09-06T00:10:37.347338003Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 6 00:10:37.383473 containerd[2004]: time="2025-09-06T00:10:37.383374279Z" level=info msg="CreateContainer within sandbox \"89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 6 00:10:37.432843 containerd[2004]: time="2025-09-06T00:10:37.431870792Z" level=info msg="StopPodSandbox for \"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\"" Sep 6 00:10:37.541178 containerd[2004]: time="2025-09-06T00:10:37.540639044Z" level=info msg="CreateContainer within sandbox \"89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"47945789c9f6a9e700a10e64b244657adab589e58f477c7710ba3ccabf0d6faa\"" Sep 6 00:10:37.544992 containerd[2004]: time="2025-09-06T00:10:37.543007904Z" level=info msg="StartContainer for \"47945789c9f6a9e700a10e64b244657adab589e58f477c7710ba3ccabf0d6faa\"" Sep 6 00:10:37.739581 systemd[1]: run-netns-cni\x2d2739b9d8\x2dab38\x2d4d32\x2d6e99\x2d88a2f91ef30f.mount: Deactivated successfully. Sep 6 00:10:37.739792 systemd[1]: run-netns-cni\x2dcf10f127\x2d3f5b\x2d7d7d\x2da81b\x2d01291a08a83c.mount: Deactivated successfully. 
Sep 6 00:10:37.739937 systemd[1]: run-netns-cni\x2d60dd3f23\x2da414\x2db05b\x2de3ca\x2d3ac593bdc6f6.mount: Deactivated successfully. Sep 6 00:10:37.813123 systemd[1]: Started cri-containerd-47945789c9f6a9e700a10e64b244657adab589e58f477c7710ba3ccabf0d6faa.scope - libcontainer container 47945789c9f6a9e700a10e64b244657adab589e58f477c7710ba3ccabf0d6faa. Sep 6 00:10:37.956497 systemd-networkd[1845]: calid6b47a97bf2: Link UP Sep 6 00:10:37.963506 systemd-networkd[1845]: calid6b47a97bf2: Gained carrier Sep 6 00:10:38.031632 containerd[2004]: 2025-09-06 00:10:37.398 [INFO][5205] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0 goldmane-54d579b49d- calico-system b640f503-9253-4063-8780-3c12db4c2e29 941 0 2025-09-06 00:10:12 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ip-172-31-26-146 goldmane-54d579b49d-fdn4p eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calid6b47a97bf2 [] [] }} ContainerID="e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67" Namespace="calico-system" Pod="goldmane-54d579b49d-fdn4p" WorkloadEndpoint="ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-" Sep 6 00:10:38.031632 containerd[2004]: 2025-09-06 00:10:37.400 [INFO][5205] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67" Namespace="calico-system" Pod="goldmane-54d579b49d-fdn4p" WorkloadEndpoint="ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0" Sep 6 00:10:38.031632 containerd[2004]: 2025-09-06 00:10:37.627 [INFO][5259] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67" 
HandleID="k8s-pod-network.e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67" Workload="ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0" Sep 6 00:10:38.031632 containerd[2004]: 2025-09-06 00:10:37.627 [INFO][5259] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67" HandleID="k8s-pod-network.e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67" Workload="ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3530), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-26-146", "pod":"goldmane-54d579b49d-fdn4p", "timestamp":"2025-09-06 00:10:37.627505292 +0000 UTC"}, Hostname:"ip-172-31-26-146", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 00:10:38.031632 containerd[2004]: 2025-09-06 00:10:37.627 [INFO][5259] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:10:38.031632 containerd[2004]: 2025-09-06 00:10:37.627 [INFO][5259] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 6 00:10:38.031632 containerd[2004]: 2025-09-06 00:10:37.627 [INFO][5259] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-26-146' Sep 6 00:10:38.031632 containerd[2004]: 2025-09-06 00:10:37.657 [INFO][5259] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67" host="ip-172-31-26-146" Sep 6 00:10:38.031632 containerd[2004]: 2025-09-06 00:10:37.685 [INFO][5259] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-26-146" Sep 6 00:10:38.031632 containerd[2004]: 2025-09-06 00:10:37.787 [INFO][5259] ipam/ipam.go 511: Trying affinity for 192.168.57.64/26 host="ip-172-31-26-146" Sep 6 00:10:38.031632 containerd[2004]: 2025-09-06 00:10:37.799 [INFO][5259] ipam/ipam.go 158: Attempting to load block cidr=192.168.57.64/26 host="ip-172-31-26-146" Sep 6 00:10:38.031632 containerd[2004]: 2025-09-06 00:10:37.838 [INFO][5259] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.57.64/26 host="ip-172-31-26-146" Sep 6 00:10:38.031632 containerd[2004]: 2025-09-06 00:10:37.838 [INFO][5259] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.57.64/26 handle="k8s-pod-network.e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67" host="ip-172-31-26-146" Sep 6 00:10:38.031632 containerd[2004]: 2025-09-06 00:10:37.854 [INFO][5259] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67 Sep 6 00:10:38.031632 containerd[2004]: 2025-09-06 00:10:37.902 [INFO][5259] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.57.64/26 handle="k8s-pod-network.e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67" host="ip-172-31-26-146" Sep 6 00:10:38.031632 containerd[2004]: 2025-09-06 00:10:37.922 [INFO][5259] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.57.67/26] block=192.168.57.64/26 
handle="k8s-pod-network.e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67" host="ip-172-31-26-146" Sep 6 00:10:38.031632 containerd[2004]: 2025-09-06 00:10:37.923 [INFO][5259] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.57.67/26] handle="k8s-pod-network.e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67" host="ip-172-31-26-146" Sep 6 00:10:38.031632 containerd[2004]: 2025-09-06 00:10:37.923 [INFO][5259] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:10:38.031632 containerd[2004]: 2025-09-06 00:10:37.923 [INFO][5259] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.57.67/26] IPv6=[] ContainerID="e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67" HandleID="k8s-pod-network.e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67" Workload="ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0" Sep 6 00:10:38.035811 containerd[2004]: 2025-09-06 00:10:37.930 [INFO][5205] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67" Namespace="calico-system" Pod="goldmane-54d579b49d-fdn4p" WorkloadEndpoint="ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"b640f503-9253-4063-8780-3c12db4c2e29", ResourceVersion:"941", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"", Pod:"goldmane-54d579b49d-fdn4p", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.57.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calid6b47a97bf2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:10:38.035811 containerd[2004]: 2025-09-06 00:10:37.933 [INFO][5205] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.57.67/32] ContainerID="e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67" Namespace="calico-system" Pod="goldmane-54d579b49d-fdn4p" WorkloadEndpoint="ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0" Sep 6 00:10:38.035811 containerd[2004]: 2025-09-06 00:10:37.935 [INFO][5205] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid6b47a97bf2 ContainerID="e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67" Namespace="calico-system" Pod="goldmane-54d579b49d-fdn4p" WorkloadEndpoint="ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0" Sep 6 00:10:38.035811 containerd[2004]: 2025-09-06 00:10:37.960 [INFO][5205] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67" Namespace="calico-system" Pod="goldmane-54d579b49d-fdn4p" WorkloadEndpoint="ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0" Sep 6 00:10:38.035811 containerd[2004]: 2025-09-06 00:10:37.962 [INFO][5205] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67" Namespace="calico-system" Pod="goldmane-54d579b49d-fdn4p" WorkloadEndpoint="ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"b640f503-9253-4063-8780-3c12db4c2e29", ResourceVersion:"941", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67", Pod:"goldmane-54d579b49d-fdn4p", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.57.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calid6b47a97bf2", MAC:"52:c3:3b:84:af:1d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:10:38.035811 containerd[2004]: 2025-09-06 00:10:38.026 [INFO][5205] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67" Namespace="calico-system" Pod="goldmane-54d579b49d-fdn4p" 
WorkloadEndpoint="ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0" Sep 6 00:10:38.147276 kubelet[3215]: I0906 00:10:38.146660 3215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-wrl6g" podStartSLOduration=53.146628787 podStartE2EDuration="53.146628787s" podCreationTimestamp="2025-09-06 00:09:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-06 00:10:38.143555167 +0000 UTC m=+59.227597999" watchObservedRunningTime="2025-09-06 00:10:38.146628787 +0000 UTC m=+59.230671559" Sep 6 00:10:38.201940 containerd[2004]: time="2025-09-06T00:10:38.201477187Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 6 00:10:38.201940 containerd[2004]: time="2025-09-06T00:10:38.201775063Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 6 00:10:38.204537 containerd[2004]: time="2025-09-06T00:10:38.201895159Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:10:38.204537 containerd[2004]: time="2025-09-06T00:10:38.204375367Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:10:38.291861 systemd-networkd[1845]: calia09119fc40b: Link UP Sep 6 00:10:38.297823 systemd-networkd[1845]: calia09119fc40b: Gained carrier Sep 6 00:10:38.326504 systemd[1]: Started cri-containerd-e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67.scope - libcontainer container e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67. 
Sep 6 00:10:38.390924 containerd[2004]: 2025-09-06 00:10:37.582 [INFO][5222] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0 calico-apiserver-8966f54db- calico-apiserver 6141a3f9-cce3-4c89-a723-17bc0b75f562 943 0 2025-09-06 00:10:00 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:8966f54db projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-26-146 calico-apiserver-8966f54db-8hj4r eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia09119fc40b [] [] }} ContainerID="b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab" Namespace="calico-apiserver" Pod="calico-apiserver-8966f54db-8hj4r" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-" Sep 6 00:10:38.390924 containerd[2004]: 2025-09-06 00:10:37.582 [INFO][5222] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab" Namespace="calico-apiserver" Pod="calico-apiserver-8966f54db-8hj4r" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0" Sep 6 00:10:38.390924 containerd[2004]: 2025-09-06 00:10:37.983 [INFO][5287] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab" HandleID="k8s-pod-network.b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0" Sep 6 00:10:38.390924 containerd[2004]: 2025-09-06 00:10:37.983 [INFO][5287] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab" 
HandleID="k8s-pod-network.b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000343330), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-26-146", "pod":"calico-apiserver-8966f54db-8hj4r", "timestamp":"2025-09-06 00:10:37.983503174 +0000 UTC"}, Hostname:"ip-172-31-26-146", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 00:10:38.390924 containerd[2004]: 2025-09-06 00:10:37.984 [INFO][5287] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:10:38.390924 containerd[2004]: 2025-09-06 00:10:37.984 [INFO][5287] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:10:38.390924 containerd[2004]: 2025-09-06 00:10:37.989 [INFO][5287] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-26-146' Sep 6 00:10:38.390924 containerd[2004]: 2025-09-06 00:10:38.044 [INFO][5287] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab" host="ip-172-31-26-146" Sep 6 00:10:38.390924 containerd[2004]: 2025-09-06 00:10:38.089 [INFO][5287] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-26-146" Sep 6 00:10:38.390924 containerd[2004]: 2025-09-06 00:10:38.113 [INFO][5287] ipam/ipam.go 511: Trying affinity for 192.168.57.64/26 host="ip-172-31-26-146" Sep 6 00:10:38.390924 containerd[2004]: 2025-09-06 00:10:38.126 [INFO][5287] ipam/ipam.go 158: Attempting to load block cidr=192.168.57.64/26 host="ip-172-31-26-146" Sep 6 00:10:38.390924 containerd[2004]: 2025-09-06 00:10:38.137 [INFO][5287] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.57.64/26 
host="ip-172-31-26-146" Sep 6 00:10:38.390924 containerd[2004]: 2025-09-06 00:10:38.138 [INFO][5287] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.57.64/26 handle="k8s-pod-network.b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab" host="ip-172-31-26-146" Sep 6 00:10:38.390924 containerd[2004]: 2025-09-06 00:10:38.192 [INFO][5287] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab Sep 6 00:10:38.390924 containerd[2004]: 2025-09-06 00:10:38.204 [INFO][5287] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.57.64/26 handle="k8s-pod-network.b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab" host="ip-172-31-26-146" Sep 6 00:10:38.390924 containerd[2004]: 2025-09-06 00:10:38.254 [INFO][5287] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.57.68/26] block=192.168.57.64/26 handle="k8s-pod-network.b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab" host="ip-172-31-26-146" Sep 6 00:10:38.390924 containerd[2004]: 2025-09-06 00:10:38.255 [INFO][5287] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.57.68/26] handle="k8s-pod-network.b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab" host="ip-172-31-26-146" Sep 6 00:10:38.390924 containerd[2004]: 2025-09-06 00:10:38.255 [INFO][5287] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 6 00:10:38.390924 containerd[2004]: 2025-09-06 00:10:38.255 [INFO][5287] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.57.68/26] IPv6=[] ContainerID="b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab" HandleID="k8s-pod-network.b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0" Sep 6 00:10:38.393470 containerd[2004]: 2025-09-06 00:10:38.274 [INFO][5222] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab" Namespace="calico-apiserver" Pod="calico-apiserver-8966f54db-8hj4r" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0", GenerateName:"calico-apiserver-8966f54db-", Namespace:"calico-apiserver", SelfLink:"", UID:"6141a3f9-cce3-4c89-a723-17bc0b75f562", ResourceVersion:"943", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8966f54db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"", Pod:"calico-apiserver-8966f54db-8hj4r", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.57.68/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia09119fc40b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:10:38.393470 containerd[2004]: 2025-09-06 00:10:38.274 [INFO][5222] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.57.68/32] ContainerID="b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab" Namespace="calico-apiserver" Pod="calico-apiserver-8966f54db-8hj4r" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0" Sep 6 00:10:38.393470 containerd[2004]: 2025-09-06 00:10:38.274 [INFO][5222] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia09119fc40b ContainerID="b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab" Namespace="calico-apiserver" Pod="calico-apiserver-8966f54db-8hj4r" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0" Sep 6 00:10:38.393470 containerd[2004]: 2025-09-06 00:10:38.297 [INFO][5222] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab" Namespace="calico-apiserver" Pod="calico-apiserver-8966f54db-8hj4r" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0" Sep 6 00:10:38.393470 containerd[2004]: 2025-09-06 00:10:38.302 [INFO][5222] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab" Namespace="calico-apiserver" Pod="calico-apiserver-8966f54db-8hj4r" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0", GenerateName:"calico-apiserver-8966f54db-", Namespace:"calico-apiserver", SelfLink:"", UID:"6141a3f9-cce3-4c89-a723-17bc0b75f562", ResourceVersion:"943", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8966f54db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab", Pod:"calico-apiserver-8966f54db-8hj4r", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.57.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia09119fc40b", MAC:"8e:19:ba:ce:97:ed", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:10:38.393470 containerd[2004]: 2025-09-06 00:10:38.382 [INFO][5222] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab" Namespace="calico-apiserver" Pod="calico-apiserver-8966f54db-8hj4r" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0" Sep 6 00:10:38.459798 containerd[2004]: time="2025-09-06T00:10:38.458619141Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 6 00:10:38.459798 containerd[2004]: time="2025-09-06T00:10:38.458734905Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 6 00:10:38.459798 containerd[2004]: time="2025-09-06T00:10:38.458783385Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:10:38.459798 containerd[2004]: time="2025-09-06T00:10:38.458976633Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:10:38.501658 systemd-networkd[1845]: calia8ac445ca91: Link UP Sep 6 00:10:38.502073 systemd-networkd[1845]: calia8ac445ca91: Gained carrier Sep 6 00:10:38.574582 systemd[1]: Started cri-containerd-b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab.scope - libcontainer container b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab. 
Sep 6 00:10:38.581901 containerd[2004]: 2025-09-06 00:10:37.933 [INFO][5243] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0 calico-kube-controllers-5cf8b65958- calico-system aeae33f4-c2c1-4f17-a617-26feec2636ec 944 0 2025-09-06 00:10:11 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5cf8b65958 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ip-172-31-26-146 calico-kube-controllers-5cf8b65958-8qctx eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calia8ac445ca91 [] [] }} ContainerID="fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee" Namespace="calico-system" Pod="calico-kube-controllers-5cf8b65958-8qctx" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-" Sep 6 00:10:38.581901 containerd[2004]: 2025-09-06 00:10:37.935 [INFO][5243] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee" Namespace="calico-system" Pod="calico-kube-controllers-5cf8b65958-8qctx" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0" Sep 6 00:10:38.581901 containerd[2004]: 2025-09-06 00:10:38.181 [INFO][5322] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee" HandleID="k8s-pod-network.fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee" Workload="ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0" Sep 6 00:10:38.581901 containerd[2004]: 2025-09-06 00:10:38.181 [INFO][5322] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee" HandleID="k8s-pod-network.fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee" Workload="ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c3270), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-26-146", "pod":"calico-kube-controllers-5cf8b65958-8qctx", "timestamp":"2025-09-06 00:10:38.181530127 +0000 UTC"}, Hostname:"ip-172-31-26-146", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 00:10:38.581901 containerd[2004]: 2025-09-06 00:10:38.182 [INFO][5322] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:10:38.581901 containerd[2004]: 2025-09-06 00:10:38.260 [INFO][5322] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 6 00:10:38.581901 containerd[2004]: 2025-09-06 00:10:38.262 [INFO][5322] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-26-146' Sep 6 00:10:38.581901 containerd[2004]: 2025-09-06 00:10:38.364 [INFO][5322] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee" host="ip-172-31-26-146" Sep 6 00:10:38.581901 containerd[2004]: 2025-09-06 00:10:38.387 [INFO][5322] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-26-146" Sep 6 00:10:38.581901 containerd[2004]: 2025-09-06 00:10:38.399 [INFO][5322] ipam/ipam.go 511: Trying affinity for 192.168.57.64/26 host="ip-172-31-26-146" Sep 6 00:10:38.581901 containerd[2004]: 2025-09-06 00:10:38.405 [INFO][5322] ipam/ipam.go 158: Attempting to load block cidr=192.168.57.64/26 host="ip-172-31-26-146" Sep 6 00:10:38.581901 containerd[2004]: 2025-09-06 00:10:38.413 [INFO][5322] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.57.64/26 host="ip-172-31-26-146" Sep 6 00:10:38.581901 containerd[2004]: 2025-09-06 00:10:38.414 [INFO][5322] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.57.64/26 handle="k8s-pod-network.fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee" host="ip-172-31-26-146" Sep 6 00:10:38.581901 containerd[2004]: 2025-09-06 00:10:38.419 [INFO][5322] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee Sep 6 00:10:38.581901 containerd[2004]: 2025-09-06 00:10:38.433 [INFO][5322] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.57.64/26 handle="k8s-pod-network.fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee" host="ip-172-31-26-146" Sep 6 00:10:38.581901 containerd[2004]: 2025-09-06 00:10:38.463 [INFO][5322] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.57.69/26] block=192.168.57.64/26 
handle="k8s-pod-network.fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee" host="ip-172-31-26-146" Sep 6 00:10:38.581901 containerd[2004]: 2025-09-06 00:10:38.463 [INFO][5322] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.57.69/26] handle="k8s-pod-network.fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee" host="ip-172-31-26-146" Sep 6 00:10:38.581901 containerd[2004]: 2025-09-06 00:10:38.463 [INFO][5322] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:10:38.581901 containerd[2004]: 2025-09-06 00:10:38.463 [INFO][5322] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.57.69/26] IPv6=[] ContainerID="fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee" HandleID="k8s-pod-network.fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee" Workload="ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0" Sep 6 00:10:38.583050 containerd[2004]: 2025-09-06 00:10:38.476 [INFO][5243] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee" Namespace="calico-system" Pod="calico-kube-controllers-5cf8b65958-8qctx" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0", GenerateName:"calico-kube-controllers-5cf8b65958-", Namespace:"calico-system", SelfLink:"", UID:"aeae33f4-c2c1-4f17-a617-26feec2636ec", ResourceVersion:"944", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5cf8b65958", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"", Pod:"calico-kube-controllers-5cf8b65958-8qctx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.57.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calia8ac445ca91", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:10:38.583050 containerd[2004]: 2025-09-06 00:10:38.479 [INFO][5243] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.57.69/32] ContainerID="fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee" Namespace="calico-system" Pod="calico-kube-controllers-5cf8b65958-8qctx" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0" Sep 6 00:10:38.583050 containerd[2004]: 2025-09-06 00:10:38.479 [INFO][5243] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia8ac445ca91 ContainerID="fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee" Namespace="calico-system" Pod="calico-kube-controllers-5cf8b65958-8qctx" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0" Sep 6 00:10:38.583050 containerd[2004]: 2025-09-06 00:10:38.519 [INFO][5243] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee" Namespace="calico-system" Pod="calico-kube-controllers-5cf8b65958-8qctx" 
WorkloadEndpoint="ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0" Sep 6 00:10:38.583050 containerd[2004]: 2025-09-06 00:10:38.529 [INFO][5243] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee" Namespace="calico-system" Pod="calico-kube-controllers-5cf8b65958-8qctx" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0", GenerateName:"calico-kube-controllers-5cf8b65958-", Namespace:"calico-system", SelfLink:"", UID:"aeae33f4-c2c1-4f17-a617-26feec2636ec", ResourceVersion:"944", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5cf8b65958", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee", Pod:"calico-kube-controllers-5cf8b65958-8qctx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.57.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calia8ac445ca91", 
MAC:"42:01:88:c0:dc:42", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:10:38.583050 containerd[2004]: 2025-09-06 00:10:38.566 [INFO][5243] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee" Namespace="calico-system" Pod="calico-kube-controllers-5cf8b65958-8qctx" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0" Sep 6 00:10:38.651857 containerd[2004]: time="2025-09-06T00:10:38.651669958Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 6 00:10:38.651857 containerd[2004]: time="2025-09-06T00:10:38.651767986Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 6 00:10:38.651857 containerd[2004]: time="2025-09-06T00:10:38.651795430Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:10:38.653384 containerd[2004]: time="2025-09-06T00:10:38.651941758Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:10:38.720352 systemd-networkd[1845]: caliaea676958aa: Link UP Sep 6 00:10:38.730885 systemd-networkd[1845]: caliaea676958aa: Gained carrier Sep 6 00:10:38.755285 systemd[1]: Started cri-containerd-fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee.scope - libcontainer container fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee. 
Sep 6 00:10:38.779206 containerd[2004]: 2025-09-06 00:10:38.008 [INFO][5268] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Sep 6 00:10:38.779206 containerd[2004]: 2025-09-06 00:10:38.021 [INFO][5268] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" iface="eth0" netns="/var/run/netns/cni-0f49c827-9fe4-55af-d5a6-d9a14357a431" Sep 6 00:10:38.779206 containerd[2004]: 2025-09-06 00:10:38.021 [INFO][5268] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" iface="eth0" netns="/var/run/netns/cni-0f49c827-9fe4-55af-d5a6-d9a14357a431" Sep 6 00:10:38.779206 containerd[2004]: 2025-09-06 00:10:38.022 [INFO][5268] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" iface="eth0" netns="/var/run/netns/cni-0f49c827-9fe4-55af-d5a6-d9a14357a431" Sep 6 00:10:38.779206 containerd[2004]: 2025-09-06 00:10:38.022 [INFO][5268] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Sep 6 00:10:38.779206 containerd[2004]: 2025-09-06 00:10:38.027 [INFO][5268] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Sep 6 00:10:38.779206 containerd[2004]: 2025-09-06 00:10:38.386 [INFO][5329] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" HandleID="k8s-pod-network.18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Workload="ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0" Sep 6 00:10:38.779206 containerd[2004]: 2025-09-06 00:10:38.386 [INFO][5329] 
ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:10:38.779206 containerd[2004]: 2025-09-06 00:10:38.679 [INFO][5329] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:10:38.779206 containerd[2004]: 2025-09-06 00:10:38.762 [WARNING][5329] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" HandleID="k8s-pod-network.18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Workload="ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0" Sep 6 00:10:38.779206 containerd[2004]: 2025-09-06 00:10:38.762 [INFO][5329] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" HandleID="k8s-pod-network.18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Workload="ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0" Sep 6 00:10:38.779206 containerd[2004]: 2025-09-06 00:10:38.769 [INFO][5329] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:10:38.779206 containerd[2004]: 2025-09-06 00:10:38.772 [INFO][5268] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Sep 6 00:10:38.781641 containerd[2004]: time="2025-09-06T00:10:38.781422070Z" level=info msg="TearDown network for sandbox \"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\" successfully" Sep 6 00:10:38.781641 containerd[2004]: time="2025-09-06T00:10:38.781473274Z" level=info msg="StopPodSandbox for \"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\" returns successfully" Sep 6 00:10:38.785540 containerd[2004]: time="2025-09-06T00:10:38.785476486Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-47vdk,Uid:44f64470-7851-42aa-8ed3-eb71e8151f7c,Namespace:calico-system,Attempt:1,}" Sep 6 00:10:38.796333 systemd[1]: run-netns-cni\x2d0f49c827\x2d9fe4\x2d55af\x2dd5a6\x2dd9a14357a431.mount: Deactivated successfully. Sep 6 00:10:38.800556 containerd[2004]: 2025-09-06 00:10:37.768 [INFO][5231] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0 calico-apiserver-8966f54db- calico-apiserver 9733554c-54e5-4004-bf21-b3d55e01bf9c 942 0 2025-09-06 00:10:00 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:8966f54db projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-26-146 calico-apiserver-8966f54db-5528p eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] caliaea676958aa [] [] }} ContainerID="75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028" Namespace="calico-apiserver" Pod="calico-apiserver-8966f54db-5528p" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-" Sep 6 00:10:38.800556 containerd[2004]: 2025-09-06 00:10:37.773 [INFO][5231] cni-plugin/k8s.go 74: Extracted 
identifiers for CmdAddK8s ContainerID="75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028" Namespace="calico-apiserver" Pod="calico-apiserver-8966f54db-5528p" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0" Sep 6 00:10:38.800556 containerd[2004]: 2025-09-06 00:10:38.320 [INFO][5309] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028" HandleID="k8s-pod-network.75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0" Sep 6 00:10:38.800556 containerd[2004]: 2025-09-06 00:10:38.321 [INFO][5309] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028" HandleID="k8s-pod-network.75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400025c130), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-26-146", "pod":"calico-apiserver-8966f54db-5528p", "timestamp":"2025-09-06 00:10:38.317610788 +0000 UTC"}, Hostname:"ip-172-31-26-146", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 00:10:38.800556 containerd[2004]: 2025-09-06 00:10:38.321 [INFO][5309] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:10:38.800556 containerd[2004]: 2025-09-06 00:10:38.466 [INFO][5309] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 6 00:10:38.800556 containerd[2004]: 2025-09-06 00:10:38.467 [INFO][5309] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-26-146' Sep 6 00:10:38.800556 containerd[2004]: 2025-09-06 00:10:38.527 [INFO][5309] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028" host="ip-172-31-26-146" Sep 6 00:10:38.800556 containerd[2004]: 2025-09-06 00:10:38.558 [INFO][5309] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-26-146" Sep 6 00:10:38.800556 containerd[2004]: 2025-09-06 00:10:38.592 [INFO][5309] ipam/ipam.go 511: Trying affinity for 192.168.57.64/26 host="ip-172-31-26-146" Sep 6 00:10:38.800556 containerd[2004]: 2025-09-06 00:10:38.603 [INFO][5309] ipam/ipam.go 158: Attempting to load block cidr=192.168.57.64/26 host="ip-172-31-26-146" Sep 6 00:10:38.800556 containerd[2004]: 2025-09-06 00:10:38.624 [INFO][5309] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.57.64/26 host="ip-172-31-26-146" Sep 6 00:10:38.800556 containerd[2004]: 2025-09-06 00:10:38.625 [INFO][5309] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.57.64/26 handle="k8s-pod-network.75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028" host="ip-172-31-26-146" Sep 6 00:10:38.800556 containerd[2004]: 2025-09-06 00:10:38.638 [INFO][5309] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028 Sep 6 00:10:38.800556 containerd[2004]: 2025-09-06 00:10:38.656 [INFO][5309] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.57.64/26 handle="k8s-pod-network.75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028" host="ip-172-31-26-146" Sep 6 00:10:38.800556 containerd[2004]: 2025-09-06 00:10:38.678 [INFO][5309] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.57.70/26] block=192.168.57.64/26 
handle="k8s-pod-network.75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028" host="ip-172-31-26-146" Sep 6 00:10:38.800556 containerd[2004]: 2025-09-06 00:10:38.679 [INFO][5309] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.57.70/26] handle="k8s-pod-network.75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028" host="ip-172-31-26-146" Sep 6 00:10:38.800556 containerd[2004]: 2025-09-06 00:10:38.679 [INFO][5309] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:10:38.800556 containerd[2004]: 2025-09-06 00:10:38.679 [INFO][5309] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.57.70/26] IPv6=[] ContainerID="75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028" HandleID="k8s-pod-network.75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0" Sep 6 00:10:38.803892 containerd[2004]: 2025-09-06 00:10:38.700 [INFO][5231] cni-plugin/k8s.go 418: Populated endpoint ContainerID="75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028" Namespace="calico-apiserver" Pod="calico-apiserver-8966f54db-5528p" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0", GenerateName:"calico-apiserver-8966f54db-", Namespace:"calico-apiserver", SelfLink:"", UID:"9733554c-54e5-4004-bf21-b3d55e01bf9c", ResourceVersion:"942", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8966f54db", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"", Pod:"calico-apiserver-8966f54db-5528p", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.57.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliaea676958aa", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:10:38.803892 containerd[2004]: 2025-09-06 00:10:38.700 [INFO][5231] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.57.70/32] ContainerID="75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028" Namespace="calico-apiserver" Pod="calico-apiserver-8966f54db-5528p" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0" Sep 6 00:10:38.803892 containerd[2004]: 2025-09-06 00:10:38.702 [INFO][5231] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliaea676958aa ContainerID="75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028" Namespace="calico-apiserver" Pod="calico-apiserver-8966f54db-5528p" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0" Sep 6 00:10:38.803892 containerd[2004]: 2025-09-06 00:10:38.724 [INFO][5231] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028" Namespace="calico-apiserver" Pod="calico-apiserver-8966f54db-5528p" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0" Sep 6 00:10:38.803892 containerd[2004]: 2025-09-06 00:10:38.737 [INFO][5231] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028" Namespace="calico-apiserver" Pod="calico-apiserver-8966f54db-5528p" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0", GenerateName:"calico-apiserver-8966f54db-", Namespace:"calico-apiserver", SelfLink:"", UID:"9733554c-54e5-4004-bf21-b3d55e01bf9c", ResourceVersion:"942", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8966f54db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028", Pod:"calico-apiserver-8966f54db-5528p", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.57.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliaea676958aa", MAC:"a2:db:08:94:c1:a5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:10:38.803892 containerd[2004]: 2025-09-06 00:10:38.793 [INFO][5231] cni-plugin/k8s.go 
532: Wrote updated endpoint to datastore ContainerID="75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028" Namespace="calico-apiserver" Pod="calico-apiserver-8966f54db-5528p" WorkloadEndpoint="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0" Sep 6 00:10:38.929275 containerd[2004]: time="2025-09-06T00:10:38.926892791Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 6 00:10:38.929275 containerd[2004]: time="2025-09-06T00:10:38.926988875Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 6 00:10:38.929275 containerd[2004]: time="2025-09-06T00:10:38.927033515Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:10:38.929768 containerd[2004]: time="2025-09-06T00:10:38.927241415Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:10:39.047108 systemd[1]: Started cri-containerd-75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028.scope - libcontainer container 75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028. 
Sep 6 00:10:39.096119 containerd[2004]: time="2025-09-06T00:10:39.095935508Z" level=info msg="StartContainer for \"47945789c9f6a9e700a10e64b244657adab589e58f477c7710ba3ccabf0d6faa\" returns successfully" Sep 6 00:10:39.102265 containerd[2004]: time="2025-09-06T00:10:39.102133556Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 6 00:10:39.303699 containerd[2004]: time="2025-09-06T00:10:39.303438189Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-fdn4p,Uid:b640f503-9253-4063-8780-3c12db4c2e29,Namespace:calico-system,Attempt:1,} returns sandbox id \"e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67\"" Sep 6 00:10:39.346912 systemd-networkd[1845]: calid6b47a97bf2: Gained IPv6LL Sep 6 00:10:39.414273 containerd[2004]: time="2025-09-06T00:10:39.412257081Z" level=info msg="StopPodSandbox for \"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\"" Sep 6 00:10:39.471858 systemd-networkd[1845]: cali14200006449: Link UP Sep 6 00:10:39.474519 systemd-networkd[1845]: calia09119fc40b: Gained IPv6LL Sep 6 00:10:39.485388 systemd-networkd[1845]: cali14200006449: Gained carrier Sep 6 00:10:39.505509 containerd[2004]: time="2025-09-06T00:10:39.504471466Z" level=info msg="StopPodSandbox for \"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\"" Sep 6 00:10:39.572541 containerd[2004]: 2025-09-06 00:10:39.131 [INFO][5484] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0 csi-node-driver- calico-system 44f64470-7851-42aa-8ed3-eb71e8151f7c 955 0 2025-09-06 00:10:11 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s 
ip-172-31-26-146 csi-node-driver-47vdk eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali14200006449 [] [] }} ContainerID="e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666" Namespace="calico-system" Pod="csi-node-driver-47vdk" WorkloadEndpoint="ip--172--31--26--146-k8s-csi--node--driver--47vdk-" Sep 6 00:10:39.572541 containerd[2004]: 2025-09-06 00:10:39.132 [INFO][5484] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666" Namespace="calico-system" Pod="csi-node-driver-47vdk" WorkloadEndpoint="ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0" Sep 6 00:10:39.572541 containerd[2004]: 2025-09-06 00:10:39.300 [INFO][5535] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666" HandleID="k8s-pod-network.e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666" Workload="ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0" Sep 6 00:10:39.572541 containerd[2004]: 2025-09-06 00:10:39.300 [INFO][5535] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666" HandleID="k8s-pod-network.e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666" Workload="ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000341440), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-26-146", "pod":"csi-node-driver-47vdk", "timestamp":"2025-09-06 00:10:39.300321573 +0000 UTC"}, Hostname:"ip-172-31-26-146", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 00:10:39.572541 containerd[2004]: 2025-09-06 00:10:39.300 [INFO][5535] 
ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:10:39.572541 containerd[2004]: 2025-09-06 00:10:39.300 [INFO][5535] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:10:39.572541 containerd[2004]: 2025-09-06 00:10:39.300 [INFO][5535] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-26-146' Sep 6 00:10:39.572541 containerd[2004]: 2025-09-06 00:10:39.349 [INFO][5535] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666" host="ip-172-31-26-146" Sep 6 00:10:39.572541 containerd[2004]: 2025-09-06 00:10:39.367 [INFO][5535] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-26-146" Sep 6 00:10:39.572541 containerd[2004]: 2025-09-06 00:10:39.375 [INFO][5535] ipam/ipam.go 511: Trying affinity for 192.168.57.64/26 host="ip-172-31-26-146" Sep 6 00:10:39.572541 containerd[2004]: 2025-09-06 00:10:39.379 [INFO][5535] ipam/ipam.go 158: Attempting to load block cidr=192.168.57.64/26 host="ip-172-31-26-146" Sep 6 00:10:39.572541 containerd[2004]: 2025-09-06 00:10:39.386 [INFO][5535] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.57.64/26 host="ip-172-31-26-146" Sep 6 00:10:39.572541 containerd[2004]: 2025-09-06 00:10:39.387 [INFO][5535] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.57.64/26 handle="k8s-pod-network.e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666" host="ip-172-31-26-146" Sep 6 00:10:39.572541 containerd[2004]: 2025-09-06 00:10:39.390 [INFO][5535] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666 Sep 6 00:10:39.572541 containerd[2004]: 2025-09-06 00:10:39.399 [INFO][5535] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.57.64/26 handle="k8s-pod-network.e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666" 
host="ip-172-31-26-146" Sep 6 00:10:39.572541 containerd[2004]: 2025-09-06 00:10:39.419 [INFO][5535] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.57.71/26] block=192.168.57.64/26 handle="k8s-pod-network.e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666" host="ip-172-31-26-146" Sep 6 00:10:39.572541 containerd[2004]: 2025-09-06 00:10:39.422 [INFO][5535] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.57.71/26] handle="k8s-pod-network.e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666" host="ip-172-31-26-146" Sep 6 00:10:39.572541 containerd[2004]: 2025-09-06 00:10:39.422 [INFO][5535] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:10:39.572541 containerd[2004]: 2025-09-06 00:10:39.423 [INFO][5535] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.57.71/26] IPv6=[] ContainerID="e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666" HandleID="k8s-pod-network.e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666" Workload="ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0" Sep 6 00:10:39.575908 containerd[2004]: 2025-09-06 00:10:39.455 [INFO][5484] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666" Namespace="calico-system" Pod="csi-node-driver-47vdk" WorkloadEndpoint="ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"44f64470-7851-42aa-8ed3-eb71e8151f7c", ResourceVersion:"955", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", 
"controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"", Pod:"csi-node-driver-47vdk", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.57.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali14200006449", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:10:39.575908 containerd[2004]: 2025-09-06 00:10:39.455 [INFO][5484] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.57.71/32] ContainerID="e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666" Namespace="calico-system" Pod="csi-node-driver-47vdk" WorkloadEndpoint="ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0" Sep 6 00:10:39.575908 containerd[2004]: 2025-09-06 00:10:39.455 [INFO][5484] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali14200006449 ContainerID="e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666" Namespace="calico-system" Pod="csi-node-driver-47vdk" WorkloadEndpoint="ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0" Sep 6 00:10:39.575908 containerd[2004]: 2025-09-06 00:10:39.490 [INFO][5484] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666" Namespace="calico-system" Pod="csi-node-driver-47vdk" WorkloadEndpoint="ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0" 
Sep 6 00:10:39.575908 containerd[2004]: 2025-09-06 00:10:39.509 [INFO][5484] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666" Namespace="calico-system" Pod="csi-node-driver-47vdk" WorkloadEndpoint="ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"44f64470-7851-42aa-8ed3-eb71e8151f7c", ResourceVersion:"955", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666", Pod:"csi-node-driver-47vdk", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.57.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali14200006449", MAC:"12:dd:2d:60:3c:76", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:10:39.575908 containerd[2004]: 
2025-09-06 00:10:39.559 [INFO][5484] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666" Namespace="calico-system" Pod="csi-node-driver-47vdk" WorkloadEndpoint="ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0" Sep 6 00:10:39.622449 containerd[2004]: time="2025-09-06T00:10:39.622275418Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8966f54db-8hj4r,Uid:6141a3f9-cce3-4c89-a723-17bc0b75f562,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab\"" Sep 6 00:10:39.732727 containerd[2004]: time="2025-09-06T00:10:39.727005755Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 6 00:10:39.732727 containerd[2004]: time="2025-09-06T00:10:39.727091699Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 6 00:10:39.732727 containerd[2004]: time="2025-09-06T00:10:39.727116767Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:10:39.732727 containerd[2004]: time="2025-09-06T00:10:39.727275359Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:10:39.806270 containerd[2004]: time="2025-09-06T00:10:39.804961727Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5cf8b65958-8qctx,Uid:aeae33f4-c2c1-4f17-a617-26feec2636ec,Namespace:calico-system,Attempt:1,} returns sandbox id \"fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee\"" Sep 6 00:10:39.876444 containerd[2004]: time="2025-09-06T00:10:39.875918988Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8966f54db-5528p,Uid:9733554c-54e5-4004-bf21-b3d55e01bf9c,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028\"" Sep 6 00:10:39.885544 systemd[1]: Started cri-containerd-e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666.scope - libcontainer container e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666. Sep 6 00:10:40.003623 containerd[2004]: 2025-09-06 00:10:39.787 [INFO][5570] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Sep 6 00:10:40.003623 containerd[2004]: 2025-09-06 00:10:39.790 [INFO][5570] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" iface="eth0" netns="/var/run/netns/cni-d2de29c3-396b-8410-b2c4-ae136bdb993e" Sep 6 00:10:40.003623 containerd[2004]: 2025-09-06 00:10:39.793 [INFO][5570] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" iface="eth0" netns="/var/run/netns/cni-d2de29c3-396b-8410-b2c4-ae136bdb993e" Sep 6 00:10:40.003623 containerd[2004]: 2025-09-06 00:10:39.796 [INFO][5570] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" iface="eth0" netns="/var/run/netns/cni-d2de29c3-396b-8410-b2c4-ae136bdb993e" Sep 6 00:10:40.003623 containerd[2004]: 2025-09-06 00:10:39.796 [INFO][5570] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Sep 6 00:10:40.003623 containerd[2004]: 2025-09-06 00:10:39.796 [INFO][5570] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Sep 6 00:10:40.003623 containerd[2004]: 2025-09-06 00:10:39.965 [INFO][5644] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" HandleID="k8s-pod-network.20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0" Sep 6 00:10:40.003623 containerd[2004]: 2025-09-06 00:10:39.965 [INFO][5644] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:10:40.003623 containerd[2004]: 2025-09-06 00:10:39.965 [INFO][5644] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:10:40.003623 containerd[2004]: 2025-09-06 00:10:39.993 [WARNING][5644] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" HandleID="k8s-pod-network.20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0" Sep 6 00:10:40.003623 containerd[2004]: 2025-09-06 00:10:39.993 [INFO][5644] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" HandleID="k8s-pod-network.20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0" Sep 6 00:10:40.003623 containerd[2004]: 2025-09-06 00:10:39.995 [INFO][5644] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:10:40.003623 containerd[2004]: 2025-09-06 00:10:39.998 [INFO][5570] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Sep 6 00:10:40.012052 containerd[2004]: time="2025-09-06T00:10:40.006859988Z" level=info msg="TearDown network for sandbox \"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\" successfully" Sep 6 00:10:40.012052 containerd[2004]: time="2025-09-06T00:10:40.006917024Z" level=info msg="StopPodSandbox for \"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\" returns successfully" Sep 6 00:10:40.012052 containerd[2004]: time="2025-09-06T00:10:40.011527520Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-cfg8t,Uid:b8081eb5-874c-4704-87a1-8d99ed0c3d28,Namespace:kube-system,Attempt:1,}" Sep 6 00:10:40.009763 systemd[1]: run-netns-cni\x2dd2de29c3\x2d396b\x2d8410\x2db2c4\x2dae136bdb993e.mount: Deactivated successfully. Sep 6 00:10:40.059780 containerd[2004]: 2025-09-06 00:10:39.824 [WARNING][5589] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e1a579ff-cf0a-431d-b0c4-8c9feaa7cd01", ResourceVersion:"963", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 9, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00", Pod:"coredns-668d6bf9bc-wrl6g", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.57.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0c2f9736f76", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:10:40.059780 containerd[2004]: 2025-09-06 00:10:39.824 
[INFO][5589] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Sep 6 00:10:40.059780 containerd[2004]: 2025-09-06 00:10:39.824 [INFO][5589] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" iface="eth0" netns="" Sep 6 00:10:40.059780 containerd[2004]: 2025-09-06 00:10:39.824 [INFO][5589] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Sep 6 00:10:40.059780 containerd[2004]: 2025-09-06 00:10:39.824 [INFO][5589] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Sep 6 00:10:40.059780 containerd[2004]: 2025-09-06 00:10:40.000 [INFO][5646] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" HandleID="k8s-pod-network.f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0" Sep 6 00:10:40.059780 containerd[2004]: 2025-09-06 00:10:40.001 [INFO][5646] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:10:40.059780 containerd[2004]: 2025-09-06 00:10:40.001 [INFO][5646] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:10:40.059780 containerd[2004]: 2025-09-06 00:10:40.033 [WARNING][5646] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" HandleID="k8s-pod-network.f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0" Sep 6 00:10:40.059780 containerd[2004]: 2025-09-06 00:10:40.033 [INFO][5646] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" HandleID="k8s-pod-network.f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0" Sep 6 00:10:40.059780 containerd[2004]: 2025-09-06 00:10:40.042 [INFO][5646] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:10:40.059780 containerd[2004]: 2025-09-06 00:10:40.050 [INFO][5589] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Sep 6 00:10:40.061764 containerd[2004]: time="2025-09-06T00:10:40.060853233Z" level=info msg="TearDown network for sandbox \"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\" successfully" Sep 6 00:10:40.061764 containerd[2004]: time="2025-09-06T00:10:40.061056585Z" level=info msg="StopPodSandbox for \"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\" returns successfully" Sep 6 00:10:40.064831 containerd[2004]: time="2025-09-06T00:10:40.064778937Z" level=info msg="RemovePodSandbox for \"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\"" Sep 6 00:10:40.065875 containerd[2004]: time="2025-09-06T00:10:40.065822661Z" level=info msg="Forcibly stopping sandbox \"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\"" Sep 6 00:10:40.161279 containerd[2004]: time="2025-09-06T00:10:40.160958757Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-47vdk,Uid:44f64470-7851-42aa-8ed3-eb71e8151f7c,Namespace:calico-system,Attempt:1,} returns 
sandbox id \"e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666\"" Sep 6 00:10:40.369726 systemd-networkd[1845]: caliaea676958aa: Gained IPv6LL Sep 6 00:10:40.411104 containerd[2004]: 2025-09-06 00:10:40.243 [WARNING][5693] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e1a579ff-cf0a-431d-b0c4-8c9feaa7cd01", ResourceVersion:"963", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 9, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"946035e753a09e06add82f2dee38c43aad6968187d49665ca54b9b961805ee00", Pod:"coredns-668d6bf9bc-wrl6g", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.57.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0c2f9736f76", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:10:40.411104 containerd[2004]: 2025-09-06 00:10:40.243 [INFO][5693] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Sep 6 00:10:40.411104 containerd[2004]: 2025-09-06 00:10:40.243 [INFO][5693] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" iface="eth0" netns="" Sep 6 00:10:40.411104 containerd[2004]: 2025-09-06 00:10:40.248 [INFO][5693] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Sep 6 00:10:40.411104 containerd[2004]: 2025-09-06 00:10:40.248 [INFO][5693] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Sep 6 00:10:40.411104 containerd[2004]: 2025-09-06 00:10:40.343 [INFO][5704] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" HandleID="k8s-pod-network.f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0" Sep 6 00:10:40.411104 containerd[2004]: 2025-09-06 00:10:40.344 [INFO][5704] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:10:40.411104 containerd[2004]: 2025-09-06 00:10:40.350 [INFO][5704] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:10:40.411104 containerd[2004]: 2025-09-06 00:10:40.386 [WARNING][5704] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" HandleID="k8s-pod-network.f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0" Sep 6 00:10:40.411104 containerd[2004]: 2025-09-06 00:10:40.387 [INFO][5704] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" HandleID="k8s-pod-network.f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--wrl6g-eth0" Sep 6 00:10:40.411104 containerd[2004]: 2025-09-06 00:10:40.393 [INFO][5704] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:10:40.411104 containerd[2004]: 2025-09-06 00:10:40.402 [INFO][5693] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf" Sep 6 00:10:40.414989 containerd[2004]: time="2025-09-06T00:10:40.410948314Z" level=info msg="TearDown network for sandbox \"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\" successfully" Sep 6 00:10:40.434021 systemd-networkd[1845]: calia8ac445ca91: Gained IPv6LL Sep 6 00:10:40.444944 containerd[2004]: time="2025-09-06T00:10:40.444617662Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 6 00:10:40.444944 containerd[2004]: time="2025-09-06T00:10:40.444735286Z" level=info msg="RemovePodSandbox \"f14f801b6eb3fca92b0c97c492b26a71c945631de32cb79af00b84ad51d98aaf\" returns successfully" Sep 6 00:10:40.448728 containerd[2004]: time="2025-09-06T00:10:40.448654979Z" level=info msg="StopPodSandbox for \"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\"" Sep 6 00:10:40.576866 systemd-networkd[1845]: calib8952826cc4: Link UP Sep 6 00:10:40.580417 systemd-networkd[1845]: calib8952826cc4: Gained carrier Sep 6 00:10:40.614433 containerd[2004]: 2025-09-06 00:10:40.247 [INFO][5670] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0 coredns-668d6bf9bc- kube-system b8081eb5-874c-4704-87a1-8d99ed0c3d28 985 0 2025-09-06 00:09:45 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-26-146 coredns-668d6bf9bc-cfg8t eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib8952826cc4 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1" Namespace="kube-system" Pod="coredns-668d6bf9bc-cfg8t" WorkloadEndpoint="ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-" Sep 6 00:10:40.614433 containerd[2004]: 2025-09-06 00:10:40.247 [INFO][5670] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1" Namespace="kube-system" Pod="coredns-668d6bf9bc-cfg8t" WorkloadEndpoint="ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0" Sep 6 00:10:40.614433 containerd[2004]: 2025-09-06 00:10:40.435 [INFO][5706] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1" HandleID="k8s-pod-network.7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0" Sep 6 00:10:40.614433 containerd[2004]: 2025-09-06 00:10:40.436 [INFO][5706] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1" HandleID="k8s-pod-network.7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001dfac0), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-26-146", "pod":"coredns-668d6bf9bc-cfg8t", "timestamp":"2025-09-06 00:10:40.43550485 +0000 UTC"}, Hostname:"ip-172-31-26-146", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 00:10:40.614433 containerd[2004]: 2025-09-06 00:10:40.437 [INFO][5706] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:10:40.614433 containerd[2004]: 2025-09-06 00:10:40.437 [INFO][5706] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 6 00:10:40.614433 containerd[2004]: 2025-09-06 00:10:40.437 [INFO][5706] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-26-146' Sep 6 00:10:40.614433 containerd[2004]: 2025-09-06 00:10:40.489 [INFO][5706] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1" host="ip-172-31-26-146" Sep 6 00:10:40.614433 containerd[2004]: 2025-09-06 00:10:40.499 [INFO][5706] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-26-146" Sep 6 00:10:40.614433 containerd[2004]: 2025-09-06 00:10:40.509 [INFO][5706] ipam/ipam.go 511: Trying affinity for 192.168.57.64/26 host="ip-172-31-26-146" Sep 6 00:10:40.614433 containerd[2004]: 2025-09-06 00:10:40.513 [INFO][5706] ipam/ipam.go 158: Attempting to load block cidr=192.168.57.64/26 host="ip-172-31-26-146" Sep 6 00:10:40.614433 containerd[2004]: 2025-09-06 00:10:40.519 [INFO][5706] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.57.64/26 host="ip-172-31-26-146" Sep 6 00:10:40.614433 containerd[2004]: 2025-09-06 00:10:40.520 [INFO][5706] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.57.64/26 handle="k8s-pod-network.7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1" host="ip-172-31-26-146" Sep 6 00:10:40.614433 containerd[2004]: 2025-09-06 00:10:40.524 [INFO][5706] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1 Sep 6 00:10:40.614433 containerd[2004]: 2025-09-06 00:10:40.535 [INFO][5706] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.57.64/26 handle="k8s-pod-network.7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1" host="ip-172-31-26-146" Sep 6 00:10:40.614433 containerd[2004]: 2025-09-06 00:10:40.555 [INFO][5706] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.57.72/26] block=192.168.57.64/26 
handle="k8s-pod-network.7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1" host="ip-172-31-26-146" Sep 6 00:10:40.614433 containerd[2004]: 2025-09-06 00:10:40.556 [INFO][5706] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.57.72/26] handle="k8s-pod-network.7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1" host="ip-172-31-26-146" Sep 6 00:10:40.614433 containerd[2004]: 2025-09-06 00:10:40.556 [INFO][5706] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:10:40.614433 containerd[2004]: 2025-09-06 00:10:40.556 [INFO][5706] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.57.72/26] IPv6=[] ContainerID="7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1" HandleID="k8s-pod-network.7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0" Sep 6 00:10:40.616616 containerd[2004]: 2025-09-06 00:10:40.562 [INFO][5670] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1" Namespace="kube-system" Pod="coredns-668d6bf9bc-cfg8t" WorkloadEndpoint="ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"b8081eb5-874c-4704-87a1-8d99ed0c3d28", ResourceVersion:"985", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 9, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"", Pod:"coredns-668d6bf9bc-cfg8t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.57.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib8952826cc4", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:10:40.616616 containerd[2004]: 2025-09-06 00:10:40.562 [INFO][5670] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.57.72/32] ContainerID="7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1" Namespace="kube-system" Pod="coredns-668d6bf9bc-cfg8t" WorkloadEndpoint="ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0" Sep 6 00:10:40.616616 containerd[2004]: 2025-09-06 00:10:40.562 [INFO][5670] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib8952826cc4 ContainerID="7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1" Namespace="kube-system" Pod="coredns-668d6bf9bc-cfg8t" WorkloadEndpoint="ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0" Sep 6 00:10:40.616616 containerd[2004]: 2025-09-06 00:10:40.587 [INFO][5670] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-cfg8t" WorkloadEndpoint="ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0" Sep 6 00:10:40.616616 containerd[2004]: 2025-09-06 00:10:40.590 [INFO][5670] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1" Namespace="kube-system" Pod="coredns-668d6bf9bc-cfg8t" WorkloadEndpoint="ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"b8081eb5-874c-4704-87a1-8d99ed0c3d28", ResourceVersion:"985", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 9, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1", Pod:"coredns-668d6bf9bc-cfg8t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.57.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib8952826cc4", MAC:"b6:1d:5c:c1:ee:26", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:10:40.616616 containerd[2004]: 2025-09-06 00:10:40.608 [INFO][5670] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1" Namespace="kube-system" Pod="coredns-668d6bf9bc-cfg8t" WorkloadEndpoint="ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0" Sep 6 00:10:40.693306 containerd[2004]: 2025-09-06 00:10:40.562 [WARNING][5726] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" WorkloadEndpoint="ip--172--31--26--146-k8s-whisker--7c8bff6bcd--nk2q2-eth0" Sep 6 00:10:40.693306 containerd[2004]: 2025-09-06 00:10:40.562 [INFO][5726] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" Sep 6 00:10:40.693306 containerd[2004]: 2025-09-06 00:10:40.562 [INFO][5726] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" iface="eth0" netns="" Sep 6 00:10:40.693306 containerd[2004]: 2025-09-06 00:10:40.563 [INFO][5726] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" Sep 6 00:10:40.693306 containerd[2004]: 2025-09-06 00:10:40.563 [INFO][5726] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" Sep 6 00:10:40.693306 containerd[2004]: 2025-09-06 00:10:40.660 [INFO][5735] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" HandleID="k8s-pod-network.4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" Workload="ip--172--31--26--146-k8s-whisker--7c8bff6bcd--nk2q2-eth0" Sep 6 00:10:40.693306 containerd[2004]: 2025-09-06 00:10:40.662 [INFO][5735] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:10:40.693306 containerd[2004]: 2025-09-06 00:10:40.662 [INFO][5735] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:10:40.693306 containerd[2004]: 2025-09-06 00:10:40.676 [WARNING][5735] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" HandleID="k8s-pod-network.4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" Workload="ip--172--31--26--146-k8s-whisker--7c8bff6bcd--nk2q2-eth0" Sep 6 00:10:40.693306 containerd[2004]: 2025-09-06 00:10:40.677 [INFO][5735] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" HandleID="k8s-pod-network.4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" Workload="ip--172--31--26--146-k8s-whisker--7c8bff6bcd--nk2q2-eth0" Sep 6 00:10:40.693306 containerd[2004]: 2025-09-06 00:10:40.683 [INFO][5735] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:10:40.693306 containerd[2004]: 2025-09-06 00:10:40.688 [INFO][5726] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" Sep 6 00:10:40.693306 containerd[2004]: time="2025-09-06T00:10:40.693240684Z" level=info msg="TearDown network for sandbox \"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\" successfully" Sep 6 00:10:40.693306 containerd[2004]: time="2025-09-06T00:10:40.693279900Z" level=info msg="StopPodSandbox for \"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\" returns successfully" Sep 6 00:10:40.698037 containerd[2004]: time="2025-09-06T00:10:40.693898392Z" level=info msg="RemovePodSandbox for \"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\"" Sep 6 00:10:40.698037 containerd[2004]: time="2025-09-06T00:10:40.693944628Z" level=info msg="Forcibly stopping sandbox \"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\"" Sep 6 00:10:40.698834 containerd[2004]: time="2025-09-06T00:10:40.697889052Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 6 00:10:40.699514 containerd[2004]: time="2025-09-06T00:10:40.698738664Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 6 00:10:40.699514 containerd[2004]: time="2025-09-06T00:10:40.699003336Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:10:40.699942 containerd[2004]: time="2025-09-06T00:10:40.699790200Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:10:40.766508 systemd[1]: Started cri-containerd-7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1.scope - libcontainer container 7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1. Sep 6 00:10:40.902992 containerd[2004]: time="2025-09-06T00:10:40.902927425Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-cfg8t,Uid:b8081eb5-874c-4704-87a1-8d99ed0c3d28,Namespace:kube-system,Attempt:1,} returns sandbox id \"7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1\"" Sep 6 00:10:40.912542 containerd[2004]: time="2025-09-06T00:10:40.912474097Z" level=info msg="CreateContainer within sandbox \"7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 6 00:10:40.968213 containerd[2004]: 2025-09-06 00:10:40.828 [WARNING][5778] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" WorkloadEndpoint="ip--172--31--26--146-k8s-whisker--7c8bff6bcd--nk2q2-eth0" Sep 6 00:10:40.968213 containerd[2004]: 2025-09-06 00:10:40.831 [INFO][5778] cni-plugin/k8s.go 640: Cleaning up netns 
ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" Sep 6 00:10:40.968213 containerd[2004]: 2025-09-06 00:10:40.831 [INFO][5778] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" iface="eth0" netns="" Sep 6 00:10:40.968213 containerd[2004]: 2025-09-06 00:10:40.832 [INFO][5778] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" Sep 6 00:10:40.968213 containerd[2004]: 2025-09-06 00:10:40.832 [INFO][5778] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" Sep 6 00:10:40.968213 containerd[2004]: 2025-09-06 00:10:40.920 [INFO][5801] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" HandleID="k8s-pod-network.4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" Workload="ip--172--31--26--146-k8s-whisker--7c8bff6bcd--nk2q2-eth0" Sep 6 00:10:40.968213 containerd[2004]: 2025-09-06 00:10:40.920 [INFO][5801] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:10:40.968213 containerd[2004]: 2025-09-06 00:10:40.920 [INFO][5801] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:10:40.968213 containerd[2004]: 2025-09-06 00:10:40.948 [WARNING][5801] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" HandleID="k8s-pod-network.4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" Workload="ip--172--31--26--146-k8s-whisker--7c8bff6bcd--nk2q2-eth0" Sep 6 00:10:40.968213 containerd[2004]: 2025-09-06 00:10:40.948 [INFO][5801] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" HandleID="k8s-pod-network.4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" Workload="ip--172--31--26--146-k8s-whisker--7c8bff6bcd--nk2q2-eth0" Sep 6 00:10:40.968213 containerd[2004]: 2025-09-06 00:10:40.954 [INFO][5801] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:10:40.968213 containerd[2004]: 2025-09-06 00:10:40.958 [INFO][5778] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57" Sep 6 00:10:40.968213 containerd[2004]: time="2025-09-06T00:10:40.967372249Z" level=info msg="TearDown network for sandbox \"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\" successfully" Sep 6 00:10:40.972895 containerd[2004]: time="2025-09-06T00:10:40.972436201Z" level=info msg="CreateContainer within sandbox \"7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"f1ac0579532a29d2d128350929c820f9c581678ef808cd8d11b9c5f38cfb4c80\"" Sep 6 00:10:40.975926 containerd[2004]: time="2025-09-06T00:10:40.974613589Z" level=info msg="StartContainer for \"f1ac0579532a29d2d128350929c820f9c581678ef808cd8d11b9c5f38cfb4c80\"" Sep 6 00:10:40.995440 containerd[2004]: time="2025-09-06T00:10:40.995350453Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\": an error occurred when try to find sandbox: not found. 
Sending the event with nil podSandboxStatus." Sep 6 00:10:40.995578 containerd[2004]: time="2025-09-06T00:10:40.995499829Z" level=info msg="RemovePodSandbox \"4d0648bf52858c3d876f2ec589529741c92898ef936de6372dfd79503d296b57\" returns successfully" Sep 6 00:10:40.997010 containerd[2004]: time="2025-09-06T00:10:40.996933817Z" level=info msg="StopPodSandbox for \"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\"" Sep 6 00:10:41.117364 systemd[1]: Started cri-containerd-f1ac0579532a29d2d128350929c820f9c581678ef808cd8d11b9c5f38cfb4c80.scope - libcontainer container f1ac0579532a29d2d128350929c820f9c581678ef808cd8d11b9c5f38cfb4c80. Sep 6 00:10:41.205796 containerd[2004]: time="2025-09-06T00:10:41.204123274Z" level=info msg="StartContainer for \"f1ac0579532a29d2d128350929c820f9c581678ef808cd8d11b9c5f38cfb4c80\" returns successfully" Sep 6 00:10:41.266043 systemd-networkd[1845]: cali14200006449: Gained IPv6LL Sep 6 00:10:41.345946 containerd[2004]: 2025-09-06 00:10:41.244 [WARNING][5843] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"b640f503-9253-4063-8780-3c12db4c2e29", ResourceVersion:"956", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67", Pod:"goldmane-54d579b49d-fdn4p", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.57.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calid6b47a97bf2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:10:41.345946 containerd[2004]: 2025-09-06 00:10:41.245 [INFO][5843] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Sep 6 00:10:41.345946 containerd[2004]: 2025-09-06 00:10:41.245 [INFO][5843] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" iface="eth0" netns="" Sep 6 00:10:41.345946 containerd[2004]: 2025-09-06 00:10:41.246 [INFO][5843] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Sep 6 00:10:41.345946 containerd[2004]: 2025-09-06 00:10:41.246 [INFO][5843] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Sep 6 00:10:41.345946 containerd[2004]: 2025-09-06 00:10:41.309 [INFO][5870] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" HandleID="k8s-pod-network.635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Workload="ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0" Sep 6 00:10:41.345946 containerd[2004]: 2025-09-06 00:10:41.311 [INFO][5870] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:10:41.345946 containerd[2004]: 2025-09-06 00:10:41.311 [INFO][5870] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:10:41.345946 containerd[2004]: 2025-09-06 00:10:41.332 [WARNING][5870] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" HandleID="k8s-pod-network.635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Workload="ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0" Sep 6 00:10:41.345946 containerd[2004]: 2025-09-06 00:10:41.332 [INFO][5870] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" HandleID="k8s-pod-network.635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Workload="ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0" Sep 6 00:10:41.345946 containerd[2004]: 2025-09-06 00:10:41.335 [INFO][5870] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:10:41.345946 containerd[2004]: 2025-09-06 00:10:41.341 [INFO][5843] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Sep 6 00:10:41.348258 containerd[2004]: time="2025-09-06T00:10:41.347415227Z" level=info msg="TearDown network for sandbox \"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\" successfully" Sep 6 00:10:41.348258 containerd[2004]: time="2025-09-06T00:10:41.347464427Z" level=info msg="StopPodSandbox for \"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\" returns successfully" Sep 6 00:10:41.354970 containerd[2004]: time="2025-09-06T00:10:41.354868211Z" level=info msg="RemovePodSandbox for \"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\"" Sep 6 00:10:41.355125 containerd[2004]: time="2025-09-06T00:10:41.354963275Z" level=info msg="Forcibly stopping sandbox \"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\"" Sep 6 00:10:41.635302 containerd[2004]: 2025-09-06 00:10:41.495 [WARNING][5892] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"b640f503-9253-4063-8780-3c12db4c2e29", ResourceVersion:"956", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67", Pod:"goldmane-54d579b49d-fdn4p", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.57.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calid6b47a97bf2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:10:41.635302 containerd[2004]: 2025-09-06 00:10:41.496 [INFO][5892] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Sep 6 00:10:41.635302 containerd[2004]: 2025-09-06 00:10:41.496 [INFO][5892] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" iface="eth0" netns="" Sep 6 00:10:41.635302 containerd[2004]: 2025-09-06 00:10:41.496 [INFO][5892] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Sep 6 00:10:41.635302 containerd[2004]: 2025-09-06 00:10:41.496 [INFO][5892] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Sep 6 00:10:41.635302 containerd[2004]: 2025-09-06 00:10:41.560 [INFO][5900] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" HandleID="k8s-pod-network.635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Workload="ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0" Sep 6 00:10:41.635302 containerd[2004]: 2025-09-06 00:10:41.560 [INFO][5900] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:10:41.635302 containerd[2004]: 2025-09-06 00:10:41.560 [INFO][5900] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:10:41.635302 containerd[2004]: 2025-09-06 00:10:41.609 [WARNING][5900] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" HandleID="k8s-pod-network.635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Workload="ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0" Sep 6 00:10:41.635302 containerd[2004]: 2025-09-06 00:10:41.609 [INFO][5900] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" HandleID="k8s-pod-network.635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Workload="ip--172--31--26--146-k8s-goldmane--54d579b49d--fdn4p-eth0" Sep 6 00:10:41.635302 containerd[2004]: 2025-09-06 00:10:41.621 [INFO][5900] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:10:41.635302 containerd[2004]: 2025-09-06 00:10:41.626 [INFO][5892] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba" Sep 6 00:10:41.635302 containerd[2004]: time="2025-09-06T00:10:41.635270916Z" level=info msg="TearDown network for sandbox \"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\" successfully" Sep 6 00:10:41.645426 containerd[2004]: time="2025-09-06T00:10:41.645347040Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 6 00:10:41.645586 containerd[2004]: time="2025-09-06T00:10:41.645453624Z" level=info msg="RemovePodSandbox \"635dcdf4af9d3053307afb2843401eb42a20f17b6957dc0da56844bfff6148ba\" returns successfully" Sep 6 00:10:42.081690 systemd[1]: Started sshd@7-172.31.26.146:22-139.178.68.195:59870.service - OpenSSH per-connection server daemon (139.178.68.195:59870). 
Sep 6 00:10:42.288725 kubelet[3215]: I0906 00:10:42.286014 3215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-cfg8t" podStartSLOduration=57.285992724 podStartE2EDuration="57.285992724s" podCreationTimestamp="2025-09-06 00:09:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-06 00:10:42.240561191 +0000 UTC m=+63.324603891" watchObservedRunningTime="2025-09-06 00:10:42.285992724 +0000 UTC m=+63.370035400" Sep 6 00:10:42.320530 sshd[5911]: Accepted publickey for core from 139.178.68.195 port 59870 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4 Sep 6 00:10:42.328702 sshd[5911]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 00:10:42.352108 systemd-logind[1990]: New session 8 of user core. Sep 6 00:10:42.356524 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 6 00:10:42.609867 systemd-networkd[1845]: calib8952826cc4: Gained IPv6LL Sep 6 00:10:42.757249 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1225746737.mount: Deactivated successfully. 
Sep 6 00:10:42.791206 containerd[2004]: time="2025-09-06T00:10:42.791109254Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:42.793183 containerd[2004]: time="2025-09-06T00:10:42.793023962Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 6 00:10:42.795637 containerd[2004]: time="2025-09-06T00:10:42.795562562Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:42.800857 containerd[2004]: time="2025-09-06T00:10:42.800770898Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:42.802495 containerd[2004]: time="2025-09-06T00:10:42.802313738Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 3.700094022s" Sep 6 00:10:42.802495 containerd[2004]: time="2025-09-06T00:10:42.802367234Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 6 00:10:42.805999 containerd[2004]: time="2025-09-06T00:10:42.805038254Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 6 00:10:42.808812 sshd[5911]: pam_unix(sshd:session): session closed for user core Sep 6 00:10:42.810525 containerd[2004]: 
time="2025-09-06T00:10:42.809594234Z" level=info msg="CreateContainer within sandbox \"89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 6 00:10:42.818710 systemd[1]: sshd@7-172.31.26.146:22-139.178.68.195:59870.service: Deactivated successfully. Sep 6 00:10:42.822194 systemd[1]: session-8.scope: Deactivated successfully. Sep 6 00:10:42.827620 systemd-logind[1990]: Session 8 logged out. Waiting for processes to exit. Sep 6 00:10:42.852774 containerd[2004]: time="2025-09-06T00:10:42.852702278Z" level=info msg="CreateContainer within sandbox \"89694a57e32e4af6c75926a07ef1d8c007e550a7f3866a33d92596eced08a291\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"65a0efef37bce4813ecf3b1aca5383299eb03d18851d4d966eb02deee9b83f50\"" Sep 6 00:10:42.856452 containerd[2004]: time="2025-09-06T00:10:42.856380446Z" level=info msg="StartContainer for \"65a0efef37bce4813ecf3b1aca5383299eb03d18851d4d966eb02deee9b83f50\"" Sep 6 00:10:42.861211 systemd-logind[1990]: Removed session 8. Sep 6 00:10:42.954581 systemd[1]: Started cri-containerd-65a0efef37bce4813ecf3b1aca5383299eb03d18851d4d966eb02deee9b83f50.scope - libcontainer container 65a0efef37bce4813ecf3b1aca5383299eb03d18851d4d966eb02deee9b83f50. Sep 6 00:10:43.070098 containerd[2004]: time="2025-09-06T00:10:43.069833268Z" level=info msg="StartContainer for \"65a0efef37bce4813ecf3b1aca5383299eb03d18851d4d966eb02deee9b83f50\" returns successfully" Sep 6 00:10:43.758174 systemd[1]: run-containerd-runc-k8s.io-65a0efef37bce4813ecf3b1aca5383299eb03d18851d4d966eb02deee9b83f50-runc.sIUbQJ.mount: Deactivated successfully. 
Sep 6 00:10:44.791456 ntpd[1984]: Listen normally on 7 vxlan.calico 192.168.57.64:123 Sep 6 00:10:44.791593 ntpd[1984]: Listen normally on 8 calif0eb8781880 [fe80::ecee:eeff:feee:eeee%4]:123 Sep 6 00:10:44.791676 ntpd[1984]: Listen normally on 9 vxlan.calico [fe80::6402:f8ff:fee7:db2%5]:123 Sep 6 00:10:44.791745 ntpd[1984]: Listen normally on 10 cali0c2f9736f76 [fe80::ecee:eeff:feee:eeee%8]:123 Sep 6 00:10:44.791814 ntpd[1984]: Listen normally on 11 calid6b47a97bf2 [fe80::ecee:eeff:feee:eeee%9]:123 Sep 6 00:10:44.791881 ntpd[1984]: Listen normally on 12 calia09119fc40b [fe80::ecee:eeff:feee:eeee%10]:123 Sep 6 00:10:44.791950 ntpd[1984]: Listen normally on 13 calia8ac445ca91 [fe80::ecee:eeff:feee:eeee%11]:123 Sep 6 00:10:44.792017 ntpd[1984]: Listen normally on 14 caliaea676958aa [fe80::ecee:eeff:feee:eeee%12]:123 Sep 6 00:10:44.792111 ntpd[1984]: Listen normally on 15 cali14200006449 [fe80::ecee:eeff:feee:eeee%13]:123 Sep 6 00:10:44.792757 ntpd[1984]: Listen normally on 16 calib8952826cc4 [fe80::ecee:eeff:feee:eeee%14]:123 Sep 6 00:10:45.354696 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3447220021.mount: Deactivated successfully. Sep 6 00:10:46.199667 containerd[2004]: time="2025-09-06T00:10:46.199586043Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:46.202771 containerd[2004]: time="2025-09-06T00:10:46.202336287Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332" Sep 6 00:10:46.205125 containerd[2004]: time="2025-09-06T00:10:46.205057551Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:46.214105 containerd[2004]: time="2025-09-06T00:10:46.214032255Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:46.217203 containerd[2004]: time="2025-09-06T00:10:46.215833731Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 3.410725949s" Sep 6 00:10:46.217203 containerd[2004]: time="2025-09-06T00:10:46.215891559Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\"" Sep 6 00:10:46.219345 containerd[2004]: time="2025-09-06T00:10:46.219193407Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 6 00:10:46.221087 containerd[2004]: time="2025-09-06T00:10:46.221016231Z" level=info msg="CreateContainer within sandbox \"e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 6 00:10:46.261141 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4158510028.mount: Deactivated successfully. Sep 6 00:10:46.274675 containerd[2004]: time="2025-09-06T00:10:46.274580967Z" level=info msg="CreateContainer within sandbox \"e7b1295a2a8fbbf3e63fb9ef519378b1c9265374da209224b982277408460d67\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"e3fc1f2ba4b5a887e4ec847195dd3094df34adc6d99bbc2b0012fe091340d130\"" Sep 6 00:10:46.280014 containerd[2004]: time="2025-09-06T00:10:46.275813235Z" level=info msg="StartContainer for \"e3fc1f2ba4b5a887e4ec847195dd3094df34adc6d99bbc2b0012fe091340d130\"" Sep 6 00:10:46.404467 systemd[1]: Started cri-containerd-e3fc1f2ba4b5a887e4ec847195dd3094df34adc6d99bbc2b0012fe091340d130.scope - libcontainer container e3fc1f2ba4b5a887e4ec847195dd3094df34adc6d99bbc2b0012fe091340d130. 
Sep 6 00:10:46.516732 containerd[2004]: time="2025-09-06T00:10:46.516527117Z" level=info msg="StartContainer for \"e3fc1f2ba4b5a887e4ec847195dd3094df34adc6d99bbc2b0012fe091340d130\" returns successfully" Sep 6 00:10:47.297657 kubelet[3215]: I0906 00:10:47.297511 3215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-fdn4p" podStartSLOduration=28.395781719 podStartE2EDuration="35.297426713s" podCreationTimestamp="2025-09-06 00:10:12 +0000 UTC" firstStartedPulling="2025-09-06 00:10:39.316033209 +0000 UTC m=+60.400075885" lastFinishedPulling="2025-09-06 00:10:46.217678203 +0000 UTC m=+67.301720879" observedRunningTime="2025-09-06 00:10:47.296245217 +0000 UTC m=+68.380287917" watchObservedRunningTime="2025-09-06 00:10:47.297426713 +0000 UTC m=+68.381469461" Sep 6 00:10:47.304315 kubelet[3215]: I0906 00:10:47.300791 3215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-b7dd89d5-zqd7v" podStartSLOduration=5.469376169 podStartE2EDuration="14.300766241s" podCreationTimestamp="2025-09-06 00:10:33 +0000 UTC" firstStartedPulling="2025-09-06 00:10:33.973097838 +0000 UTC m=+55.057140514" lastFinishedPulling="2025-09-06 00:10:42.80448791 +0000 UTC m=+63.888530586" observedRunningTime="2025-09-06 00:10:43.258513624 +0000 UTC m=+64.342556312" watchObservedRunningTime="2025-09-06 00:10:47.300766241 +0000 UTC m=+68.384808929" Sep 6 00:10:47.856909 systemd[1]: Started sshd@8-172.31.26.146:22-139.178.68.195:59884.service - OpenSSH per-connection server daemon (139.178.68.195:59884). Sep 6 00:10:48.045543 sshd[6060]: Accepted publickey for core from 139.178.68.195 port 59884 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4 Sep 6 00:10:48.049693 sshd[6060]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 00:10:48.060203 systemd-logind[1990]: New session 9 of user core. 
Sep 6 00:10:48.066869 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 6 00:10:48.380589 sshd[6060]: pam_unix(sshd:session): session closed for user core Sep 6 00:10:48.387444 systemd[1]: sshd@8-172.31.26.146:22-139.178.68.195:59884.service: Deactivated successfully. Sep 6 00:10:48.396089 systemd[1]: session-9.scope: Deactivated successfully. Sep 6 00:10:48.404223 systemd-logind[1990]: Session 9 logged out. Waiting for processes to exit. Sep 6 00:10:48.409826 systemd-logind[1990]: Removed session 9. Sep 6 00:10:50.676814 containerd[2004]: time="2025-09-06T00:10:50.676749033Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:50.679534 containerd[2004]: time="2025-09-06T00:10:50.679467105Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 6 00:10:50.680602 containerd[2004]: time="2025-09-06T00:10:50.680512737Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:50.685504 containerd[2004]: time="2025-09-06T00:10:50.685433241Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:50.688747 containerd[2004]: time="2025-09-06T00:10:50.688469037Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 4.468615162s" Sep 6 00:10:50.688747 containerd[2004]: 
time="2025-09-06T00:10:50.688528185Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 6 00:10:50.692835 containerd[2004]: time="2025-09-06T00:10:50.691696077Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 6 00:10:50.700587 containerd[2004]: time="2025-09-06T00:10:50.700520901Z" level=info msg="CreateContainer within sandbox \"b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 6 00:10:50.731985 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3993334115.mount: Deactivated successfully. Sep 6 00:10:50.742735 containerd[2004]: time="2025-09-06T00:10:50.738719218Z" level=info msg="CreateContainer within sandbox \"b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"8b654359ca40af14b034bc7b7f197d096ff65ba6db470085f8bbbd6e83cb1b7a\"" Sep 6 00:10:50.742735 containerd[2004]: time="2025-09-06T00:10:50.739781710Z" level=info msg="StartContainer for \"8b654359ca40af14b034bc7b7f197d096ff65ba6db470085f8bbbd6e83cb1b7a\"" Sep 6 00:10:50.830501 systemd[1]: Started cri-containerd-8b654359ca40af14b034bc7b7f197d096ff65ba6db470085f8bbbd6e83cb1b7a.scope - libcontainer container 8b654359ca40af14b034bc7b7f197d096ff65ba6db470085f8bbbd6e83cb1b7a. Sep 6 00:10:50.901259 containerd[2004]: time="2025-09-06T00:10:50.901101178Z" level=info msg="StartContainer for \"8b654359ca40af14b034bc7b7f197d096ff65ba6db470085f8bbbd6e83cb1b7a\" returns successfully" Sep 6 00:10:53.291045 kubelet[3215]: I0906 00:10:53.290956 3215 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 6 00:10:53.432979 systemd[1]: Started sshd@9-172.31.26.146:22-139.178.68.195:38564.service - OpenSSH per-connection server daemon (139.178.68.195:38564). 
Sep 6 00:10:53.635924 sshd[6157]: Accepted publickey for core from 139.178.68.195 port 38564 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4 Sep 6 00:10:53.642526 sshd[6157]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 00:10:53.660495 systemd-logind[1990]: New session 10 of user core. Sep 6 00:10:53.666550 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 6 00:10:54.056729 sshd[6157]: pam_unix(sshd:session): session closed for user core Sep 6 00:10:54.073922 systemd-logind[1990]: Session 10 logged out. Waiting for processes to exit. Sep 6 00:10:54.076062 systemd[1]: sshd@9-172.31.26.146:22-139.178.68.195:38564.service: Deactivated successfully. Sep 6 00:10:54.089883 systemd[1]: session-10.scope: Deactivated successfully. Sep 6 00:10:54.119922 systemd[1]: Started sshd@10-172.31.26.146:22-139.178.68.195:38574.service - OpenSSH per-connection server daemon (139.178.68.195:38574). Sep 6 00:10:54.128824 systemd-logind[1990]: Removed session 10. Sep 6 00:10:54.365520 sshd[6170]: Accepted publickey for core from 139.178.68.195 port 38574 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4 Sep 6 00:10:54.371893 sshd[6170]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 00:10:54.402928 kubelet[3215]: I0906 00:10:54.393858 3215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-8966f54db-8hj4r" podStartSLOduration=43.336498997 podStartE2EDuration="54.393834132s" podCreationTimestamp="2025-09-06 00:10:00 +0000 UTC" firstStartedPulling="2025-09-06 00:10:39.632309518 +0000 UTC m=+60.716352182" lastFinishedPulling="2025-09-06 00:10:50.689644629 +0000 UTC m=+71.773687317" observedRunningTime="2025-09-06 00:10:51.307030244 +0000 UTC m=+72.391073004" watchObservedRunningTime="2025-09-06 00:10:54.393834132 +0000 UTC m=+75.477876808" Sep 6 00:10:54.399350 systemd-logind[1990]: New session 11 of user core. 
Sep 6 00:10:54.403195 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 6 00:10:54.882243 sshd[6170]: pam_unix(sshd:session): session closed for user core Sep 6 00:10:54.892119 systemd[1]: sshd@10-172.31.26.146:22-139.178.68.195:38574.service: Deactivated successfully. Sep 6 00:10:54.903827 systemd[1]: session-11.scope: Deactivated successfully. Sep 6 00:10:54.910675 systemd-logind[1990]: Session 11 logged out. Waiting for processes to exit. Sep 6 00:10:54.938099 systemd[1]: Started sshd@11-172.31.26.146:22-139.178.68.195:38586.service - OpenSSH per-connection server daemon (139.178.68.195:38586). Sep 6 00:10:54.943512 systemd-logind[1990]: Removed session 11. Sep 6 00:10:55.134571 sshd[6183]: Accepted publickey for core from 139.178.68.195 port 38586 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4 Sep 6 00:10:55.138204 sshd[6183]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 00:10:55.155067 systemd-logind[1990]: New session 12 of user core. Sep 6 00:10:55.160805 systemd[1]: Started session-12.scope - Session 12 of User core. 
Sep 6 00:10:55.314196 containerd[2004]: time="2025-09-06T00:10:55.312858432Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:55.317874 containerd[2004]: time="2025-09-06T00:10:55.317817960Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 6 00:10:55.320959 containerd[2004]: time="2025-09-06T00:10:55.320896596Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:55.341715 containerd[2004]: time="2025-09-06T00:10:55.341635572Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 4.649863763s" Sep 6 00:10:55.343276 containerd[2004]: time="2025-09-06T00:10:55.341890164Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 6 00:10:55.343276 containerd[2004]: time="2025-09-06T00:10:55.342796908Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:55.352739 containerd[2004]: time="2025-09-06T00:10:55.352692925Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 6 00:10:55.371644 containerd[2004]: time="2025-09-06T00:10:55.371566741Z" level=info msg="CreateContainer within sandbox 
\"fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 6 00:10:55.458226 containerd[2004]: time="2025-09-06T00:10:55.458028949Z" level=info msg="CreateContainer within sandbox \"fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"de5b97d504c94d926aaaeadeff59e3986ea2e26ef69a38e721b6172390d725a4\"" Sep 6 00:10:55.470840 containerd[2004]: time="2025-09-06T00:10:55.470760733Z" level=info msg="StartContainer for \"de5b97d504c94d926aaaeadeff59e3986ea2e26ef69a38e721b6172390d725a4\"" Sep 6 00:10:55.546067 sshd[6183]: pam_unix(sshd:session): session closed for user core Sep 6 00:10:55.559165 systemd[1]: sshd@11-172.31.26.146:22-139.178.68.195:38586.service: Deactivated successfully. Sep 6 00:10:55.563650 systemd[1]: session-12.scope: Deactivated successfully. Sep 6 00:10:55.567779 systemd-logind[1990]: Session 12 logged out. Waiting for processes to exit. Sep 6 00:10:55.577485 systemd[1]: Started cri-containerd-de5b97d504c94d926aaaeadeff59e3986ea2e26ef69a38e721b6172390d725a4.scope - libcontainer container de5b97d504c94d926aaaeadeff59e3986ea2e26ef69a38e721b6172390d725a4. Sep 6 00:10:55.580322 systemd-logind[1990]: Removed session 12. 
Sep 6 00:10:55.648415 containerd[2004]: time="2025-09-06T00:10:55.648311342Z" level=info msg="StartContainer for \"de5b97d504c94d926aaaeadeff59e3986ea2e26ef69a38e721b6172390d725a4\" returns successfully" Sep 6 00:10:55.694269 containerd[2004]: time="2025-09-06T00:10:55.692899430Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:55.697581 containerd[2004]: time="2025-09-06T00:10:55.697479398Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 6 00:10:55.702543 containerd[2004]: time="2025-09-06T00:10:55.702461462Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 349.503349ms" Sep 6 00:10:55.702760 containerd[2004]: time="2025-09-06T00:10:55.702727466Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 6 00:10:55.706618 containerd[2004]: time="2025-09-06T00:10:55.706318202Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 6 00:10:55.709604 containerd[2004]: time="2025-09-06T00:10:55.709452734Z" level=info msg="CreateContainer within sandbox \"75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 6 00:10:55.735930 containerd[2004]: time="2025-09-06T00:10:55.735870998Z" level=info msg="CreateContainer within sandbox \"75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id 
\"b6bb77be07c28907d170db1980fb610a503b925aaa3ce223427956b81d53b280\"" Sep 6 00:10:55.750041 containerd[2004]: time="2025-09-06T00:10:55.749953935Z" level=info msg="StartContainer for \"b6bb77be07c28907d170db1980fb610a503b925aaa3ce223427956b81d53b280\"" Sep 6 00:10:55.817500 systemd[1]: Started cri-containerd-b6bb77be07c28907d170db1980fb610a503b925aaa3ce223427956b81d53b280.scope - libcontainer container b6bb77be07c28907d170db1980fb610a503b925aaa3ce223427956b81d53b280. Sep 6 00:10:55.932661 containerd[2004]: time="2025-09-06T00:10:55.932590239Z" level=info msg="StartContainer for \"b6bb77be07c28907d170db1980fb610a503b925aaa3ce223427956b81d53b280\" returns successfully" Sep 6 00:10:56.337742 kubelet[3215]: I0906 00:10:56.337617 3215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-8966f54db-5528p" podStartSLOduration=40.525212399 podStartE2EDuration="56.337591405s" podCreationTimestamp="2025-09-06 00:10:00 +0000 UTC" firstStartedPulling="2025-09-06 00:10:39.892539072 +0000 UTC m=+60.976581748" lastFinishedPulling="2025-09-06 00:10:55.70491809 +0000 UTC m=+76.788960754" observedRunningTime="2025-09-06 00:10:56.331812061 +0000 UTC m=+77.415854737" watchObservedRunningTime="2025-09-06 00:10:56.337591405 +0000 UTC m=+77.421634081" Sep 6 00:10:56.460007 kubelet[3215]: I0906 00:10:56.459910 3215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5cf8b65958-8qctx" podStartSLOduration=29.931209276 podStartE2EDuration="45.459883454s" podCreationTimestamp="2025-09-06 00:10:11 +0000 UTC" firstStartedPulling="2025-09-06 00:10:39.818531195 +0000 UTC m=+60.902573871" lastFinishedPulling="2025-09-06 00:10:55.347205385 +0000 UTC m=+76.431248049" observedRunningTime="2025-09-06 00:10:56.389667038 +0000 UTC m=+77.473709738" watchObservedRunningTime="2025-09-06 00:10:56.459883454 +0000 UTC m=+77.543926130" Sep 6 00:10:57.317985 kubelet[3215]: I0906 00:10:57.317937 3215 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 6 00:10:57.464178 containerd[2004]: time="2025-09-06T00:10:57.461365083Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:57.467859 containerd[2004]: time="2025-09-06T00:10:57.467190351Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 6 00:10:57.477194 containerd[2004]: time="2025-09-06T00:10:57.475349343Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:57.482046 containerd[2004]: time="2025-09-06T00:10:57.481977579Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:57.485982 containerd[2004]: time="2025-09-06T00:10:57.485912871Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.779536373s" Sep 6 00:10:57.487707 containerd[2004]: time="2025-09-06T00:10:57.487634643Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 6 00:10:57.493411 containerd[2004]: time="2025-09-06T00:10:57.493342227Z" level=info msg="CreateContainer within sandbox \"e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 6 00:10:57.534427 containerd[2004]: 
time="2025-09-06T00:10:57.533600223Z" level=info msg="CreateContainer within sandbox \"e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"69b110aa935af4136fe076d1c6b54677438926a313b1d5303670bfe063392ac5\"" Sep 6 00:10:57.536479 containerd[2004]: time="2025-09-06T00:10:57.536407755Z" level=info msg="StartContainer for \"69b110aa935af4136fe076d1c6b54677438926a313b1d5303670bfe063392ac5\"" Sep 6 00:10:57.620570 systemd[1]: Started cri-containerd-69b110aa935af4136fe076d1c6b54677438926a313b1d5303670bfe063392ac5.scope - libcontainer container 69b110aa935af4136fe076d1c6b54677438926a313b1d5303670bfe063392ac5. Sep 6 00:10:57.793780 containerd[2004]: time="2025-09-06T00:10:57.793703141Z" level=info msg="StartContainer for \"69b110aa935af4136fe076d1c6b54677438926a313b1d5303670bfe063392ac5\" returns successfully" Sep 6 00:10:57.797719 containerd[2004]: time="2025-09-06T00:10:57.797623613Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 6 00:10:59.812611 containerd[2004]: time="2025-09-06T00:10:59.811088395Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:59.815175 containerd[2004]: time="2025-09-06T00:10:59.815101471Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208" Sep 6 00:10:59.817417 containerd[2004]: time="2025-09-06T00:10:59.817361707Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:59.826928 containerd[2004]: time="2025-09-06T00:10:59.826865407Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:10:59.829554 containerd[2004]: time="2025-09-06T00:10:59.829443847Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 2.031727138s" Sep 6 00:10:59.829795 containerd[2004]: time="2025-09-06T00:10:59.829757635Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\"" Sep 6 00:10:59.835122 containerd[2004]: time="2025-09-06T00:10:59.834864199Z" level=info msg="CreateContainer within sandbox \"e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 6 00:10:59.884267 containerd[2004]: time="2025-09-06T00:10:59.884196115Z" level=info msg="CreateContainer within sandbox \"e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"7135d7ec47e204de36f12896dc3cafb9673cf67756d047743b67481210b2c877\"" Sep 6 00:10:59.886310 containerd[2004]: time="2025-09-06T00:10:59.885196267Z" level=info msg="StartContainer for \"7135d7ec47e204de36f12896dc3cafb9673cf67756d047743b67481210b2c877\"" Sep 6 00:11:00.020309 systemd[1]: Started cri-containerd-7135d7ec47e204de36f12896dc3cafb9673cf67756d047743b67481210b2c877.scope - libcontainer container 7135d7ec47e204de36f12896dc3cafb9673cf67756d047743b67481210b2c877. 
Sep 6 00:11:00.144815 containerd[2004]: time="2025-09-06T00:11:00.144657208Z" level=info msg="StartContainer for \"7135d7ec47e204de36f12896dc3cafb9673cf67756d047743b67481210b2c877\" returns successfully" Sep 6 00:11:00.592755 systemd[1]: Started sshd@12-172.31.26.146:22-139.178.68.195:45784.service - OpenSSH per-connection server daemon (139.178.68.195:45784). Sep 6 00:11:00.858487 sshd[6395]: Accepted publickey for core from 139.178.68.195 port 45784 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4 Sep 6 00:11:00.862118 sshd[6395]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 00:11:00.871279 systemd-logind[1990]: New session 13 of user core. Sep 6 00:11:00.877433 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 6 00:11:00.923341 kubelet[3215]: I0906 00:11:00.923261 3215 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 6 00:11:00.924462 kubelet[3215]: I0906 00:11:00.923439 3215 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 6 00:11:01.224702 sshd[6395]: pam_unix(sshd:session): session closed for user core Sep 6 00:11:01.230965 systemd[1]: sshd@12-172.31.26.146:22-139.178.68.195:45784.service: Deactivated successfully. Sep 6 00:11:01.235547 systemd[1]: session-13.scope: Deactivated successfully. Sep 6 00:11:01.239892 systemd-logind[1990]: Session 13 logged out. Waiting for processes to exit. Sep 6 00:11:01.243040 systemd-logind[1990]: Removed session 13. Sep 6 00:11:01.866046 systemd[1]: run-containerd-runc-k8s.io-e3fc1f2ba4b5a887e4ec847195dd3094df34adc6d99bbc2b0012fe091340d130-runc.kRsaw4.mount: Deactivated successfully. 
Sep 6 00:11:02.944872 systemd[1]: run-containerd-runc-k8s.io-005ef3f0b61c054443cd364a988c45918cef116c5ba508158a3889d4d2a724de-runc.t9hGtg.mount: Deactivated successfully. Sep 6 00:11:06.267698 systemd[1]: Started sshd@13-172.31.26.146:22-139.178.68.195:45786.service - OpenSSH per-connection server daemon (139.178.68.195:45786). Sep 6 00:11:06.448280 sshd[6457]: Accepted publickey for core from 139.178.68.195 port 45786 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4 Sep 6 00:11:06.450856 sshd[6457]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 00:11:06.465633 systemd-logind[1990]: New session 14 of user core. Sep 6 00:11:06.473544 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 6 00:11:06.729090 sshd[6457]: pam_unix(sshd:session): session closed for user core Sep 6 00:11:06.734865 systemd[1]: sshd@13-172.31.26.146:22-139.178.68.195:45786.service: Deactivated successfully. Sep 6 00:11:06.735650 systemd-logind[1990]: Session 14 logged out. Waiting for processes to exit. Sep 6 00:11:06.740125 systemd[1]: session-14.scope: Deactivated successfully. Sep 6 00:11:06.746914 systemd-logind[1990]: Removed session 14. Sep 6 00:11:11.769698 systemd[1]: Started sshd@14-172.31.26.146:22-139.178.68.195:43136.service - OpenSSH per-connection server daemon (139.178.68.195:43136). Sep 6 00:11:11.949753 sshd[6471]: Accepted publickey for core from 139.178.68.195 port 43136 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4 Sep 6 00:11:11.954103 sshd[6471]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 00:11:11.966219 systemd-logind[1990]: New session 15 of user core. Sep 6 00:11:11.974476 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 6 00:11:12.273720 sshd[6471]: pam_unix(sshd:session): session closed for user core Sep 6 00:11:12.284652 systemd[1]: sshd@14-172.31.26.146:22-139.178.68.195:43136.service: Deactivated successfully. 
Sep 6 00:11:12.291782 systemd[1]: session-15.scope: Deactivated successfully. Sep 6 00:11:12.296338 systemd-logind[1990]: Session 15 logged out. Waiting for processes to exit. Sep 6 00:11:12.300117 systemd-logind[1990]: Removed session 15. Sep 6 00:11:17.312700 systemd[1]: Started sshd@15-172.31.26.146:22-139.178.68.195:43144.service - OpenSSH per-connection server daemon (139.178.68.195:43144). Sep 6 00:11:17.498049 sshd[6492]: Accepted publickey for core from 139.178.68.195 port 43144 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4 Sep 6 00:11:17.500867 sshd[6492]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 00:11:17.508703 systemd-logind[1990]: New session 16 of user core. Sep 6 00:11:17.515449 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 6 00:11:17.775012 sshd[6492]: pam_unix(sshd:session): session closed for user core Sep 6 00:11:17.783565 systemd[1]: sshd@15-172.31.26.146:22-139.178.68.195:43144.service: Deactivated successfully. Sep 6 00:11:17.783592 systemd-logind[1990]: Session 16 logged out. Waiting for processes to exit. Sep 6 00:11:17.790289 systemd[1]: session-16.scope: Deactivated successfully. Sep 6 00:11:17.793873 systemd-logind[1990]: Removed session 16. Sep 6 00:11:17.812839 systemd[1]: Started sshd@16-172.31.26.146:22-139.178.68.195:43152.service - OpenSSH per-connection server daemon (139.178.68.195:43152). Sep 6 00:11:17.990283 sshd[6505]: Accepted publickey for core from 139.178.68.195 port 43152 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4 Sep 6 00:11:17.993189 sshd[6505]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 00:11:18.001272 systemd-logind[1990]: New session 17 of user core. Sep 6 00:11:18.011442 systemd[1]: Started session-17.scope - Session 17 of User core. 
Sep 6 00:11:18.472409 kubelet[3215]: I0906 00:11:18.472302 3215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-47vdk" podStartSLOduration=47.806172653 podStartE2EDuration="1m7.472278827s" podCreationTimestamp="2025-09-06 00:10:11 +0000 UTC" firstStartedPulling="2025-09-06 00:10:40.165517245 +0000 UTC m=+61.249559921" lastFinishedPulling="2025-09-06 00:10:59.831623419 +0000 UTC m=+80.915666095" observedRunningTime="2025-09-06 00:11:00.370891661 +0000 UTC m=+81.454934349" watchObservedRunningTime="2025-09-06 00:11:18.472278827 +0000 UTC m=+99.556321503" Sep 6 00:11:18.700279 sshd[6505]: pam_unix(sshd:session): session closed for user core Sep 6 00:11:18.708322 systemd[1]: sshd@16-172.31.26.146:22-139.178.68.195:43152.service: Deactivated successfully. Sep 6 00:11:18.712685 systemd[1]: session-17.scope: Deactivated successfully. Sep 6 00:11:18.716514 systemd-logind[1990]: Session 17 logged out. Waiting for processes to exit. Sep 6 00:11:18.719116 systemd-logind[1990]: Removed session 17. Sep 6 00:11:18.739751 systemd[1]: Started sshd@17-172.31.26.146:22-139.178.68.195:43158.service - OpenSSH per-connection server daemon (139.178.68.195:43158). Sep 6 00:11:18.912216 sshd[6540]: Accepted publickey for core from 139.178.68.195 port 43158 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4 Sep 6 00:11:18.914280 sshd[6540]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 00:11:18.923306 systemd-logind[1990]: New session 18 of user core. Sep 6 00:11:18.928565 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 6 00:11:19.101515 kubelet[3215]: I0906 00:11:19.100841 3215 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 6 00:11:20.152441 sshd[6540]: pam_unix(sshd:session): session closed for user core Sep 6 00:11:20.175843 systemd[1]: sshd@17-172.31.26.146:22-139.178.68.195:43158.service: Deactivated successfully. 
Sep 6 00:11:20.184696 systemd[1]: session-18.scope: Deactivated successfully. Sep 6 00:11:20.187120 systemd-logind[1990]: Session 18 logged out. Waiting for processes to exit. Sep 6 00:11:20.207839 systemd[1]: Started sshd@18-172.31.26.146:22-139.178.68.195:37966.service - OpenSSH per-connection server daemon (139.178.68.195:37966). Sep 6 00:11:20.211997 systemd-logind[1990]: Removed session 18. Sep 6 00:11:20.396678 sshd[6563]: Accepted publickey for core from 139.178.68.195 port 37966 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4 Sep 6 00:11:20.399441 sshd[6563]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 00:11:20.407255 systemd-logind[1990]: New session 19 of user core. Sep 6 00:11:20.417504 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 6 00:11:20.964753 sshd[6563]: pam_unix(sshd:session): session closed for user core Sep 6 00:11:20.971010 systemd[1]: session-19.scope: Deactivated successfully. Sep 6 00:11:20.972682 systemd[1]: sshd@18-172.31.26.146:22-139.178.68.195:37966.service: Deactivated successfully. Sep 6 00:11:20.984994 systemd-logind[1990]: Session 19 logged out. Waiting for processes to exit. Sep 6 00:11:21.007773 systemd[1]: Started sshd@19-172.31.26.146:22-139.178.68.195:37970.service - OpenSSH per-connection server daemon (139.178.68.195:37970). Sep 6 00:11:21.011062 systemd-logind[1990]: Removed session 19. Sep 6 00:11:21.184039 sshd[6575]: Accepted publickey for core from 139.178.68.195 port 37970 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4 Sep 6 00:11:21.186760 sshd[6575]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 00:11:21.194823 systemd-logind[1990]: New session 20 of user core. Sep 6 00:11:21.202419 systemd[1]: Started session-20.scope - Session 20 of User core. 
Sep 6 00:11:21.441762 sshd[6575]: pam_unix(sshd:session): session closed for user core Sep 6 00:11:21.452459 systemd[1]: sshd@19-172.31.26.146:22-139.178.68.195:37970.service: Deactivated successfully. Sep 6 00:11:21.459926 systemd[1]: session-20.scope: Deactivated successfully. Sep 6 00:11:21.464923 systemd-logind[1990]: Session 20 logged out. Waiting for processes to exit. Sep 6 00:11:21.467544 systemd-logind[1990]: Removed session 20. Sep 6 00:11:26.486708 systemd[1]: Started sshd@20-172.31.26.146:22-139.178.68.195:37982.service - OpenSSH per-connection server daemon (139.178.68.195:37982). Sep 6 00:11:26.662589 sshd[6628]: Accepted publickey for core from 139.178.68.195 port 37982 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4 Sep 6 00:11:26.665327 sshd[6628]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 00:11:26.674004 systemd-logind[1990]: New session 21 of user core. Sep 6 00:11:26.681434 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 6 00:11:26.913046 sshd[6628]: pam_unix(sshd:session): session closed for user core Sep 6 00:11:26.920877 systemd[1]: sshd@20-172.31.26.146:22-139.178.68.195:37982.service: Deactivated successfully. Sep 6 00:11:26.925277 systemd[1]: session-21.scope: Deactivated successfully. Sep 6 00:11:26.927545 systemd-logind[1990]: Session 21 logged out. Waiting for processes to exit. Sep 6 00:11:26.930659 systemd-logind[1990]: Removed session 21. Sep 6 00:11:31.959062 systemd[1]: Started sshd@21-172.31.26.146:22-139.178.68.195:57840.service - OpenSSH per-connection server daemon (139.178.68.195:57840). Sep 6 00:11:32.128281 sshd[6642]: Accepted publickey for core from 139.178.68.195 port 57840 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4 Sep 6 00:11:32.131178 sshd[6642]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 00:11:32.142243 systemd-logind[1990]: New session 22 of user core. 
Sep 6 00:11:32.144448 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 6 00:11:32.380505 sshd[6642]: pam_unix(sshd:session): session closed for user core Sep 6 00:11:32.386940 systemd[1]: sshd@21-172.31.26.146:22-139.178.68.195:57840.service: Deactivated successfully. Sep 6 00:11:32.392316 systemd[1]: session-22.scope: Deactivated successfully. Sep 6 00:11:32.396736 systemd-logind[1990]: Session 22 logged out. Waiting for processes to exit. Sep 6 00:11:32.399887 systemd-logind[1990]: Removed session 22. Sep 6 00:11:37.426841 systemd[1]: Started sshd@22-172.31.26.146:22-139.178.68.195:57854.service - OpenSSH per-connection server daemon (139.178.68.195:57854). Sep 6 00:11:37.628393 sshd[6677]: Accepted publickey for core from 139.178.68.195 port 57854 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4 Sep 6 00:11:37.631909 sshd[6677]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 00:11:37.646247 systemd-logind[1990]: New session 23 of user core. Sep 6 00:11:37.652490 systemd[1]: Started session-23.scope - Session 23 of User core. Sep 6 00:11:37.953503 sshd[6677]: pam_unix(sshd:session): session closed for user core Sep 6 00:11:37.960786 systemd[1]: sshd@22-172.31.26.146:22-139.178.68.195:57854.service: Deactivated successfully. Sep 6 00:11:37.968129 systemd[1]: session-23.scope: Deactivated successfully. Sep 6 00:11:37.975687 systemd-logind[1990]: Session 23 logged out. Waiting for processes to exit. Sep 6 00:11:37.980196 systemd-logind[1990]: Removed session 23. Sep 6 00:11:41.655343 containerd[2004]: time="2025-09-06T00:11:41.655227107Z" level=info msg="StopPodSandbox for \"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\"" Sep 6 00:11:41.848523 containerd[2004]: 2025-09-06 00:11:41.752 [WARNING][6699] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0", GenerateName:"calico-apiserver-8966f54db-", Namespace:"calico-apiserver", SelfLink:"", UID:"9733554c-54e5-4004-bf21-b3d55e01bf9c", ResourceVersion:"1281", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8966f54db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028", Pod:"calico-apiserver-8966f54db-5528p", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.57.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliaea676958aa", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:11:41.848523 containerd[2004]: 2025-09-06 00:11:41.753 [INFO][6699] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Sep 6 00:11:41.848523 containerd[2004]: 2025-09-06 00:11:41.753 [INFO][6699] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no 
netns name, ignoring. ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" iface="eth0" netns="" Sep 6 00:11:41.848523 containerd[2004]: 2025-09-06 00:11:41.753 [INFO][6699] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Sep 6 00:11:41.848523 containerd[2004]: 2025-09-06 00:11:41.754 [INFO][6699] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Sep 6 00:11:41.848523 containerd[2004]: 2025-09-06 00:11:41.809 [INFO][6707] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" HandleID="k8s-pod-network.e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0" Sep 6 00:11:41.848523 containerd[2004]: 2025-09-06 00:11:41.809 [INFO][6707] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:11:41.848523 containerd[2004]: 2025-09-06 00:11:41.810 [INFO][6707] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:11:41.848523 containerd[2004]: 2025-09-06 00:11:41.833 [WARNING][6707] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" HandleID="k8s-pod-network.e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0" Sep 6 00:11:41.848523 containerd[2004]: 2025-09-06 00:11:41.833 [INFO][6707] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" HandleID="k8s-pod-network.e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0" Sep 6 00:11:41.848523 containerd[2004]: 2025-09-06 00:11:41.837 [INFO][6707] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:11:41.848523 containerd[2004]: 2025-09-06 00:11:41.842 [INFO][6699] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Sep 6 00:11:41.850053 containerd[2004]: time="2025-09-06T00:11:41.848615111Z" level=info msg="TearDown network for sandbox \"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\" successfully" Sep 6 00:11:41.850053 containerd[2004]: time="2025-09-06T00:11:41.848687759Z" level=info msg="StopPodSandbox for \"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\" returns successfully" Sep 6 00:11:41.851448 containerd[2004]: time="2025-09-06T00:11:41.851387543Z" level=info msg="RemovePodSandbox for \"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\"" Sep 6 00:11:41.851535 containerd[2004]: time="2025-09-06T00:11:41.851459819Z" level=info msg="Forcibly stopping sandbox \"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\"" Sep 6 00:11:42.060426 containerd[2004]: 2025-09-06 00:11:41.957 [WARNING][6721] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0", GenerateName:"calico-apiserver-8966f54db-", Namespace:"calico-apiserver", SelfLink:"", UID:"9733554c-54e5-4004-bf21-b3d55e01bf9c", ResourceVersion:"1281", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8966f54db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"75ad5a072fcf26c27b47464626f2e100585334229760817dd8eb45e39df22028", Pod:"calico-apiserver-8966f54db-5528p", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.57.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliaea676958aa", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:11:42.060426 containerd[2004]: 2025-09-06 00:11:41.959 [INFO][6721] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Sep 6 00:11:42.060426 containerd[2004]: 2025-09-06 00:11:41.959 [INFO][6721] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no 
netns name, ignoring. ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" iface="eth0" netns="" Sep 6 00:11:42.060426 containerd[2004]: 2025-09-06 00:11:41.959 [INFO][6721] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Sep 6 00:11:42.060426 containerd[2004]: 2025-09-06 00:11:41.959 [INFO][6721] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Sep 6 00:11:42.060426 containerd[2004]: 2025-09-06 00:11:42.017 [INFO][6728] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" HandleID="k8s-pod-network.e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0" Sep 6 00:11:42.060426 containerd[2004]: 2025-09-06 00:11:42.018 [INFO][6728] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:11:42.060426 containerd[2004]: 2025-09-06 00:11:42.018 [INFO][6728] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:11:42.060426 containerd[2004]: 2025-09-06 00:11:42.047 [WARNING][6728] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" HandleID="k8s-pod-network.e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0" Sep 6 00:11:42.060426 containerd[2004]: 2025-09-06 00:11:42.047 [INFO][6728] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" HandleID="k8s-pod-network.e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--5528p-eth0" Sep 6 00:11:42.060426 containerd[2004]: 2025-09-06 00:11:42.050 [INFO][6728] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:11:42.060426 containerd[2004]: 2025-09-06 00:11:42.055 [INFO][6721] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f" Sep 6 00:11:42.061302 containerd[2004]: time="2025-09-06T00:11:42.060388053Z" level=info msg="TearDown network for sandbox \"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\" successfully" Sep 6 00:11:42.071019 containerd[2004]: time="2025-09-06T00:11:42.070672185Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 6 00:11:42.071019 containerd[2004]: time="2025-09-06T00:11:42.070785813Z" level=info msg="RemovePodSandbox \"e7b7715cfdb3e81cd54ac18f661b5ebe7bc4cf6acaae0bf0ed5b29057e1ab13f\" returns successfully" Sep 6 00:11:42.071871 containerd[2004]: time="2025-09-06T00:11:42.071813877Z" level=info msg="StopPodSandbox for \"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\"" Sep 6 00:11:42.270236 containerd[2004]: 2025-09-06 00:11:42.186 [WARNING][6742] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"44f64470-7851-42aa-8ed3-eb71e8151f7c", ResourceVersion:"1192", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666", Pod:"csi-node-driver-47vdk", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.57.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali14200006449", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:11:42.270236 containerd[2004]: 2025-09-06 00:11:42.187 [INFO][6742] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Sep 6 00:11:42.270236 containerd[2004]: 2025-09-06 00:11:42.187 [INFO][6742] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" iface="eth0" netns="" Sep 6 00:11:42.270236 containerd[2004]: 2025-09-06 00:11:42.187 [INFO][6742] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Sep 6 00:11:42.270236 containerd[2004]: 2025-09-06 00:11:42.187 [INFO][6742] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Sep 6 00:11:42.270236 containerd[2004]: 2025-09-06 00:11:42.245 [INFO][6749] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" HandleID="k8s-pod-network.18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Workload="ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0" Sep 6 00:11:42.270236 containerd[2004]: 2025-09-06 00:11:42.245 [INFO][6749] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:11:42.270236 containerd[2004]: 2025-09-06 00:11:42.245 [INFO][6749] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:11:42.270236 containerd[2004]: 2025-09-06 00:11:42.258 [WARNING][6749] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" HandleID="k8s-pod-network.18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Workload="ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0" Sep 6 00:11:42.270236 containerd[2004]: 2025-09-06 00:11:42.258 [INFO][6749] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" HandleID="k8s-pod-network.18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Workload="ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0" Sep 6 00:11:42.270236 containerd[2004]: 2025-09-06 00:11:42.261 [INFO][6749] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:11:42.270236 containerd[2004]: 2025-09-06 00:11:42.264 [INFO][6742] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Sep 6 00:11:42.270236 containerd[2004]: time="2025-09-06T00:11:42.268330978Z" level=info msg="TearDown network for sandbox \"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\" successfully" Sep 6 00:11:42.270236 containerd[2004]: time="2025-09-06T00:11:42.268368190Z" level=info msg="StopPodSandbox for \"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\" returns successfully" Sep 6 00:11:42.270236 containerd[2004]: time="2025-09-06T00:11:42.269451466Z" level=info msg="RemovePodSandbox for \"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\"" Sep 6 00:11:42.270236 containerd[2004]: time="2025-09-06T00:11:42.269507734Z" level=info msg="Forcibly stopping sandbox \"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\"" Sep 6 00:11:42.436226 containerd[2004]: 2025-09-06 00:11:42.354 [WARNING][6763] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"44f64470-7851-42aa-8ed3-eb71e8151f7c", ResourceVersion:"1192", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"e20bc8a50c36428ac1d827a21ea8c7eb10e97972cf5cea3b331ca4acc51b8666", Pod:"csi-node-driver-47vdk", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.57.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali14200006449", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:11:42.436226 containerd[2004]: 2025-09-06 00:11:42.355 [INFO][6763] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Sep 6 00:11:42.436226 containerd[2004]: 2025-09-06 00:11:42.355 [INFO][6763] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no 
netns name, ignoring. ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" iface="eth0" netns="" Sep 6 00:11:42.436226 containerd[2004]: 2025-09-06 00:11:42.355 [INFO][6763] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Sep 6 00:11:42.436226 containerd[2004]: 2025-09-06 00:11:42.355 [INFO][6763] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Sep 6 00:11:42.436226 containerd[2004]: 2025-09-06 00:11:42.407 [INFO][6770] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" HandleID="k8s-pod-network.18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Workload="ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0" Sep 6 00:11:42.436226 containerd[2004]: 2025-09-06 00:11:42.408 [INFO][6770] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:11:42.436226 containerd[2004]: 2025-09-06 00:11:42.409 [INFO][6770] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:11:42.436226 containerd[2004]: 2025-09-06 00:11:42.426 [WARNING][6770] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" HandleID="k8s-pod-network.18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Workload="ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0" Sep 6 00:11:42.436226 containerd[2004]: 2025-09-06 00:11:42.427 [INFO][6770] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" HandleID="k8s-pod-network.18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Workload="ip--172--31--26--146-k8s-csi--node--driver--47vdk-eth0" Sep 6 00:11:42.436226 containerd[2004]: 2025-09-06 00:11:42.429 [INFO][6770] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:11:42.436226 containerd[2004]: 2025-09-06 00:11:42.433 [INFO][6763] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55" Sep 6 00:11:42.437056 containerd[2004]: time="2025-09-06T00:11:42.436248790Z" level=info msg="TearDown network for sandbox \"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\" successfully" Sep 6 00:11:42.451403 containerd[2004]: time="2025-09-06T00:11:42.451308394Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 6 00:11:42.452095 containerd[2004]: time="2025-09-06T00:11:42.451435090Z" level=info msg="RemovePodSandbox \"18d03f43051b792e5455707d0a5879d3b5724f750e3fcd44365fdd9afa67bc55\" returns successfully" Sep 6 00:11:42.452095 containerd[2004]: time="2025-09-06T00:11:42.452065582Z" level=info msg="StopPodSandbox for \"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\"" Sep 6 00:11:42.641841 containerd[2004]: 2025-09-06 00:11:42.562 [WARNING][6784] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0", GenerateName:"calico-apiserver-8966f54db-", Namespace:"calico-apiserver", SelfLink:"", UID:"6141a3f9-cce3-4c89-a723-17bc0b75f562", ResourceVersion:"1125", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8966f54db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab", Pod:"calico-apiserver-8966f54db-8hj4r", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.57.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia09119fc40b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:11:42.641841 containerd[2004]: 2025-09-06 00:11:42.563 [INFO][6784] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Sep 6 00:11:42.641841 containerd[2004]: 2025-09-06 00:11:42.563 [INFO][6784] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" iface="eth0" netns="" Sep 6 00:11:42.641841 containerd[2004]: 2025-09-06 00:11:42.563 [INFO][6784] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Sep 6 00:11:42.641841 containerd[2004]: 2025-09-06 00:11:42.563 [INFO][6784] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Sep 6 00:11:42.641841 containerd[2004]: 2025-09-06 00:11:42.610 [INFO][6791] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" HandleID="k8s-pod-network.4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0" Sep 6 00:11:42.641841 containerd[2004]: 2025-09-06 00:11:42.611 [INFO][6791] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:11:42.641841 containerd[2004]: 2025-09-06 00:11:42.611 [INFO][6791] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:11:42.641841 containerd[2004]: 2025-09-06 00:11:42.629 [WARNING][6791] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" HandleID="k8s-pod-network.4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0" Sep 6 00:11:42.641841 containerd[2004]: 2025-09-06 00:11:42.629 [INFO][6791] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" HandleID="k8s-pod-network.4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0" Sep 6 00:11:42.641841 containerd[2004]: 2025-09-06 00:11:42.632 [INFO][6791] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:11:42.641841 containerd[2004]: 2025-09-06 00:11:42.636 [INFO][6784] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Sep 6 00:11:42.641841 containerd[2004]: time="2025-09-06T00:11:42.639539591Z" level=info msg="TearDown network for sandbox \"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\" successfully" Sep 6 00:11:42.641841 containerd[2004]: time="2025-09-06T00:11:42.639576719Z" level=info msg="StopPodSandbox for \"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\" returns successfully" Sep 6 00:11:42.645004 containerd[2004]: time="2025-09-06T00:11:42.643794599Z" level=info msg="RemovePodSandbox for \"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\"" Sep 6 00:11:42.645004 containerd[2004]: time="2025-09-06T00:11:42.644488343Z" level=info msg="Forcibly stopping sandbox \"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\"" Sep 6 00:11:42.812173 containerd[2004]: 2025-09-06 00:11:42.733 [WARNING][6805] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0", GenerateName:"calico-apiserver-8966f54db-", Namespace:"calico-apiserver", SelfLink:"", UID:"6141a3f9-cce3-4c89-a723-17bc0b75f562", ResourceVersion:"1125", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8966f54db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"b2a40f6eea36aa0b539c5a04cc6e896531a59388e17e208a70c4312448fe5dab", Pod:"calico-apiserver-8966f54db-8hj4r", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.57.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia09119fc40b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:11:42.812173 containerd[2004]: 2025-09-06 00:11:42.733 [INFO][6805] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Sep 6 00:11:42.812173 containerd[2004]: 2025-09-06 00:11:42.733 [INFO][6805] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no 
netns name, ignoring. ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" iface="eth0" netns="" Sep 6 00:11:42.812173 containerd[2004]: 2025-09-06 00:11:42.733 [INFO][6805] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Sep 6 00:11:42.812173 containerd[2004]: 2025-09-06 00:11:42.733 [INFO][6805] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Sep 6 00:11:42.812173 containerd[2004]: 2025-09-06 00:11:42.787 [INFO][6812] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" HandleID="k8s-pod-network.4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0" Sep 6 00:11:42.812173 containerd[2004]: 2025-09-06 00:11:42.787 [INFO][6812] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:11:42.812173 containerd[2004]: 2025-09-06 00:11:42.787 [INFO][6812] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:11:42.812173 containerd[2004]: 2025-09-06 00:11:42.800 [WARNING][6812] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" HandleID="k8s-pod-network.4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0" Sep 6 00:11:42.812173 containerd[2004]: 2025-09-06 00:11:42.800 [INFO][6812] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" HandleID="k8s-pod-network.4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Workload="ip--172--31--26--146-k8s-calico--apiserver--8966f54db--8hj4r-eth0" Sep 6 00:11:42.812173 containerd[2004]: 2025-09-06 00:11:42.803 [INFO][6812] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:11:42.812173 containerd[2004]: 2025-09-06 00:11:42.807 [INFO][6805] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4" Sep 6 00:11:42.812173 containerd[2004]: time="2025-09-06T00:11:42.811738908Z" level=info msg="TearDown network for sandbox \"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\" successfully" Sep 6 00:11:42.821900 containerd[2004]: time="2025-09-06T00:11:42.821419932Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 6 00:11:42.821900 containerd[2004]: time="2025-09-06T00:11:42.821536632Z" level=info msg="RemovePodSandbox \"4eb82306fb731de92488f3eaedfd54e8282ec37cf23d92f07c7cca386ab36ab4\" returns successfully" Sep 6 00:11:42.822865 containerd[2004]: time="2025-09-06T00:11:42.822403008Z" level=info msg="StopPodSandbox for \"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\"" Sep 6 00:11:42.999090 systemd[1]: Started sshd@23-172.31.26.146:22-139.178.68.195:49012.service - OpenSSH per-connection server daemon (139.178.68.195:49012). Sep 6 00:11:43.022778 containerd[2004]: 2025-09-06 00:11:42.899 [WARNING][6826] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0", GenerateName:"calico-kube-controllers-5cf8b65958-", Namespace:"calico-system", SelfLink:"", UID:"aeae33f4-c2c1-4f17-a617-26feec2636ec", ResourceVersion:"1164", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5cf8b65958", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee", 
Pod:"calico-kube-controllers-5cf8b65958-8qctx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.57.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calia8ac445ca91", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:11:43.022778 containerd[2004]: 2025-09-06 00:11:42.900 [INFO][6826] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Sep 6 00:11:43.022778 containerd[2004]: 2025-09-06 00:11:42.900 [INFO][6826] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" iface="eth0" netns="" Sep 6 00:11:43.022778 containerd[2004]: 2025-09-06 00:11:42.900 [INFO][6826] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Sep 6 00:11:43.022778 containerd[2004]: 2025-09-06 00:11:42.900 [INFO][6826] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Sep 6 00:11:43.022778 containerd[2004]: 2025-09-06 00:11:42.953 [INFO][6833] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" HandleID="k8s-pod-network.43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Workload="ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0" Sep 6 00:11:43.022778 containerd[2004]: 2025-09-06 00:11:42.954 [INFO][6833] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:11:43.022778 containerd[2004]: 2025-09-06 00:11:42.954 [INFO][6833] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 6 00:11:43.022778 containerd[2004]: 2025-09-06 00:11:42.994 [WARNING][6833] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" HandleID="k8s-pod-network.43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Workload="ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0" Sep 6 00:11:43.022778 containerd[2004]: 2025-09-06 00:11:42.994 [INFO][6833] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" HandleID="k8s-pod-network.43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Workload="ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0" Sep 6 00:11:43.022778 containerd[2004]: 2025-09-06 00:11:43.008 [INFO][6833] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:11:43.022778 containerd[2004]: 2025-09-06 00:11:43.017 [INFO][6826] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Sep 6 00:11:43.023971 containerd[2004]: time="2025-09-06T00:11:43.022802649Z" level=info msg="TearDown network for sandbox \"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\" successfully" Sep 6 00:11:43.023971 containerd[2004]: time="2025-09-06T00:11:43.022844697Z" level=info msg="StopPodSandbox for \"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\" returns successfully" Sep 6 00:11:43.028316 containerd[2004]: time="2025-09-06T00:11:43.026006361Z" level=info msg="RemovePodSandbox for \"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\"" Sep 6 00:11:43.028316 containerd[2004]: time="2025-09-06T00:11:43.026063973Z" level=info msg="Forcibly stopping sandbox \"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\"" Sep 6 00:11:43.231189 sshd[6840]: Accepted publickey for core from 139.178.68.195 port 49012 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4 Sep 6 00:11:43.232990 sshd[6840]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 00:11:43.254575 systemd-logind[1990]: New session 24 of user core. Sep 6 00:11:43.264432 systemd[1]: Started session-24.scope - Session 24 of User core. Sep 6 00:11:43.336206 containerd[2004]: 2025-09-06 00:11:43.177 [WARNING][6849] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0", GenerateName:"calico-kube-controllers-5cf8b65958-", Namespace:"calico-system", SelfLink:"", UID:"aeae33f4-c2c1-4f17-a617-26feec2636ec", ResourceVersion:"1164", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 10, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5cf8b65958", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"fee81c818d15912a26776f5f4c1456f7f1266d4dd891293b69747019002169ee", Pod:"calico-kube-controllers-5cf8b65958-8qctx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.57.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calia8ac445ca91", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:11:43.336206 containerd[2004]: 2025-09-06 00:11:43.177 [INFO][6849] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Sep 6 00:11:43.336206 containerd[2004]: 2025-09-06 00:11:43.177 [INFO][6849] cni-plugin/dataplane_linux.go 555: 
CleanUpNamespace called with no netns name, ignoring. ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" iface="eth0" netns="" Sep 6 00:11:43.336206 containerd[2004]: 2025-09-06 00:11:43.177 [INFO][6849] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Sep 6 00:11:43.336206 containerd[2004]: 2025-09-06 00:11:43.177 [INFO][6849] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Sep 6 00:11:43.336206 containerd[2004]: 2025-09-06 00:11:43.282 [INFO][6857] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" HandleID="k8s-pod-network.43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Workload="ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0" Sep 6 00:11:43.336206 containerd[2004]: 2025-09-06 00:11:43.282 [INFO][6857] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:11:43.336206 containerd[2004]: 2025-09-06 00:11:43.282 [INFO][6857] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:11:43.336206 containerd[2004]: 2025-09-06 00:11:43.323 [WARNING][6857] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" HandleID="k8s-pod-network.43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Workload="ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0" Sep 6 00:11:43.336206 containerd[2004]: 2025-09-06 00:11:43.324 [INFO][6857] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" HandleID="k8s-pod-network.43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Workload="ip--172--31--26--146-k8s-calico--kube--controllers--5cf8b65958--8qctx-eth0" Sep 6 00:11:43.336206 containerd[2004]: 2025-09-06 00:11:43.328 [INFO][6857] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:11:43.336206 containerd[2004]: 2025-09-06 00:11:43.332 [INFO][6849] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0" Sep 6 00:11:43.336206 containerd[2004]: time="2025-09-06T00:11:43.335807531Z" level=info msg="TearDown network for sandbox \"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\" successfully" Sep 6 00:11:43.348340 containerd[2004]: time="2025-09-06T00:11:43.346716611Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 6 00:11:43.348340 containerd[2004]: time="2025-09-06T00:11:43.346887827Z" level=info msg="RemovePodSandbox \"43e541cb34d867701054006e768953ef29780699246e345695891bcde06a48d0\" returns successfully" Sep 6 00:11:43.348340 containerd[2004]: time="2025-09-06T00:11:43.347621375Z" level=info msg="StopPodSandbox for \"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\"" Sep 6 00:11:43.646047 sshd[6840]: pam_unix(sshd:session): session closed for user core Sep 6 00:11:43.657376 containerd[2004]: 2025-09-06 00:11:43.517 [WARNING][6872] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"b8081eb5-874c-4704-87a1-8d99ed0c3d28", ResourceVersion:"1037", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 9, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1", Pod:"coredns-668d6bf9bc-cfg8t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.57.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", 
"ksa.kube-system.coredns"}, InterfaceName:"calib8952826cc4", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:11:43.657376 containerd[2004]: 2025-09-06 00:11:43.519 [INFO][6872] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Sep 6 00:11:43.657376 containerd[2004]: 2025-09-06 00:11:43.519 [INFO][6872] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" iface="eth0" netns="" Sep 6 00:11:43.657376 containerd[2004]: 2025-09-06 00:11:43.519 [INFO][6872] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Sep 6 00:11:43.657376 containerd[2004]: 2025-09-06 00:11:43.519 [INFO][6872] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Sep 6 00:11:43.657376 containerd[2004]: 2025-09-06 00:11:43.614 [INFO][6885] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" HandleID="k8s-pod-network.20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0" Sep 6 00:11:43.657376 containerd[2004]: 2025-09-06 00:11:43.615 [INFO][6885] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 6 00:11:43.657376 containerd[2004]: 2025-09-06 00:11:43.615 [INFO][6885] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:11:43.657376 containerd[2004]: 2025-09-06 00:11:43.638 [WARNING][6885] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" HandleID="k8s-pod-network.20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0" Sep 6 00:11:43.657376 containerd[2004]: 2025-09-06 00:11:43.638 [INFO][6885] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" HandleID="k8s-pod-network.20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0" Sep 6 00:11:43.657376 containerd[2004]: 2025-09-06 00:11:43.641 [INFO][6885] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:11:43.657376 containerd[2004]: 2025-09-06 00:11:43.649 [INFO][6872] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Sep 6 00:11:43.657376 containerd[2004]: time="2025-09-06T00:11:43.657170208Z" level=info msg="TearDown network for sandbox \"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\" successfully" Sep 6 00:11:43.657376 containerd[2004]: time="2025-09-06T00:11:43.657209364Z" level=info msg="StopPodSandbox for \"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\" returns successfully" Sep 6 00:11:43.659016 systemd[1]: sshd@23-172.31.26.146:22-139.178.68.195:49012.service: Deactivated successfully. 
Sep 6 00:11:43.668542 containerd[2004]: time="2025-09-06T00:11:43.666764593Z" level=info msg="RemovePodSandbox for \"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\"" Sep 6 00:11:43.668542 containerd[2004]: time="2025-09-06T00:11:43.667047469Z" level=info msg="Forcibly stopping sandbox \"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\"" Sep 6 00:11:43.669434 systemd[1]: session-24.scope: Deactivated successfully. Sep 6 00:11:43.679402 systemd-logind[1990]: Session 24 logged out. Waiting for processes to exit. Sep 6 00:11:43.685727 systemd-logind[1990]: Removed session 24. Sep 6 00:11:43.894929 containerd[2004]: 2025-09-06 00:11:43.785 [WARNING][6901] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"b8081eb5-874c-4704-87a1-8d99ed0c3d28", ResourceVersion:"1037", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 0, 9, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-146", ContainerID:"7c4c3d82efa228940ae68957804a5c1c549c108025b262e1b5ebb312236c3ff1", Pod:"coredns-668d6bf9bc-cfg8t", Endpoint:"eth0", ServiceAccountName:"coredns", 
IPNetworks:[]string{"192.168.57.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib8952826cc4", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:11:43.894929 containerd[2004]: 2025-09-06 00:11:43.786 [INFO][6901] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Sep 6 00:11:43.894929 containerd[2004]: 2025-09-06 00:11:43.786 [INFO][6901] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" iface="eth0" netns="" Sep 6 00:11:43.894929 containerd[2004]: 2025-09-06 00:11:43.786 [INFO][6901] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Sep 6 00:11:43.894929 containerd[2004]: 2025-09-06 00:11:43.786 [INFO][6901] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Sep 6 00:11:43.894929 containerd[2004]: 2025-09-06 00:11:43.865 [INFO][6908] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" HandleID="k8s-pod-network.20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0" Sep 6 00:11:43.894929 containerd[2004]: 2025-09-06 00:11:43.865 [INFO][6908] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:11:43.894929 containerd[2004]: 2025-09-06 00:11:43.866 [INFO][6908] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:11:43.894929 containerd[2004]: 2025-09-06 00:11:43.881 [WARNING][6908] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" HandleID="k8s-pod-network.20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0" Sep 6 00:11:43.894929 containerd[2004]: 2025-09-06 00:11:43.881 [INFO][6908] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" HandleID="k8s-pod-network.20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Workload="ip--172--31--26--146-k8s-coredns--668d6bf9bc--cfg8t-eth0" Sep 6 00:11:43.894929 containerd[2004]: 2025-09-06 00:11:43.885 [INFO][6908] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:11:43.894929 containerd[2004]: 2025-09-06 00:11:43.890 [INFO][6901] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769" Sep 6 00:11:43.894929 containerd[2004]: time="2025-09-06T00:11:43.894244130Z" level=info msg="TearDown network for sandbox \"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\" successfully" Sep 6 00:11:43.938268 containerd[2004]: time="2025-09-06T00:11:43.904955162Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 6 00:11:43.938268 containerd[2004]: time="2025-09-06T00:11:43.937516106Z" level=info msg="RemovePodSandbox \"20e4211b6acfc6c5406ea80e120047458cfe1348f43c20a7758b76a153c56769\" returns successfully" Sep 6 00:11:48.326425 systemd[1]: run-containerd-runc-k8s.io-e3fc1f2ba4b5a887e4ec847195dd3094df34adc6d99bbc2b0012fe091340d130-runc.GawyiU.mount: Deactivated successfully. 
Sep 6 00:11:48.692392 systemd[1]: Started sshd@24-172.31.26.146:22-139.178.68.195:49020.service - OpenSSH per-connection server daemon (139.178.68.195:49020). Sep 6 00:11:48.881310 sshd[6938]: Accepted publickey for core from 139.178.68.195 port 49020 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4 Sep 6 00:11:48.884776 sshd[6938]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 00:11:48.896931 systemd-logind[1990]: New session 25 of user core. Sep 6 00:11:48.905129 systemd[1]: Started session-25.scope - Session 25 of User core. Sep 6 00:11:49.185102 sshd[6938]: pam_unix(sshd:session): session closed for user core Sep 6 00:11:49.192007 systemd[1]: sshd@24-172.31.26.146:22-139.178.68.195:49020.service: Deactivated successfully. Sep 6 00:11:49.192612 systemd-logind[1990]: Session 25 logged out. Waiting for processes to exit. Sep 6 00:11:49.198565 systemd[1]: session-25.scope: Deactivated successfully. Sep 6 00:11:49.205252 systemd-logind[1990]: Removed session 25. Sep 6 00:11:54.230360 systemd[1]: Started sshd@25-172.31.26.146:22-139.178.68.195:57526.service - OpenSSH per-connection server daemon (139.178.68.195:57526). Sep 6 00:11:54.414177 sshd[6951]: Accepted publickey for core from 139.178.68.195 port 57526 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4 Sep 6 00:11:54.419688 sshd[6951]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 00:11:54.431240 systemd-logind[1990]: New session 26 of user core. Sep 6 00:11:54.438477 systemd[1]: Started session-26.scope - Session 26 of User core. Sep 6 00:11:54.788781 sshd[6951]: pam_unix(sshd:session): session closed for user core Sep 6 00:11:54.797645 systemd[1]: sshd@25-172.31.26.146:22-139.178.68.195:57526.service: Deactivated successfully. Sep 6 00:11:54.803754 systemd[1]: session-26.scope: Deactivated successfully. Sep 6 00:11:54.811554 systemd-logind[1990]: Session 26 logged out. Waiting for processes to exit. 
Sep 6 00:11:54.814883 systemd-logind[1990]: Removed session 26. Sep 6 00:11:56.391909 systemd[1]: run-containerd-runc-k8s.io-de5b97d504c94d926aaaeadeff59e3986ea2e26ef69a38e721b6172390d725a4-runc.XGfE2U.mount: Deactivated successfully. Sep 6 00:11:59.827948 systemd[1]: Started sshd@26-172.31.26.146:22-139.178.68.195:57532.service - OpenSSH per-connection server daemon (139.178.68.195:57532). Sep 6 00:12:00.027103 sshd[6989]: Accepted publickey for core from 139.178.68.195 port 57532 ssh2: RSA SHA256:sl+bjk3Rgvn/X7v/FsbKE7TCB1XfAe19Rc7ZHoWViq4 Sep 6 00:12:00.030455 sshd[6989]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 00:12:00.052874 systemd-logind[1990]: New session 27 of user core. Sep 6 00:12:00.057511 systemd[1]: Started session-27.scope - Session 27 of User core. Sep 6 00:12:00.364489 sshd[6989]: pam_unix(sshd:session): session closed for user core Sep 6 00:12:00.374079 systemd[1]: sshd@26-172.31.26.146:22-139.178.68.195:57532.service: Deactivated successfully. Sep 6 00:12:00.374545 systemd-logind[1990]: Session 27 logged out. Waiting for processes to exit. Sep 6 00:12:00.381894 systemd[1]: session-27.scope: Deactivated successfully. Sep 6 00:12:00.386597 systemd-logind[1990]: Removed session 27. Sep 6 00:12:14.202352 systemd[1]: cri-containerd-2403fe02943a750287f90ef75b89f1000e2077d55621bbfe678b80caa126bafc.scope: Deactivated successfully. Sep 6 00:12:14.205328 systemd[1]: cri-containerd-2403fe02943a750287f90ef75b89f1000e2077d55621bbfe678b80caa126bafc.scope: Consumed 5.505s CPU time, 19.4M memory peak, 0B memory swap peak. Sep 6 00:12:14.256920 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2403fe02943a750287f90ef75b89f1000e2077d55621bbfe678b80caa126bafc-rootfs.mount: Deactivated successfully. 
Sep 6 00:12:14.268748 containerd[2004]: time="2025-09-06T00:12:14.249750700Z" level=info msg="shim disconnected" id=2403fe02943a750287f90ef75b89f1000e2077d55621bbfe678b80caa126bafc namespace=k8s.io
Sep 6 00:12:14.269743 containerd[2004]: time="2025-09-06T00:12:14.269415929Z" level=warning msg="cleaning up after shim disconnected" id=2403fe02943a750287f90ef75b89f1000e2077d55621bbfe678b80caa126bafc namespace=k8s.io
Sep 6 00:12:14.269743 containerd[2004]: time="2025-09-06T00:12:14.269460317Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 6 00:12:14.671511 kubelet[3215]: I0906 00:12:14.670836 3215 scope.go:117] "RemoveContainer" containerID="2403fe02943a750287f90ef75b89f1000e2077d55621bbfe678b80caa126bafc"
Sep 6 00:12:14.675398 containerd[2004]: time="2025-09-06T00:12:14.675338083Z" level=info msg="CreateContainer within sandbox \"32936cbb319ad4e2061e52c18fc4b0bc4d8cba0e3566da8cc0d3454a3c07f436\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Sep 6 00:12:14.706352 containerd[2004]: time="2025-09-06T00:12:14.706286635Z" level=info msg="CreateContainer within sandbox \"32936cbb319ad4e2061e52c18fc4b0bc4d8cba0e3566da8cc0d3454a3c07f436\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"642065e49e907f34c1364832dc110bbaf20cf1359e7b46aec45bddf029d1cb94\""
Sep 6 00:12:14.708409 containerd[2004]: time="2025-09-06T00:12:14.707086111Z" level=info msg="StartContainer for \"642065e49e907f34c1364832dc110bbaf20cf1359e7b46aec45bddf029d1cb94\""
Sep 6 00:12:14.769945 systemd[1]: Started cri-containerd-642065e49e907f34c1364832dc110bbaf20cf1359e7b46aec45bddf029d1cb94.scope - libcontainer container 642065e49e907f34c1364832dc110bbaf20cf1359e7b46aec45bddf029d1cb94.
Sep 6 00:12:14.782194 systemd[1]: cri-containerd-103f699aa9e6cd9701dfea9fe949818745ab9cc5ff8045de4fbce51339bc8b53.scope: Deactivated successfully.
Sep 6 00:12:14.784672 systemd[1]: cri-containerd-103f699aa9e6cd9701dfea9fe949818745ab9cc5ff8045de4fbce51339bc8b53.scope: Consumed 24.478s CPU time.
Sep 6 00:12:14.832773 containerd[2004]: time="2025-09-06T00:12:14.832677079Z" level=info msg="shim disconnected" id=103f699aa9e6cd9701dfea9fe949818745ab9cc5ff8045de4fbce51339bc8b53 namespace=k8s.io
Sep 6 00:12:14.833192 containerd[2004]: time="2025-09-06T00:12:14.833025727Z" level=warning msg="cleaning up after shim disconnected" id=103f699aa9e6cd9701dfea9fe949818745ab9cc5ff8045de4fbce51339bc8b53 namespace=k8s.io
Sep 6 00:12:14.833192 containerd[2004]: time="2025-09-06T00:12:14.833094655Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 6 00:12:14.882473 containerd[2004]: time="2025-09-06T00:12:14.882349796Z" level=info msg="StartContainer for \"642065e49e907f34c1364832dc110bbaf20cf1359e7b46aec45bddf029d1cb94\" returns successfully"
Sep 6 00:12:15.257536 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-103f699aa9e6cd9701dfea9fe949818745ab9cc5ff8045de4fbce51339bc8b53-rootfs.mount: Deactivated successfully.
Sep 6 00:12:15.681248 kubelet[3215]: I0906 00:12:15.680682 3215 scope.go:117] "RemoveContainer" containerID="2c2d00f596b4a80b18413174032d23e7dcf3656e9bfd0fe9fe1b07b7a5063794"
Sep 6 00:12:15.682117 kubelet[3215]: I0906 00:12:15.681330 3215 scope.go:117] "RemoveContainer" containerID="103f699aa9e6cd9701dfea9fe949818745ab9cc5ff8045de4fbce51339bc8b53"
Sep 6 00:12:15.682117 kubelet[3215]: E0906 00:12:15.681582 3215 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-755d956888-sqtsm_tigera-operator(6760618e-ab62-4b6e-93ea-2e4c636d64b7)\"" pod="tigera-operator/tigera-operator-755d956888-sqtsm" podUID="6760618e-ab62-4b6e-93ea-2e4c636d64b7"
Sep 6 00:12:15.685622 containerd[2004]: time="2025-09-06T00:12:15.685016360Z" level=info msg="RemoveContainer for \"2c2d00f596b4a80b18413174032d23e7dcf3656e9bfd0fe9fe1b07b7a5063794\""
Sep 6 00:12:15.692487 containerd[2004]: time="2025-09-06T00:12:15.692286140Z" level=info msg="RemoveContainer for \"2c2d00f596b4a80b18413174032d23e7dcf3656e9bfd0fe9fe1b07b7a5063794\" returns successfully"
Sep 6 00:12:19.568515 systemd[1]: cri-containerd-41447c17e0c91902d3a07e03efb4c0374a6a9b2c2f1a3b6bc2df6f760548c7ea.scope: Deactivated successfully.
Sep 6 00:12:19.570556 systemd[1]: cri-containerd-41447c17e0c91902d3a07e03efb4c0374a6a9b2c2f1a3b6bc2df6f760548c7ea.scope: Consumed 4.298s CPU time, 15.5M memory peak, 0B memory swap peak.
Sep 6 00:12:19.609977 containerd[2004]: time="2025-09-06T00:12:19.609883043Z" level=info msg="shim disconnected" id=41447c17e0c91902d3a07e03efb4c0374a6a9b2c2f1a3b6bc2df6f760548c7ea namespace=k8s.io
Sep 6 00:12:19.612746 containerd[2004]: time="2025-09-06T00:12:19.612239987Z" level=warning msg="cleaning up after shim disconnected" id=41447c17e0c91902d3a07e03efb4c0374a6a9b2c2f1a3b6bc2df6f760548c7ea namespace=k8s.io
Sep 6 00:12:19.612746 containerd[2004]: time="2025-09-06T00:12:19.612311051Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 6 00:12:19.614518 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-41447c17e0c91902d3a07e03efb4c0374a6a9b2c2f1a3b6bc2df6f760548c7ea-rootfs.mount: Deactivated successfully.
Sep 6 00:12:19.699821 kubelet[3215]: I0906 00:12:19.699784 3215 scope.go:117] "RemoveContainer" containerID="41447c17e0c91902d3a07e03efb4c0374a6a9b2c2f1a3b6bc2df6f760548c7ea"
Sep 6 00:12:19.719750 containerd[2004]: time="2025-09-06T00:12:19.719673720Z" level=info msg="CreateContainer within sandbox \"cabad326bb4216ef82da9132d395cf591b0cf6cfa27dcd27538e185b019fff00\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Sep 6 00:12:19.746046 containerd[2004]: time="2025-09-06T00:12:19.745868220Z" level=info msg="CreateContainer within sandbox \"cabad326bb4216ef82da9132d395cf591b0cf6cfa27dcd27538e185b019fff00\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"8aa7fb17bc6e072055675c8fe2d02f5b403917f5dcf8044e7371bd315de04867\""
Sep 6 00:12:19.747184 containerd[2004]: time="2025-09-06T00:12:19.746656692Z" level=info msg="StartContainer for \"8aa7fb17bc6e072055675c8fe2d02f5b403917f5dcf8044e7371bd315de04867\""
Sep 6 00:12:19.808474 systemd[1]: Started cri-containerd-8aa7fb17bc6e072055675c8fe2d02f5b403917f5dcf8044e7371bd315de04867.scope - libcontainer container 8aa7fb17bc6e072055675c8fe2d02f5b403917f5dcf8044e7371bd315de04867.
Sep 6 00:12:19.876258 containerd[2004]: time="2025-09-06T00:12:19.876092364Z" level=info msg="StartContainer for \"8aa7fb17bc6e072055675c8fe2d02f5b403917f5dcf8044e7371bd315de04867\" returns successfully"
Sep 6 00:12:22.211974 kubelet[3215]: E0906 00:12:22.210854 3215 controller.go:195] "Failed to update lease" err="Put \"https://172.31.26.146:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-26-146?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Sep 6 00:12:26.349682 systemd[1]: run-containerd-runc-k8s.io-de5b97d504c94d926aaaeadeff59e3986ea2e26ef69a38e721b6172390d725a4-runc.NEnytk.mount: Deactivated successfully.
Sep 6 00:12:29.409360 kubelet[3215]: I0906 00:12:29.409222 3215 scope.go:117] "RemoveContainer" containerID="103f699aa9e6cd9701dfea9fe949818745ab9cc5ff8045de4fbce51339bc8b53"
Sep 6 00:12:29.420766 containerd[2004]: time="2025-09-06T00:12:29.417807560Z" level=info msg="CreateContainer within sandbox \"07e9e9e10e7edf5ded3d7d65c0de47fba40a8fcf899df1f5b0da1ae9993b9f1f\" for container &ContainerMetadata{Name:tigera-operator,Attempt:2,}"
Sep 6 00:12:29.450171 containerd[2004]: time="2025-09-06T00:12:29.448192664Z" level=info msg="CreateContainer within sandbox \"07e9e9e10e7edf5ded3d7d65c0de47fba40a8fcf899df1f5b0da1ae9993b9f1f\" for &ContainerMetadata{Name:tigera-operator,Attempt:2,} returns container id \"d4d26c7df4a1a38f79c949fbcbeba9e93d37ee468b63eadf87459cbf26d06fb7\""
Sep 6 00:12:29.451217 containerd[2004]: time="2025-09-06T00:12:29.451077872Z" level=info msg="StartContainer for \"d4d26c7df4a1a38f79c949fbcbeba9e93d37ee468b63eadf87459cbf26d06fb7\""
Sep 6 00:12:29.513425 systemd[1]: run-containerd-runc-k8s.io-d4d26c7df4a1a38f79c949fbcbeba9e93d37ee468b63eadf87459cbf26d06fb7-runc.n6G1Xk.mount: Deactivated successfully.
Sep 6 00:12:29.524476 systemd[1]: Started cri-containerd-d4d26c7df4a1a38f79c949fbcbeba9e93d37ee468b63eadf87459cbf26d06fb7.scope - libcontainer container d4d26c7df4a1a38f79c949fbcbeba9e93d37ee468b63eadf87459cbf26d06fb7.
Sep 6 00:12:29.572821 containerd[2004]: time="2025-09-06T00:12:29.572655261Z" level=info msg="StartContainer for \"d4d26c7df4a1a38f79c949fbcbeba9e93d37ee468b63eadf87459cbf26d06fb7\" returns successfully"
Sep 6 00:12:32.211328 kubelet[3215]: E0906 00:12:32.211242 3215 controller.go:195] "Failed to update lease" err="Put \"https://172.31.26.146:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-26-146?timeout=10s\": context deadline exceeded"