Mar 7 00:54:26.300946 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd083] Mar 7 00:54:26.301007 kernel: Linux version 6.6.127-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT Fri Mar 6 22:59:59 -00 2026 Mar 7 00:54:26.301038 kernel: KASLR disabled due to lack of seed Mar 7 00:54:26.301057 kernel: efi: EFI v2.7 by EDK II Mar 7 00:54:26.301073 kernel: efi: SMBIOS=0x7bed0000 SMBIOS 3.0=0x7beb0000 ACPI=0x786e0000 ACPI 2.0=0x786e0014 MEMATTR=0x7b001a98 MEMRESERVE=0x7852ee18 Mar 7 00:54:26.301091 kernel: ACPI: Early table checksum verification disabled Mar 7 00:54:26.301110 kernel: ACPI: RSDP 0x00000000786E0014 000024 (v02 AMAZON) Mar 7 00:54:26.301126 kernel: ACPI: XSDT 0x00000000786D00E8 000064 (v01 AMAZON AMZNFACP 00000001 01000013) Mar 7 00:54:26.301143 kernel: ACPI: FACP 0x00000000786B0000 000114 (v06 AMAZON AMZNFACP 00000001 AMZN 00000001) Mar 7 00:54:26.301159 kernel: ACPI: DSDT 0x0000000078640000 0013D2 (v02 AMAZON AMZNDSDT 00000001 AMZN 00000001) Mar 7 00:54:26.301184 kernel: ACPI: APIC 0x00000000786C0000 000108 (v04 AMAZON AMZNAPIC 00000001 AMZN 00000001) Mar 7 00:54:26.301201 kernel: ACPI: SPCR 0x00000000786A0000 000050 (v02 AMAZON AMZNSPCR 00000001 AMZN 00000001) Mar 7 00:54:26.301218 kernel: ACPI: GTDT 0x0000000078690000 000060 (v02 AMAZON AMZNGTDT 00000001 AMZN 00000001) Mar 7 00:54:26.301235 kernel: ACPI: MCFG 0x0000000078680000 00003C (v02 AMAZON AMZNMCFG 00000001 AMZN 00000001) Mar 7 00:54:26.301255 kernel: ACPI: SLIT 0x0000000078670000 00002D (v01 AMAZON AMZNSLIT 00000001 AMZN 00000001) Mar 7 00:54:26.301277 kernel: ACPI: IORT 0x0000000078660000 000078 (v01 AMAZON AMZNIORT 00000001 AMZN 00000001) Mar 7 00:54:26.301296 kernel: ACPI: PPTT 0x0000000078650000 0000EC (v01 AMAZON AMZNPPTT 00000001 AMZN 00000001) Mar 7 00:54:26.301313 kernel: ACPI: SPCR: console: uart,mmio,0x90a0000,115200 Mar 7 00:54:26.301363 kernel: 
earlycon: uart0 at MMIO 0x00000000090a0000 (options '115200') Mar 7 00:54:26.301384 kernel: printk: bootconsole [uart0] enabled Mar 7 00:54:26.301402 kernel: NUMA: Failed to initialise from firmware Mar 7 00:54:26.301421 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000004b5ffffff] Mar 7 00:54:26.301440 kernel: NUMA: NODE_DATA [mem 0x4b583f800-0x4b5844fff] Mar 7 00:54:26.301459 kernel: Zone ranges: Mar 7 00:54:26.301477 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] Mar 7 00:54:26.301495 kernel: DMA32 empty Mar 7 00:54:26.301521 kernel: Normal [mem 0x0000000100000000-0x00000004b5ffffff] Mar 7 00:54:26.301539 kernel: Movable zone start for each node Mar 7 00:54:26.301556 kernel: Early memory node ranges Mar 7 00:54:26.301574 kernel: node 0: [mem 0x0000000040000000-0x000000007862ffff] Mar 7 00:54:26.301591 kernel: node 0: [mem 0x0000000078630000-0x000000007863ffff] Mar 7 00:54:26.301609 kernel: node 0: [mem 0x0000000078640000-0x00000000786effff] Mar 7 00:54:26.301629 kernel: node 0: [mem 0x00000000786f0000-0x000000007872ffff] Mar 7 00:54:26.301646 kernel: node 0: [mem 0x0000000078730000-0x000000007bbfffff] Mar 7 00:54:26.301663 kernel: node 0: [mem 0x000000007bc00000-0x000000007bfdffff] Mar 7 00:54:26.303883 kernel: node 0: [mem 0x000000007bfe0000-0x000000007fffffff] Mar 7 00:54:26.303933 kernel: node 0: [mem 0x0000000400000000-0x00000004b5ffffff] Mar 7 00:54:26.303953 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000004b5ffffff] Mar 7 00:54:26.303984 kernel: On node 0, zone Normal: 8192 pages in unavailable ranges Mar 7 00:54:26.304004 kernel: psci: probing for conduit method from ACPI. Mar 7 00:54:26.304031 kernel: psci: PSCIv1.0 detected in firmware. 
Mar 7 00:54:26.304052 kernel: psci: Using standard PSCI v0.2 function IDs Mar 7 00:54:26.304071 kernel: psci: Trusted OS migration not required Mar 7 00:54:26.304099 kernel: psci: SMC Calling Convention v1.1 Mar 7 00:54:26.304122 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000001) Mar 7 00:54:26.304145 kernel: percpu: Embedded 30 pages/cpu s85736 r8192 d28952 u122880 Mar 7 00:54:26.304164 kernel: pcpu-alloc: s85736 r8192 d28952 u122880 alloc=30*4096 Mar 7 00:54:26.304185 kernel: pcpu-alloc: [0] 0 [0] 1 Mar 7 00:54:26.304205 kernel: Detected PIPT I-cache on CPU0 Mar 7 00:54:26.304228 kernel: CPU features: detected: GIC system register CPU interface Mar 7 00:54:26.304248 kernel: CPU features: detected: Spectre-v2 Mar 7 00:54:26.304270 kernel: CPU features: detected: Spectre-v3a Mar 7 00:54:26.304289 kernel: CPU features: detected: Spectre-BHB Mar 7 00:54:26.304312 kernel: CPU features: detected: ARM erratum 1742098 Mar 7 00:54:26.304343 kernel: CPU features: detected: ARM errata 1165522, 1319367, or 1530923 Mar 7 00:54:26.304363 kernel: alternatives: applying boot alternatives Mar 7 00:54:26.304385 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=9d22c40559a0d209dc0fcc2dfdd5ddf9671e6da0cc59463f610ba522f01325a6 Mar 7 00:54:26.304405 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Mar 7 00:54:26.304424 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 7 00:54:26.304443 kernel: Fallback order for Node 0: 0 Mar 7 00:54:26.304463 kernel: Built 1 zonelists, mobility grouping on. 
Total pages: 991872 Mar 7 00:54:26.304483 kernel: Policy zone: Normal Mar 7 00:54:26.304506 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Mar 7 00:54:26.304526 kernel: software IO TLB: area num 2. Mar 7 00:54:26.304547 kernel: software IO TLB: mapped [mem 0x000000007c000000-0x0000000080000000] (64MB) Mar 7 00:54:26.304580 kernel: Memory: 3820096K/4030464K available (10304K kernel code, 2180K rwdata, 8116K rodata, 39424K init, 897K bss, 210368K reserved, 0K cma-reserved) Mar 7 00:54:26.304602 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Mar 7 00:54:26.304623 kernel: rcu: Preemptible hierarchical RCU implementation. Mar 7 00:54:26.304646 kernel: rcu: RCU event tracing is enabled. Mar 7 00:54:26.304668 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Mar 7 00:54:26.304739 kernel: Trampoline variant of Tasks RCU enabled. Mar 7 00:54:26.304771 kernel: Tracing variant of Tasks RCU enabled. Mar 7 00:54:26.304794 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Mar 7 00:54:26.304815 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Mar 7 00:54:26.304835 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Mar 7 00:54:26.304854 kernel: GICv3: 96 SPIs implemented Mar 7 00:54:26.304887 kernel: GICv3: 0 Extended SPIs implemented Mar 7 00:54:26.304906 kernel: Root IRQ handler: gic_handle_irq Mar 7 00:54:26.304926 kernel: GICv3: GICv3 features: 16 PPIs Mar 7 00:54:26.304945 kernel: GICv3: CPU0: found redistributor 0 region 0:0x0000000010200000 Mar 7 00:54:26.304964 kernel: ITS [mem 0x10080000-0x1009ffff] Mar 7 00:54:26.304982 kernel: ITS@0x0000000010080000: allocated 8192 Devices @4000b0000 (indirect, esz 8, psz 64K, shr 1) Mar 7 00:54:26.305002 kernel: ITS@0x0000000010080000: allocated 8192 Interrupt Collections @4000c0000 (flat, esz 8, psz 64K, shr 1) Mar 7 00:54:26.305020 kernel: GICv3: using LPI property table @0x00000004000d0000 Mar 7 00:54:26.305038 kernel: ITS: Using hypervisor restricted LPI range [128] Mar 7 00:54:26.305057 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000004000e0000 Mar 7 00:54:26.305075 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Mar 7 00:54:26.305093 kernel: arch_timer: cp15 timer(s) running at 83.33MHz (virt). Mar 7 00:54:26.305119 kernel: clocksource: arch_sys_counter: mask: 0x1ffffffffffffff max_cycles: 0x13381ebeec, max_idle_ns: 440795203145 ns Mar 7 00:54:26.305137 kernel: sched_clock: 57 bits at 83MHz, resolution 12ns, wraps every 4398046511100ns Mar 7 00:54:26.305156 kernel: Console: colour dummy device 80x25 Mar 7 00:54:26.305175 kernel: printk: console [tty1] enabled Mar 7 00:54:26.305194 kernel: ACPI: Core revision 20230628 Mar 7 00:54:26.305212 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 
166.66 BogoMIPS (lpj=83333) Mar 7 00:54:26.305231 kernel: pid_max: default: 32768 minimum: 301 Mar 7 00:54:26.305250 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Mar 7 00:54:26.305268 kernel: landlock: Up and running. Mar 7 00:54:26.305292 kernel: SELinux: Initializing. Mar 7 00:54:26.305311 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 7 00:54:26.305358 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 7 00:54:26.305382 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Mar 7 00:54:26.305401 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Mar 7 00:54:26.305419 kernel: rcu: Hierarchical SRCU implementation. Mar 7 00:54:26.305439 kernel: rcu: Max phase no-delay instances is 400. Mar 7 00:54:26.305459 kernel: Platform MSI: ITS@0x10080000 domain created Mar 7 00:54:26.305478 kernel: PCI/MSI: ITS@0x10080000 domain created Mar 7 00:54:26.305509 kernel: Remapping and enabling EFI services. Mar 7 00:54:26.305528 kernel: smp: Bringing up secondary CPUs ... Mar 7 00:54:26.305547 kernel: Detected PIPT I-cache on CPU1 Mar 7 00:54:26.305567 kernel: GICv3: CPU1: found redistributor 1 region 0:0x0000000010220000 Mar 7 00:54:26.305587 kernel: GICv3: CPU1: using allocated LPI pending table @0x00000004000f0000 Mar 7 00:54:26.305607 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd083] Mar 7 00:54:26.305626 kernel: smp: Brought up 1 node, 2 CPUs Mar 7 00:54:26.305645 kernel: SMP: Total of 2 processors activated. 
Mar 7 00:54:26.305663 kernel: CPU features: detected: 32-bit EL0 Support Mar 7 00:54:26.310803 kernel: CPU features: detected: 32-bit EL1 Support Mar 7 00:54:26.310844 kernel: CPU features: detected: CRC32 instructions Mar 7 00:54:26.310864 kernel: CPU: All CPU(s) started at EL1 Mar 7 00:54:26.310907 kernel: alternatives: applying system-wide alternatives Mar 7 00:54:26.310934 kernel: devtmpfs: initialized Mar 7 00:54:26.310954 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Mar 7 00:54:26.310974 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Mar 7 00:54:26.310994 kernel: pinctrl core: initialized pinctrl subsystem Mar 7 00:54:26.311014 kernel: SMBIOS 3.0.0 present. Mar 7 00:54:26.311039 kernel: DMI: Amazon EC2 a1.large/, BIOS 1.0 11/1/2018 Mar 7 00:54:26.311059 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Mar 7 00:54:26.311080 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Mar 7 00:54:26.311100 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Mar 7 00:54:26.311121 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Mar 7 00:54:26.311140 kernel: audit: initializing netlink subsys (disabled) Mar 7 00:54:26.311160 kernel: audit: type=2000 audit(0.295:1): state=initialized audit_enabled=0 res=1 Mar 7 00:54:26.311180 kernel: thermal_sys: Registered thermal governor 'step_wise' Mar 7 00:54:26.311208 kernel: cpuidle: using governor menu Mar 7 00:54:26.311228 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. 
Mar 7 00:54:26.311247 kernel: ASID allocator initialised with 65536 entries Mar 7 00:54:26.311267 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Mar 7 00:54:26.311286 kernel: Serial: AMBA PL011 UART driver Mar 7 00:54:26.311306 kernel: Modules: 17488 pages in range for non-PLT usage Mar 7 00:54:26.311326 kernel: Modules: 509008 pages in range for PLT usage Mar 7 00:54:26.311345 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Mar 7 00:54:26.311365 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Mar 7 00:54:26.311393 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Mar 7 00:54:26.311412 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Mar 7 00:54:26.311432 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Mar 7 00:54:26.311451 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Mar 7 00:54:26.311471 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Mar 7 00:54:26.311492 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Mar 7 00:54:26.311511 kernel: ACPI: Added _OSI(Module Device) Mar 7 00:54:26.311530 kernel: ACPI: Added _OSI(Processor Device) Mar 7 00:54:26.311550 kernel: ACPI: Added _OSI(Processor Aggregator Device) Mar 7 00:54:26.311578 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Mar 7 00:54:26.311598 kernel: ACPI: Interpreter enabled Mar 7 00:54:26.311617 kernel: ACPI: Using GIC for interrupt routing Mar 7 00:54:26.311637 kernel: ACPI: MCFG table detected, 1 entries Mar 7 00:54:26.311656 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00]) Mar 7 00:54:26.312078 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Mar 7 00:54:26.312378 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Mar 7 00:54:26.312673 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Mar 7 00:54:26.314656 kernel: acpi 
PNP0A08:00: ECAM area [mem 0x20000000-0x200fffff] reserved by PNP0C02:00 Mar 7 00:54:26.314985 kernel: acpi PNP0A08:00: ECAM at [mem 0x20000000-0x200fffff] for [bus 00] Mar 7 00:54:26.315018 kernel: ACPI: Remapped I/O 0x000000001fff0000 to [io 0x0000-0xffff window] Mar 7 00:54:26.315038 kernel: acpiphp: Slot [1] registered Mar 7 00:54:26.315058 kernel: acpiphp: Slot [2] registered Mar 7 00:54:26.315077 kernel: acpiphp: Slot [3] registered Mar 7 00:54:26.315096 kernel: acpiphp: Slot [4] registered Mar 7 00:54:26.315115 kernel: acpiphp: Slot [5] registered Mar 7 00:54:26.315145 kernel: acpiphp: Slot [6] registered Mar 7 00:54:26.315166 kernel: acpiphp: Slot [7] registered Mar 7 00:54:26.315185 kernel: acpiphp: Slot [8] registered Mar 7 00:54:26.315204 kernel: acpiphp: Slot [9] registered Mar 7 00:54:26.315223 kernel: acpiphp: Slot [10] registered Mar 7 00:54:26.315243 kernel: acpiphp: Slot [11] registered Mar 7 00:54:26.315262 kernel: acpiphp: Slot [12] registered Mar 7 00:54:26.315281 kernel: acpiphp: Slot [13] registered Mar 7 00:54:26.315301 kernel: acpiphp: Slot [14] registered Mar 7 00:54:26.315320 kernel: acpiphp: Slot [15] registered Mar 7 00:54:26.315347 kernel: acpiphp: Slot [16] registered Mar 7 00:54:26.315366 kernel: acpiphp: Slot [17] registered Mar 7 00:54:26.315386 kernel: acpiphp: Slot [18] registered Mar 7 00:54:26.315405 kernel: acpiphp: Slot [19] registered Mar 7 00:54:26.315424 kernel: acpiphp: Slot [20] registered Mar 7 00:54:26.315443 kernel: acpiphp: Slot [21] registered Mar 7 00:54:26.315462 kernel: acpiphp: Slot [22] registered Mar 7 00:54:26.315481 kernel: acpiphp: Slot [23] registered Mar 7 00:54:26.315501 kernel: acpiphp: Slot [24] registered Mar 7 00:54:26.315526 kernel: acpiphp: Slot [25] registered Mar 7 00:54:26.315546 kernel: acpiphp: Slot [26] registered Mar 7 00:54:26.315567 kernel: acpiphp: Slot [27] registered Mar 7 00:54:26.315586 kernel: acpiphp: Slot [28] registered Mar 7 00:54:26.315607 kernel: acpiphp: Slot [29] registered 
Mar 7 00:54:26.315626 kernel: acpiphp: Slot [30] registered Mar 7 00:54:26.315645 kernel: acpiphp: Slot [31] registered Mar 7 00:54:26.315665 kernel: PCI host bridge to bus 0000:00 Mar 7 00:54:26.318760 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xffffffff window] Mar 7 00:54:26.319052 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Mar 7 00:54:26.319266 kernel: pci_bus 0000:00: root bus resource [mem 0x400000000000-0x407fffffffff window] Mar 7 00:54:26.319482 kernel: pci_bus 0000:00: root bus resource [bus 00] Mar 7 00:54:26.319828 kernel: pci 0000:00:00.0: [1d0f:0200] type 00 class 0x060000 Mar 7 00:54:26.320144 kernel: pci 0000:00:01.0: [1d0f:8250] type 00 class 0x070003 Mar 7 00:54:26.320428 kernel: pci 0000:00:01.0: reg 0x10: [mem 0x80118000-0x80118fff] Mar 7 00:54:26.320798 kernel: pci 0000:00:04.0: [1d0f:8061] type 00 class 0x010802 Mar 7 00:54:26.321064 kernel: pci 0000:00:04.0: reg 0x10: [mem 0x80114000-0x80117fff] Mar 7 00:54:26.321338 kernel: pci 0000:00:04.0: PME# supported from D0 D1 D2 D3hot D3cold Mar 7 00:54:26.321671 kernel: pci 0000:00:05.0: [1d0f:ec20] type 00 class 0x020000 Mar 7 00:54:26.321998 kernel: pci 0000:00:05.0: reg 0x10: [mem 0x80110000-0x80113fff] Mar 7 00:54:26.322242 kernel: pci 0000:00:05.0: reg 0x18: [mem 0x80000000-0x800fffff pref] Mar 7 00:54:26.322486 kernel: pci 0000:00:05.0: reg 0x20: [mem 0x80100000-0x8010ffff] Mar 7 00:54:26.322808 kernel: pci 0000:00:05.0: PME# supported from D0 D1 D2 D3hot D3cold Mar 7 00:54:26.323055 kernel: pci_bus 0000:00: resource 4 [mem 0x80000000-0xffffffff window] Mar 7 00:54:26.323289 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Mar 7 00:54:26.323513 kernel: pci_bus 0000:00: resource 6 [mem 0x400000000000-0x407fffffffff window] Mar 7 00:54:26.323545 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Mar 7 00:54:26.323566 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Mar 7 00:54:26.323586 kernel: ACPI: PCI: Interrupt 
link GSI2 configured for IRQ 37 Mar 7 00:54:26.323606 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Mar 7 00:54:26.323636 kernel: iommu: Default domain type: Translated Mar 7 00:54:26.323657 kernel: iommu: DMA domain TLB invalidation policy: strict mode Mar 7 00:54:26.323703 kernel: efivars: Registered efivars operations Mar 7 00:54:26.323731 kernel: vgaarb: loaded Mar 7 00:54:26.323752 kernel: clocksource: Switched to clocksource arch_sys_counter Mar 7 00:54:26.323771 kernel: VFS: Disk quotas dquot_6.6.0 Mar 7 00:54:26.323791 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 7 00:54:26.323811 kernel: pnp: PnP ACPI init Mar 7 00:54:26.324091 kernel: system 00:00: [mem 0x20000000-0x2fffffff] could not be reserved Mar 7 00:54:26.324135 kernel: pnp: PnP ACPI: found 1 devices Mar 7 00:54:26.324155 kernel: NET: Registered PF_INET protocol family Mar 7 00:54:26.324175 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Mar 7 00:54:26.324194 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Mar 7 00:54:26.324214 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 7 00:54:26.324233 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 7 00:54:26.324252 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Mar 7 00:54:26.324272 kernel: TCP: Hash tables configured (established 32768 bind 32768) Mar 7 00:54:26.324297 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 7 00:54:26.324316 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 7 00:54:26.324336 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 7 00:54:26.324357 kernel: PCI: CLS 0 bytes, default 64 Mar 7 00:54:26.324376 kernel: kvm [1]: HYP mode not available Mar 7 00:54:26.324396 kernel: Initialise system trusted keyrings Mar 7 00:54:26.324416 kernel: workingset: 
timestamp_bits=39 max_order=20 bucket_order=0 Mar 7 00:54:26.324435 kernel: Key type asymmetric registered Mar 7 00:54:26.324455 kernel: Asymmetric key parser 'x509' registered Mar 7 00:54:26.324482 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Mar 7 00:54:26.324503 kernel: io scheduler mq-deadline registered Mar 7 00:54:26.324523 kernel: io scheduler kyber registered Mar 7 00:54:26.324542 kernel: io scheduler bfq registered Mar 7 00:54:26.324972 kernel: pl061_gpio ARMH0061:00: PL061 GPIO chip registered Mar 7 00:54:26.325017 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Mar 7 00:54:26.325037 kernel: ACPI: button: Power Button [PWRB] Mar 7 00:54:26.325058 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input1 Mar 7 00:54:26.325087 kernel: ACPI: button: Sleep Button [SLPB] Mar 7 00:54:26.325108 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 7 00:54:26.325129 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Mar 7 00:54:26.325434 kernel: serial 0000:00:01.0: enabling device (0010 -> 0012) Mar 7 00:54:26.325473 kernel: printk: console [ttyS0] disabled Mar 7 00:54:26.325494 kernel: 0000:00:01.0: ttyS0 at MMIO 0x80118000 (irq = 14, base_baud = 115200) is a 16550A Mar 7 00:54:26.325514 kernel: printk: console [ttyS0] enabled Mar 7 00:54:26.325534 kernel: printk: bootconsole [uart0] disabled Mar 7 00:54:26.325553 kernel: thunder_xcv, ver 1.0 Mar 7 00:54:26.325572 kernel: thunder_bgx, ver 1.0 Mar 7 00:54:26.325600 kernel: nicpf, ver 1.0 Mar 7 00:54:26.325619 kernel: nicvf, ver 1.0 Mar 7 00:54:26.325978 kernel: rtc-efi rtc-efi.0: registered as rtc0 Mar 7 00:54:26.326218 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-03-07T00:54:25 UTC (1772844865) Mar 7 00:54:26.326252 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 7 00:54:26.326274 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 3 counters available Mar 7 
00:54:26.326295 kernel: watchdog: Delayed init of the lockup detector failed: -19 Mar 7 00:54:26.326326 kernel: watchdog: Hard watchdog permanently disabled Mar 7 00:54:26.326346 kernel: NET: Registered PF_INET6 protocol family Mar 7 00:54:26.326366 kernel: Segment Routing with IPv6 Mar 7 00:54:26.326385 kernel: In-situ OAM (IOAM) with IPv6 Mar 7 00:54:26.326404 kernel: NET: Registered PF_PACKET protocol family Mar 7 00:54:26.326424 kernel: Key type dns_resolver registered Mar 7 00:54:26.326443 kernel: registered taskstats version 1 Mar 7 00:54:26.326462 kernel: Loading compiled-in X.509 certificates Mar 7 00:54:26.326481 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.127-flatcar: e62b4e4ebcb406beff1271ecc7444548c4ab67e9' Mar 7 00:54:26.326500 kernel: Key type .fscrypt registered Mar 7 00:54:26.326525 kernel: Key type fscrypt-provisioning registered Mar 7 00:54:26.326544 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 7 00:54:26.326563 kernel: ima: Allocated hash algorithm: sha1 Mar 7 00:54:26.326582 kernel: ima: No architecture policies found Mar 7 00:54:26.326601 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Mar 7 00:54:26.326621 kernel: clk: Disabling unused clocks Mar 7 00:54:26.326640 kernel: Freeing unused kernel memory: 39424K Mar 7 00:54:26.326659 kernel: Run /init as init process Mar 7 00:54:26.326706 kernel: with arguments: Mar 7 00:54:26.326778 kernel: /init Mar 7 00:54:26.326798 kernel: with environment: Mar 7 00:54:26.326817 kernel: HOME=/ Mar 7 00:54:26.326837 kernel: TERM=linux Mar 7 00:54:26.326862 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Mar 7 00:54:26.326886 systemd[1]: Detected virtualization amazon. 
Mar 7 00:54:26.326908 systemd[1]: Detected architecture arm64. Mar 7 00:54:26.326934 systemd[1]: Running in initrd. Mar 7 00:54:26.326955 systemd[1]: No hostname configured, using default hostname. Mar 7 00:54:26.326975 systemd[1]: Hostname set to . Mar 7 00:54:26.326997 systemd[1]: Initializing machine ID from VM UUID. Mar 7 00:54:26.327018 systemd[1]: Queued start job for default target initrd.target. Mar 7 00:54:26.327040 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 7 00:54:26.327061 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 7 00:54:26.327082 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 7 00:54:26.327109 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 7 00:54:26.327131 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 7 00:54:26.327153 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 7 00:54:26.327177 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 7 00:54:26.327199 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 7 00:54:26.327221 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 7 00:54:26.327242 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 7 00:54:26.327267 systemd[1]: Reached target paths.target - Path Units. Mar 7 00:54:26.327289 systemd[1]: Reached target slices.target - Slice Units. Mar 7 00:54:26.327309 systemd[1]: Reached target swap.target - Swaps. Mar 7 00:54:26.327330 systemd[1]: Reached target timers.target - Timer Units. 
Mar 7 00:54:26.327352 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 7 00:54:26.327373 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 7 00:54:26.327395 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 7 00:54:26.327416 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Mar 7 00:54:26.327439 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 7 00:54:26.327469 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 7 00:54:26.327491 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 7 00:54:26.327513 systemd[1]: Reached target sockets.target - Socket Units. Mar 7 00:54:26.327536 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 7 00:54:26.327559 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 7 00:54:26.327581 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 7 00:54:26.327603 systemd[1]: Starting systemd-fsck-usr.service... Mar 7 00:54:26.327625 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 7 00:54:26.327653 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 7 00:54:26.327675 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 7 00:54:26.327759 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 7 00:54:26.327782 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 7 00:54:26.327867 systemd-journald[252]: Collecting audit messages is disabled. Mar 7 00:54:26.327928 systemd[1]: Finished systemd-fsck-usr.service. Mar 7 00:54:26.327953 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... 
Mar 7 00:54:26.327975 systemd-journald[252]: Journal started Mar 7 00:54:26.328021 systemd-journald[252]: Runtime Journal (/run/log/journal/ec2abaa48ad55089138d8d654eced059) is 8.0M, max 75.3M, 67.3M free. Mar 7 00:54:26.334653 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 7 00:54:26.294194 systemd-modules-load[253]: Inserted module 'overlay' Mar 7 00:54:26.346625 systemd[1]: Started systemd-journald.service - Journal Service. Mar 7 00:54:26.346773 kernel: Bridge firewalling registered Mar 7 00:54:26.342824 systemd-modules-load[253]: Inserted module 'br_netfilter' Mar 7 00:54:26.349177 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 7 00:54:26.357853 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 7 00:54:26.365988 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 7 00:54:26.384981 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 7 00:54:26.400954 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 7 00:54:26.409179 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 7 00:54:26.422984 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 7 00:54:26.452798 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 7 00:54:26.470284 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 7 00:54:26.477405 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 7 00:54:26.491133 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 7 00:54:26.501463 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. 
Mar 7 00:54:26.520015 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 7 00:54:26.548106 dracut-cmdline[286]: dracut-dracut-053 Mar 7 00:54:26.560774 dracut-cmdline[286]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=9d22c40559a0d209dc0fcc2dfdd5ddf9671e6da0cc59463f610ba522f01325a6 Mar 7 00:54:26.619081 systemd-resolved[288]: Positive Trust Anchors: Mar 7 00:54:26.619129 systemd-resolved[288]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 7 00:54:26.619195 systemd-resolved[288]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 7 00:54:26.746722 kernel: SCSI subsystem initialized Mar 7 00:54:26.754726 kernel: Loading iSCSI transport class v2.0-870. Mar 7 00:54:26.767720 kernel: iscsi: registered transport (tcp) Mar 7 00:54:26.790846 kernel: iscsi: registered transport (qla4xxx) Mar 7 00:54:26.790922 kernel: QLogic iSCSI HBA Driver Mar 7 00:54:26.867810 kernel: random: crng init done Mar 7 00:54:26.868097 systemd-resolved[288]: Defaulting to hostname 'linux'. 
Mar 7 00:54:26.875453 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 7 00:54:26.880650 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 7 00:54:26.903064 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 7 00:54:26.919005 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 7 00:54:26.948921 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 7 00:54:26.949001 kernel: device-mapper: uevent: version 1.0.3 Mar 7 00:54:26.949029 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 7 00:54:27.017733 kernel: raid6: neonx8 gen() 6710 MB/s Mar 7 00:54:27.034721 kernel: raid6: neonx4 gen() 6543 MB/s Mar 7 00:54:27.052717 kernel: raid6: neonx2 gen() 4330 MB/s Mar 7 00:54:27.070747 kernel: raid6: neonx1 gen() 3956 MB/s Mar 7 00:54:27.087720 kernel: raid6: int64x8 gen() 3635 MB/s Mar 7 00:54:27.104718 kernel: raid6: int64x4 gen() 3703 MB/s Mar 7 00:54:27.122740 kernel: raid6: int64x2 gen() 3593 MB/s Mar 7 00:54:27.140949 kernel: raid6: int64x1 gen() 2724 MB/s Mar 7 00:54:27.141053 kernel: raid6: using algorithm neonx8 gen() 6710 MB/s Mar 7 00:54:27.159775 kernel: raid6: .... xor() 4833 MB/s, rmw enabled Mar 7 00:54:27.159866 kernel: raid6: using neon recovery algorithm Mar 7 00:54:27.169574 kernel: xor: measuring software checksum speed Mar 7 00:54:27.169656 kernel: 8regs : 11023 MB/sec Mar 7 00:54:27.172221 kernel: 32regs : 11217 MB/sec Mar 7 00:54:27.172277 kernel: arm64_neon : 9526 MB/sec Mar 7 00:54:27.172317 kernel: xor: using function: 32regs (11217 MB/sec) Mar 7 00:54:27.264758 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 7 00:54:27.289022 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 7 00:54:27.306220 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Mar 7 00:54:27.347850 systemd-udevd[470]: Using default interface naming scheme 'v255'.
Mar 7 00:54:27.358057 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 7 00:54:27.370100 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 7 00:54:27.412318 dracut-pre-trigger[471]: rd.md=0: removing MD RAID activation
Mar 7 00:54:27.483736 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 7 00:54:27.497949 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 7 00:54:27.617178 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 7 00:54:27.636048 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 7 00:54:27.684859 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 7 00:54:27.690745 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 7 00:54:27.696748 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 7 00:54:27.702378 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 7 00:54:27.719980 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 7 00:54:27.759375 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 7 00:54:27.845800 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
Mar 7 00:54:27.845883 kernel: ena 0000:00:05.0: enabling device (0010 -> 0012)
Mar 7 00:54:27.855198 kernel: ena 0000:00:05.0: ENA device version: 0.10
Mar 7 00:54:27.855617 kernel: ena 0000:00:05.0: ENA controller version: 0.0.1 implementation version 1
Mar 7 00:54:27.858719 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 7 00:54:27.861386 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 00:54:27.872873 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 7 00:54:27.881468 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 7 00:54:27.884173 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 00:54:27.890551 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 00:54:27.900926 kernel: ena 0000:00:05.0: Elastic Network Adapter (ENA) found at mem 80110000, mac addr 06:fe:fd:3f:f9:05
Mar 7 00:54:27.903297 (udev-worker)[540]: Network interface NamePolicy= disabled on kernel command line.
Mar 7 00:54:27.904200 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 00:54:27.939751 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35
Mar 7 00:54:27.939841 kernel: nvme nvme0: pci function 0000:00:04.0
Mar 7 00:54:27.952001 kernel: nvme nvme0: 2/0/0 default/read/poll queues
Mar 7 00:54:27.953770 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 00:54:27.965149 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 7 00:54:27.974453 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Mar 7 00:54:27.974524 kernel: GPT:9289727 != 33554431
Mar 7 00:54:27.975979 kernel: GPT:Alternate GPT header not at the end of the disk.
Mar 7 00:54:27.977038 kernel: GPT:9289727 != 33554431
Mar 7 00:54:27.978495 kernel: GPT: Use GNU Parted to correct GPT errors.
Mar 7 00:54:27.979709 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Mar 7 00:54:28.013109 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 00:54:28.063755 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/nvme0n1p6 scanned by (udev-worker) (534)
Mar 7 00:54:28.088798 kernel: BTRFS: device fsid 237c8587-8110-47ef-99f9-37e4ed4d3b31 devid 1 transid 36 /dev/nvme0n1p3 scanned by (udev-worker) (516)
Mar 7 00:54:28.170379 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Amazon Elastic Block Store EFI-SYSTEM.
Mar 7 00:54:28.191082 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Amazon Elastic Block Store ROOT.
Mar 7 00:54:28.234222 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Amazon Elastic Block Store USR-A.
Mar 7 00:54:28.238272 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Amazon Elastic Block Store USR-A.
Mar 7 00:54:28.261367 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM.
Mar 7 00:54:28.280064 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 7 00:54:28.300716 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Mar 7 00:54:28.302440 disk-uuid[660]: Primary Header is updated.
Mar 7 00:54:28.302440 disk-uuid[660]: Secondary Entries is updated.
Mar 7 00:54:28.302440 disk-uuid[660]: Secondary Header is updated.
Mar 7 00:54:28.350721 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Mar 7 00:54:29.341726 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Mar 7 00:54:29.343251 disk-uuid[661]: The operation has completed successfully.
Mar 7 00:54:29.540724 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 7 00:54:29.540947 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 7 00:54:29.606990 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 7 00:54:29.620665 sh[1002]: Success
Mar 7 00:54:29.649945 kernel: device-mapper: verity: sha256 using implementation "sha256-ce"
Mar 7 00:54:29.772636 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 7 00:54:29.788048 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 7 00:54:29.802776 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 7 00:54:29.829353 kernel: BTRFS info (device dm-0): first mount of filesystem 237c8587-8110-47ef-99f9-37e4ed4d3b31
Mar 7 00:54:29.829442 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Mar 7 00:54:29.831731 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 7 00:54:29.831802 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 7 00:54:29.833231 kernel: BTRFS info (device dm-0): using free space tree
Mar 7 00:54:29.923733 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Mar 7 00:54:29.954509 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 7 00:54:29.955718 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 7 00:54:29.973097 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 7 00:54:29.979254 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 7 00:54:30.004531 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 00:54:30.004598 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm
Mar 7 00:54:30.007725 kernel: BTRFS info (device nvme0n1p6): using free space tree
Mar 7 00:54:30.016765 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations
Mar 7 00:54:30.039789 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 00:54:30.038027 systemd[1]: mnt-oem.mount: Deactivated successfully.
Mar 7 00:54:30.055985 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 7 00:54:30.071064 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 7 00:54:30.207253 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 7 00:54:30.222991 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 7 00:54:30.289949 systemd-networkd[1207]: lo: Link UP
Mar 7 00:54:30.289970 systemd-networkd[1207]: lo: Gained carrier
Mar 7 00:54:30.295501 systemd-networkd[1207]: Enumeration completed
Mar 7 00:54:30.297111 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 7 00:54:30.297835 systemd-networkd[1207]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 00:54:30.297843 systemd-networkd[1207]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 7 00:54:30.303224 systemd-networkd[1207]: eth0: Link UP
Mar 7 00:54:30.303232 systemd-networkd[1207]: eth0: Gained carrier
Mar 7 00:54:30.303251 systemd-networkd[1207]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 00:54:30.308828 systemd[1]: Reached target network.target - Network.
Mar 7 00:54:30.337817 systemd-networkd[1207]: eth0: DHCPv4 address 172.31.23.166/20, gateway 172.31.16.1 acquired from 172.31.16.1
Mar 7 00:54:30.509134 ignition[1114]: Ignition 2.19.0
Mar 7 00:54:30.509162 ignition[1114]: Stage: fetch-offline
Mar 7 00:54:30.516480 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 7 00:54:30.510854 ignition[1114]: no configs at "/usr/lib/ignition/base.d"
Mar 7 00:54:30.510880 ignition[1114]: no config dir at "/usr/lib/ignition/base.platform.d/aws"
Mar 7 00:54:30.511480 ignition[1114]: Ignition finished successfully
Mar 7 00:54:30.531377 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 7 00:54:30.570188 ignition[1216]: Ignition 2.19.0
Mar 7 00:54:30.570217 ignition[1216]: Stage: fetch
Mar 7 00:54:30.572937 ignition[1216]: no configs at "/usr/lib/ignition/base.d"
Mar 7 00:54:30.573677 ignition[1216]: no config dir at "/usr/lib/ignition/base.platform.d/aws"
Mar 7 00:54:30.574014 ignition[1216]: PUT http://169.254.169.254/latest/api/token: attempt #1
Mar 7 00:54:30.604183 ignition[1216]: PUT result: OK
Mar 7 00:54:30.608150 ignition[1216]: parsed url from cmdline: ""
Mar 7 00:54:30.608312 ignition[1216]: no config URL provided
Mar 7 00:54:30.610121 ignition[1216]: reading system config file "/usr/lib/ignition/user.ign"
Mar 7 00:54:30.610158 ignition[1216]: no config at "/usr/lib/ignition/user.ign"
Mar 7 00:54:30.610198 ignition[1216]: PUT http://169.254.169.254/latest/api/token: attempt #1
Mar 7 00:54:30.613538 ignition[1216]: PUT result: OK
Mar 7 00:54:30.613635 ignition[1216]: GET http://169.254.169.254/2019-10-01/user-data: attempt #1
Mar 7 00:54:30.616075 ignition[1216]: GET result: OK
Mar 7 00:54:30.616269 ignition[1216]: parsing config with SHA512: 019fd0fe285c0b6f3ad2d1c3a8448eac251dc327f9c9d7ac31b2c0fa0c0cca13d31212faac3745028782cb21953dcf2a42a70a4dc2dbcd87fae2c8428c0bff52
Mar 7 00:54:30.629889 unknown[1216]: fetched base config from "system"
Mar 7 00:54:30.629922 unknown[1216]: fetched base config from "system"
Mar 7 00:54:30.629937 unknown[1216]: fetched user config from "aws"
Mar 7 00:54:30.639607 ignition[1216]: fetch: fetch complete
Mar 7 00:54:30.639621 ignition[1216]: fetch: fetch passed
Mar 7 00:54:30.641742 ignition[1216]: Ignition finished successfully
Mar 7 00:54:30.645188 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 7 00:54:30.657040 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 7 00:54:30.687675 ignition[1222]: Ignition 2.19.0
Mar 7 00:54:30.687735 ignition[1222]: Stage: kargs
Mar 7 00:54:30.689653 ignition[1222]: no configs at "/usr/lib/ignition/base.d"
Mar 7 00:54:30.689720 ignition[1222]: no config dir at "/usr/lib/ignition/base.platform.d/aws"
Mar 7 00:54:30.690421 ignition[1222]: PUT http://169.254.169.254/latest/api/token: attempt #1
Mar 7 00:54:30.693806 ignition[1222]: PUT result: OK
Mar 7 00:54:30.703429 ignition[1222]: kargs: kargs passed
Mar 7 00:54:30.703579 ignition[1222]: Ignition finished successfully
Mar 7 00:54:30.708756 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 7 00:54:30.719984 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 7 00:54:30.749130 ignition[1228]: Ignition 2.19.0
Mar 7 00:54:30.750543 ignition[1228]: Stage: disks
Mar 7 00:54:30.752055 ignition[1228]: no configs at "/usr/lib/ignition/base.d"
Mar 7 00:54:30.752089 ignition[1228]: no config dir at "/usr/lib/ignition/base.platform.d/aws"
Mar 7 00:54:30.752253 ignition[1228]: PUT http://169.254.169.254/latest/api/token: attempt #1
Mar 7 00:54:30.758563 ignition[1228]: PUT result: OK
Mar 7 00:54:30.765289 ignition[1228]: disks: disks passed
Mar 7 00:54:30.765497 ignition[1228]: Ignition finished successfully
Mar 7 00:54:30.770011 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 7 00:54:30.776269 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 7 00:54:30.780885 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 7 00:54:30.792240 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 7 00:54:30.796033 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 7 00:54:30.800866 systemd[1]: Reached target basic.target - Basic System.
Mar 7 00:54:30.817614 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 7 00:54:30.862507 systemd-fsck[1236]: ROOT: clean, 14/553520 files, 52654/553472 blocks
Mar 7 00:54:30.869124 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 7 00:54:30.884129 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 7 00:54:30.977768 kernel: EXT4-fs (nvme0n1p9): mounted filesystem 596a8ea8-9d3d-4d06-a56e-9d3ebd3cb76d r/w with ordered data mode. Quota mode: none.
Mar 7 00:54:30.979368 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 7 00:54:30.984647 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 7 00:54:30.999937 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 7 00:54:31.009761 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 7 00:54:31.014854 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Mar 7 00:54:31.014953 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 7 00:54:31.015002 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 7 00:54:31.036733 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/nvme0n1p6 scanned by mount (1256)
Mar 7 00:54:31.042452 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 00:54:31.042527 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm
Mar 7 00:54:31.043877 kernel: BTRFS info (device nvme0n1p6): using free space tree
Mar 7 00:54:31.050858 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 7 00:54:31.059324 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations
Mar 7 00:54:31.063004 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 7 00:54:31.071783 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 7 00:54:31.463345 initrd-setup-root[1280]: cut: /sysroot/etc/passwd: No such file or directory
Mar 7 00:54:31.495538 initrd-setup-root[1287]: cut: /sysroot/etc/group: No such file or directory
Mar 7 00:54:31.505409 initrd-setup-root[1294]: cut: /sysroot/etc/shadow: No such file or directory
Mar 7 00:54:31.513870 initrd-setup-root[1301]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 7 00:54:31.908523 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 7 00:54:31.924157 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 7 00:54:31.929838 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 7 00:54:31.951932 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 7 00:54:31.954425 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 00:54:32.004155 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 7 00:54:32.015110 ignition[1369]: INFO : Ignition 2.19.0
Mar 7 00:54:32.015110 ignition[1369]: INFO : Stage: mount
Mar 7 00:54:32.019106 ignition[1369]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 7 00:54:32.019106 ignition[1369]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws"
Mar 7 00:54:32.019106 ignition[1369]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1
Mar 7 00:54:32.027189 ignition[1369]: INFO : PUT result: OK
Mar 7 00:54:32.031604 ignition[1369]: INFO : mount: mount passed
Mar 7 00:54:32.033859 ignition[1369]: INFO : Ignition finished successfully
Mar 7 00:54:32.034884 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 7 00:54:32.051631 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 7 00:54:32.075950 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 7 00:54:32.099723 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/nvme0n1p6 scanned by mount (1380)
Mar 7 00:54:32.103732 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 00:54:32.103799 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm
Mar 7 00:54:32.103826 kernel: BTRFS info (device nvme0n1p6): using free space tree
Mar 7 00:54:32.111749 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations
Mar 7 00:54:32.113799 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 7 00:54:32.153460 ignition[1397]: INFO : Ignition 2.19.0
Mar 7 00:54:32.155707 ignition[1397]: INFO : Stage: files
Mar 7 00:54:32.155707 ignition[1397]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 7 00:54:32.155707 ignition[1397]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws"
Mar 7 00:54:32.155707 ignition[1397]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1
Mar 7 00:54:32.166281 ignition[1397]: INFO : PUT result: OK
Mar 7 00:54:32.170862 ignition[1397]: DEBUG : files: compiled without relabeling support, skipping
Mar 7 00:54:32.182736 ignition[1397]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 7 00:54:32.182736 ignition[1397]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 7 00:54:32.249785 ignition[1397]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 7 00:54:32.253185 ignition[1397]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 7 00:54:32.256431 ignition[1397]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 7 00:54:32.254571 unknown[1397]: wrote ssh authorized keys file for user: core
Mar 7 00:54:32.261977 ignition[1397]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Mar 7 00:54:32.261977 ignition[1397]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Mar 7 00:54:32.351149 ignition[1397]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 7 00:54:32.366942 systemd-networkd[1207]: eth0: Gained IPv6LL
Mar 7 00:54:32.496242 ignition[1397]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Mar 7 00:54:32.500870 ignition[1397]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 7 00:54:32.505168 ignition[1397]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 7 00:54:32.509482 ignition[1397]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 7 00:54:32.513558 ignition[1397]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 7 00:54:32.517616 ignition[1397]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 7 00:54:32.523362 ignition[1397]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 7 00:54:32.523362 ignition[1397]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 7 00:54:32.523362 ignition[1397]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 7 00:54:32.523362 ignition[1397]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 7 00:54:32.523362 ignition[1397]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 7 00:54:32.523362 ignition[1397]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Mar 7 00:54:32.523362 ignition[1397]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Mar 7 00:54:32.523362 ignition[1397]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Mar 7 00:54:32.523362 ignition[1397]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.35.1-arm64.raw: attempt #1
Mar 7 00:54:33.027624 ignition[1397]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 7 00:54:33.498470 ignition[1397]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Mar 7 00:54:33.498470 ignition[1397]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 7 00:54:33.511111 ignition[1397]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 7 00:54:33.515703 ignition[1397]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 7 00:54:33.515703 ignition[1397]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 7 00:54:33.515703 ignition[1397]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Mar 7 00:54:33.515703 ignition[1397]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Mar 7 00:54:33.515703 ignition[1397]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 7 00:54:33.515703 ignition[1397]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 7 00:54:33.515703 ignition[1397]: INFO : files: files passed
Mar 7 00:54:33.515703 ignition[1397]: INFO : Ignition finished successfully
Mar 7 00:54:33.541380 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 7 00:54:33.558145 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 7 00:54:33.575121 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 7 00:54:33.587040 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 7 00:54:33.589855 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 7 00:54:33.609260 initrd-setup-root-after-ignition[1425]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 7 00:54:33.609260 initrd-setup-root-after-ignition[1425]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 7 00:54:33.620073 initrd-setup-root-after-ignition[1429]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 7 00:54:33.625765 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 7 00:54:33.636943 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 7 00:54:33.651993 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 7 00:54:33.724647 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 7 00:54:33.725180 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 7 00:54:33.734374 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 7 00:54:33.742368 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 7 00:54:33.747655 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 7 00:54:33.759169 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 7 00:54:33.792912 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 7 00:54:33.810281 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 7 00:54:33.849059 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 7 00:54:33.856987 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 7 00:54:33.857462 systemd[1]: Stopped target timers.target - Timer Units.
Mar 7 00:54:33.859290 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 7 00:54:33.859613 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 7 00:54:33.861138 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 7 00:54:33.862994 systemd[1]: Stopped target basic.target - Basic System.
Mar 7 00:54:33.864553 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 7 00:54:33.865378 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 7 00:54:33.866643 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 7 00:54:33.867480 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 7 00:54:33.867931 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 7 00:54:33.868360 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 7 00:54:33.868800 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 7 00:54:33.869186 systemd[1]: Stopped target swap.target - Swaps.
Mar 7 00:54:33.869992 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 7 00:54:33.870286 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 7 00:54:33.871885 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 7 00:54:33.872812 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 7 00:54:33.873590 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 7 00:54:33.922837 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 7 00:54:33.925953 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 7 00:54:33.926224 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 7 00:54:33.936094 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 7 00:54:33.936443 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 7 00:54:33.940565 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 7 00:54:33.940896 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 7 00:54:33.957114 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 7 00:54:33.968614 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 7 00:54:34.058113 ignition[1449]: INFO : Ignition 2.19.0
Mar 7 00:54:34.058113 ignition[1449]: INFO : Stage: umount
Mar 7 00:54:34.058113 ignition[1449]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 7 00:54:34.058113 ignition[1449]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws"
Mar 7 00:54:34.058113 ignition[1449]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1
Mar 7 00:54:34.007864 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 7 00:54:34.085086 ignition[1449]: INFO : PUT result: OK
Mar 7 00:54:34.008184 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 7 00:54:34.090534 ignition[1449]: INFO : umount: umount passed
Mar 7 00:54:34.090534 ignition[1449]: INFO : Ignition finished successfully
Mar 7 00:54:34.013228 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 7 00:54:34.013541 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 7 00:54:34.065724 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 7 00:54:34.067815 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 7 00:54:34.100733 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 7 00:54:34.101024 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 7 00:54:34.108736 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 7 00:54:34.108946 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 7 00:54:34.119142 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 7 00:54:34.119261 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 7 00:54:34.122805 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 7 00:54:34.122914 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 7 00:54:34.137624 systemd[1]: Stopped target network.target - Network.
Mar 7 00:54:34.147203 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 7 00:54:34.147977 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 7 00:54:34.159840 systemd[1]: Stopped target paths.target - Path Units.
Mar 7 00:54:34.167884 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 7 00:54:34.173377 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 7 00:54:34.180603 systemd[1]: Stopped target slices.target - Slice Units.
Mar 7 00:54:34.184549 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 7 00:54:34.189536 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 7 00:54:34.189637 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 7 00:54:34.195167 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 7 00:54:34.195255 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 7 00:54:34.195433 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 7 00:54:34.195536 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 7 00:54:34.196175 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 7 00:54:34.196273 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 7 00:54:34.197379 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 7 00:54:34.211519 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 7 00:54:34.225548 systemd-networkd[1207]: eth0: DHCPv6 lease lost
Mar 7 00:54:34.229120 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 7 00:54:34.241244 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 7 00:54:34.244851 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 7 00:54:34.253515 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 7 00:54:34.253817 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 7 00:54:34.265498 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 7 00:54:34.265623 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 7 00:54:34.295905 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 7 00:54:34.301425 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 7 00:54:34.301566 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 7 00:54:34.318881 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 7 00:54:34.319005 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 7 00:54:34.324528 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 7 00:54:34.324650 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 7 00:54:34.330079 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 7 00:54:34.330192 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 7 00:54:34.333432 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 7 00:54:34.381285 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 7 00:54:34.381562 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 7 00:54:34.390586 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 7 00:54:34.392193 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 7 00:54:34.399290 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 7 00:54:34.400923 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 7 00:54:34.409528 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 7 00:54:34.413854 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 7 00:54:34.421473 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 7 00:54:34.422329 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 7 00:54:34.429360 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 7 00:54:34.429465 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 7 00:54:34.432180 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 7 00:54:34.432305 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 7 00:54:34.435929 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 7 00:54:34.436071 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 7 00:54:34.452627 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 7 00:54:34.452802 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 00:54:34.478009 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 7 00:54:34.488862 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 7 00:54:34.489866 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 7 00:54:34.508154 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Mar 7 00:54:34.508292 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 7 00:54:34.511403 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 7 00:54:34.511529 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 7 00:54:34.514656 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 7 00:54:34.514823 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 00:54:34.521199 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 7 00:54:34.521515 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 7 00:54:34.526653 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 7 00:54:34.552605 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 7 00:54:34.579066 systemd[1]: Switching root.
Mar 7 00:54:34.637859 systemd-journald[252]: Journal stopped
Mar 7 00:54:37.207085 systemd-journald[252]: Received SIGTERM from PID 1 (systemd).
Mar 7 00:54:37.207245 kernel: SELinux: policy capability network_peer_controls=1
Mar 7 00:54:37.207292 kernel: SELinux: policy capability open_perms=1
Mar 7 00:54:37.207326 kernel: SELinux: policy capability extended_socket_class=1
Mar 7 00:54:37.207360 kernel: SELinux: policy capability always_check_network=0
Mar 7 00:54:37.207393 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 7 00:54:37.207425 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 7 00:54:37.207466 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 7 00:54:37.207499 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 7 00:54:37.207533 kernel: audit: type=1403 audit(1772844875.111:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 7 00:54:37.207570 systemd[1]: Successfully loaded SELinux policy in 84.623ms.
Mar 7 00:54:37.207619 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 28.335ms.
Mar 7 00:54:37.207656 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Mar 7 00:54:37.207843 systemd[1]: Detected virtualization amazon.
Mar 7 00:54:37.207888 systemd[1]: Detected architecture arm64.
Mar 7 00:54:37.207925 systemd[1]: Detected first boot.
Mar 7 00:54:37.207968 systemd[1]: Initializing machine ID from VM UUID.
Mar 7 00:54:37.208005 zram_generator::config[1491]: No configuration found.
Mar 7 00:54:37.208046 systemd[1]: Populated /etc with preset unit settings.
Mar 7 00:54:37.208081 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 7 00:54:37.208115 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 7 00:54:37.208150 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 7 00:54:37.208186 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 7 00:54:37.208220 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 7 00:54:37.208260 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 7 00:54:37.208291 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 7 00:54:37.208324 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 7 00:54:37.208371 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 7 00:54:37.208407 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 7 00:54:37.208438 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 7 00:54:37.208468 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 7 00:54:37.208501 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 7 00:54:37.208533 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 7 00:54:37.208573 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 7 00:54:37.208606 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 7 00:54:37.208637 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 7 00:54:37.208669 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 7 00:54:37.208742 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 7 00:54:37.208779 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 7 00:54:37.208812 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 7 00:54:37.208843 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 7 00:54:37.208879 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 7 00:54:37.208913 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 7 00:54:37.208947 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 7 00:54:37.208979 systemd[1]: Reached target slices.target - Slice Units.
Mar 7 00:54:37.209018 systemd[1]: Reached target swap.target - Swaps.
Mar 7 00:54:37.209053 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 7 00:54:37.209088 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 7 00:54:37.209119 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 7 00:54:37.209150 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 7 00:54:37.209192 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 7 00:54:37.209225 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 7 00:54:37.209283 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 7 00:54:37.209328 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 7 00:54:37.209363 systemd[1]: Mounting media.mount - External Media Directory...
Mar 7 00:54:37.209395 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 7 00:54:37.209427 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 7 00:54:37.209459 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 7 00:54:37.209492 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 7 00:54:37.209532 systemd[1]: Reached target machines.target - Containers.
Mar 7 00:54:37.209569 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 7 00:54:37.209603 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 7 00:54:37.209635 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 7 00:54:37.209666 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 7 00:54:37.209735 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 7 00:54:37.209777 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 7 00:54:37.209812 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 7 00:54:37.209855 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 7 00:54:37.209888 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 7 00:54:37.209923 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 7 00:54:37.209958 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 7 00:54:37.210036 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 7 00:54:37.210080 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 7 00:54:37.210113 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 7 00:54:37.210145 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 7 00:54:37.217412 kernel: fuse: init (API version 7.39)
Mar 7 00:54:37.217480 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 7 00:54:37.217513 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 7 00:54:37.217543 kernel: loop: module loaded
Mar 7 00:54:37.217572 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 7 00:54:37.217603 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 7 00:54:37.217638 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 7 00:54:37.217670 systemd[1]: Stopped verity-setup.service.
Mar 7 00:54:37.217734 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 7 00:54:37.217792 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 7 00:54:37.217835 systemd[1]: Mounted media.mount - External Media Directory.
Mar 7 00:54:37.217866 kernel: ACPI: bus type drm_connector registered
Mar 7 00:54:37.217896 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 7 00:54:37.217926 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 7 00:54:37.217957 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 7 00:54:37.217991 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 7 00:54:37.218027 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 7 00:54:37.218058 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 7 00:54:37.218091 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 7 00:54:37.218171 systemd-journald[1587]: Collecting audit messages is disabled.
Mar 7 00:54:37.218241 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 7 00:54:37.218274 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 7 00:54:37.218309 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 7 00:54:37.218339 systemd-journald[1587]: Journal started
Mar 7 00:54:37.218387 systemd-journald[1587]: Runtime Journal (/run/log/journal/ec2abaa48ad55089138d8d654eced059) is 8.0M, max 75.3M, 67.3M free.
Mar 7 00:54:36.473155 systemd[1]: Queued start job for default target multi-user.target.
Mar 7 00:54:36.534604 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6.
Mar 7 00:54:36.535520 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 7 00:54:37.223105 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 7 00:54:37.235192 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 7 00:54:37.237979 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 7 00:54:37.239964 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 7 00:54:37.246863 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 7 00:54:37.247192 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 7 00:54:37.255009 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 7 00:54:37.255538 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 7 00:54:37.262911 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 7 00:54:37.271208 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 7 00:54:37.278441 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 7 00:54:37.311594 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 7 00:54:37.327072 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 7 00:54:37.343034 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 7 00:54:37.350507 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 7 00:54:37.350575 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 7 00:54:37.360334 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Mar 7 00:54:37.376223 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 7 00:54:37.384986 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 7 00:54:37.390075 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 7 00:54:37.398141 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 7 00:54:37.418181 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 7 00:54:37.421065 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 7 00:54:37.424585 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 7 00:54:37.428040 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 7 00:54:37.433019 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 7 00:54:37.454310 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 7 00:54:37.484115 systemd-journald[1587]: Time spent on flushing to /var/log/journal/ec2abaa48ad55089138d8d654eced059 is 169.578ms for 896 entries.
Mar 7 00:54:37.484115 systemd-journald[1587]: System Journal (/var/log/journal/ec2abaa48ad55089138d8d654eced059) is 8.0M, max 195.6M, 187.6M free.
Mar 7 00:54:37.691477 systemd-journald[1587]: Received client request to flush runtime journal.
Mar 7 00:54:37.691655 kernel: loop0: detected capacity change from 0 to 52536
Mar 7 00:54:37.696236 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 7 00:54:37.469997 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 7 00:54:37.479829 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 7 00:54:37.482644 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 7 00:54:37.483462 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 7 00:54:37.490626 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 7 00:54:37.514148 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Mar 7 00:54:37.558444 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 7 00:54:37.562288 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 7 00:54:37.586001 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Mar 7 00:54:37.630929 systemd-tmpfiles[1622]: ACLs are not supported, ignoring.
Mar 7 00:54:37.630954 systemd-tmpfiles[1622]: ACLs are not supported, ignoring.
Mar 7 00:54:37.659160 udevadm[1627]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Mar 7 00:54:37.661804 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 7 00:54:37.685012 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 7 00:54:37.689880 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 7 00:54:37.702825 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 7 00:54:37.731672 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 7 00:54:37.736855 kernel: loop1: detected capacity change from 0 to 197488
Mar 7 00:54:37.742874 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Mar 7 00:54:37.818960 kernel: loop2: detected capacity change from 0 to 114432
Mar 7 00:54:37.821820 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 7 00:54:37.835157 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 7 00:54:37.911383 systemd-tmpfiles[1644]: ACLs are not supported, ignoring.
Mar 7 00:54:37.912051 systemd-tmpfiles[1644]: ACLs are not supported, ignoring.
Mar 7 00:54:37.927421 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 7 00:54:37.962434 kernel: loop3: detected capacity change from 0 to 114328
Mar 7 00:54:38.067847 kernel: loop4: detected capacity change from 0 to 52536
Mar 7 00:54:38.089773 kernel: loop5: detected capacity change from 0 to 197488
Mar 7 00:54:38.127122 kernel: loop6: detected capacity change from 0 to 114432
Mar 7 00:54:38.142789 kernel: loop7: detected capacity change from 0 to 114328
Mar 7 00:54:38.150832 (sd-merge)[1649]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-ami'.
Mar 7 00:54:38.153056 (sd-merge)[1649]: Merged extensions into '/usr'.
Mar 7 00:54:38.164978 systemd[1]: Reloading requested from client PID 1621 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 7 00:54:38.165015 systemd[1]: Reloading...
Mar 7 00:54:38.389725 zram_generator::config[1675]: No configuration found.
Mar 7 00:54:38.790484 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 7 00:54:38.919658 systemd[1]: Reloading finished in 753 ms.
Mar 7 00:54:38.961800 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 7 00:54:38.965628 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 7 00:54:38.981061 systemd[1]: Starting ensure-sysext.service...
Mar 7 00:54:38.997052 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 7 00:54:39.013121 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 7 00:54:39.032808 systemd[1]: Reloading requested from client PID 1727 ('systemctl') (unit ensure-sysext.service)...
Mar 7 00:54:39.032842 systemd[1]: Reloading...
Mar 7 00:54:39.056866 systemd-tmpfiles[1728]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 7 00:54:39.058512 systemd-tmpfiles[1728]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 7 00:54:39.066093 systemd-tmpfiles[1728]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 7 00:54:39.070032 systemd-tmpfiles[1728]: ACLs are not supported, ignoring.
Mar 7 00:54:39.070207 systemd-tmpfiles[1728]: ACLs are not supported, ignoring.
Mar 7 00:54:39.084923 systemd-tmpfiles[1728]: Detected autofs mount point /boot during canonicalization of boot.
Mar 7 00:54:39.085774 systemd-tmpfiles[1728]: Skipping /boot
Mar 7 00:54:39.141422 systemd-tmpfiles[1728]: Detected autofs mount point /boot during canonicalization of boot.
Mar 7 00:54:39.145751 systemd-tmpfiles[1728]: Skipping /boot
Mar 7 00:54:39.150659 systemd-udevd[1729]: Using default interface naming scheme 'v255'.
Mar 7 00:54:39.211719 ldconfig[1616]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 7 00:54:39.315139 zram_generator::config[1765]: No configuration found.
Mar 7 00:54:39.494063 (udev-worker)[1773]: Network interface NamePolicy= disabled on kernel command line.
Mar 7 00:54:39.744345 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 7 00:54:39.779726 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 36 scanned by (udev-worker) (1761)
Mar 7 00:54:39.929310 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 7 00:54:39.930737 systemd[1]: Reloading finished in 897 ms.
Mar 7 00:54:39.960051 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 7 00:54:39.964136 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 7 00:54:39.967424 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 7 00:54:40.060259 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Mar 7 00:54:40.102586 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM.
Mar 7 00:54:40.122492 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Mar 7 00:54:40.134314 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 7 00:54:40.141016 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 7 00:54:40.150253 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Mar 7 00:54:40.160919 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 7 00:54:40.180578 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 7 00:54:40.190287 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 7 00:54:40.195418 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 7 00:54:40.201419 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 7 00:54:40.215292 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 7 00:54:40.228613 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 7 00:54:40.240573 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 7 00:54:40.251827 lvm[1928]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 7 00:54:40.256231 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 7 00:54:40.267325 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 00:54:40.280975 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 7 00:54:40.284162 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 7 00:54:40.291789 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 7 00:54:40.294817 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 7 00:54:40.313199 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 7 00:54:40.326475 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 7 00:54:40.338347 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 7 00:54:40.347345 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 7 00:54:40.358104 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 7 00:54:40.364147 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 7 00:54:40.364600 systemd[1]: Reached target time-set.target - System Time Set.
Mar 7 00:54:40.383247 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 7 00:54:40.391802 systemd[1]: Finished ensure-sysext.service.
Mar 7 00:54:40.396979 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 7 00:54:40.397468 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 7 00:54:40.424860 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Mar 7 00:54:40.425357 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 7 00:54:40.439174 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Mar 7 00:54:40.465530 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 7 00:54:40.483787 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 7 00:54:40.507008 augenrules[1966]: No rules
Mar 7 00:54:40.509008 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 7 00:54:40.510879 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 7 00:54:40.511334 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 7 00:54:40.514813 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Mar 7 00:54:40.536040 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 7 00:54:40.536826 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 7 00:54:40.537438 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 7 00:54:40.542896 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 7 00:54:40.543262 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 7 00:54:40.555182 lvm[1961]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 7 00:54:40.569839 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 7 00:54:40.583434 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 7 00:54:40.634911 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 7 00:54:40.642808 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Mar 7 00:54:40.658181 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 7 00:54:40.661638 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 7 00:54:40.666811 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 7 00:54:40.701358 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 00:54:40.785430 systemd-networkd[1942]: lo: Link UP
Mar 7 00:54:40.785454 systemd-networkd[1942]: lo: Gained carrier
Mar 7 00:54:40.788311 systemd-networkd[1942]: Enumeration completed
Mar 7 00:54:40.788521 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 7 00:54:40.794214 systemd-networkd[1942]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 00:54:40.794240 systemd-networkd[1942]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 7 00:54:40.796950 systemd-networkd[1942]: eth0: Link UP
Mar 7 00:54:40.797350 systemd-networkd[1942]: eth0: Gained carrier
Mar 7 00:54:40.797385 systemd-networkd[1942]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 00:54:40.800075 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 7 00:54:40.808817 systemd-resolved[1943]: Positive Trust Anchors:
Mar 7 00:54:40.808862 systemd-resolved[1943]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 7 00:54:40.808928 systemd-resolved[1943]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 7 00:54:40.811935 systemd-networkd[1942]: eth0: DHCPv4 address 172.31.23.166/20, gateway 172.31.16.1 acquired from 172.31.16.1
Mar 7 00:54:40.831640 systemd-resolved[1943]: Defaulting to hostname 'linux'.
Mar 7 00:54:40.835365 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 7 00:54:40.838081 systemd[1]: Reached target network.target - Network.
Mar 7 00:54:40.840137 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 7 00:54:40.842845 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 7 00:54:40.845438 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 7 00:54:40.848313 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 7 00:54:40.851545 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 7 00:54:40.854190 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 7 00:54:40.857061 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 7 00:54:40.859924 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 7 00:54:40.859993 systemd[1]: Reached target paths.target - Path Units. Mar 7 00:54:40.862102 systemd[1]: Reached target timers.target - Timer Units. Mar 7 00:54:40.865726 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 7 00:54:40.871224 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 7 00:54:40.883525 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 7 00:54:40.887203 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 7 00:54:40.890223 systemd[1]: Reached target sockets.target - Socket Units. Mar 7 00:54:40.892771 systemd[1]: Reached target basic.target - Basic System. Mar 7 00:54:40.895267 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 7 00:54:40.895555 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 7 00:54:40.906042 systemd[1]: Starting containerd.service - containerd container runtime... Mar 7 00:54:40.914120 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 7 00:54:40.923102 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 7 00:54:40.933512 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 7 00:54:40.948129 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 7 00:54:40.950656 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 7 00:54:40.954070 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 7 00:54:40.961052 systemd[1]: Started ntpd.service - Network Time Service. 
Mar 7 00:54:40.968893 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 7 00:54:40.983900 systemd[1]: Starting setup-oem.service - Setup OEM... Mar 7 00:54:40.993078 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 7 00:54:40.999086 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 7 00:54:41.009058 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 7 00:54:41.014382 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 7 00:54:41.015332 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 7 00:54:41.018191 systemd[1]: Starting update-engine.service - Update Engine... Mar 7 00:54:41.022508 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 7 00:54:41.043352 jq[1996]: false Mar 7 00:54:41.071048 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 7 00:54:41.071576 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 7 00:54:41.108441 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 7 00:54:41.108912 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
Mar 7 00:54:41.166948 (ntainerd)[2022]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 7 00:54:41.186273 dbus-daemon[1995]: [system] SELinux support is enabled Mar 7 00:54:41.203485 dbus-daemon[1995]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.0' (uid=244 pid=1942 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Mar 7 00:54:41.204567 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 7 00:54:41.207343 extend-filesystems[1997]: Found loop4 Mar 7 00:54:41.207343 extend-filesystems[1997]: Found loop5 Mar 7 00:54:41.207343 extend-filesystems[1997]: Found loop6 Mar 7 00:54:41.207343 extend-filesystems[1997]: Found loop7 Mar 7 00:54:41.207343 extend-filesystems[1997]: Found nvme0n1 Mar 7 00:54:41.207343 extend-filesystems[1997]: Found nvme0n1p1 Mar 7 00:54:41.207343 extend-filesystems[1997]: Found nvme0n1p2 Mar 7 00:54:41.207343 extend-filesystems[1997]: Found nvme0n1p3 Mar 7 00:54:41.207343 extend-filesystems[1997]: Found usr Mar 7 00:54:41.207343 extend-filesystems[1997]: Found nvme0n1p4 Mar 7 00:54:41.207343 extend-filesystems[1997]: Found nvme0n1p6 Mar 7 00:54:41.207343 extend-filesystems[1997]: Found nvme0n1p7 Mar 7 00:54:41.207343 extend-filesystems[1997]: Found nvme0n1p9 Mar 7 00:54:41.207343 extend-filesystems[1997]: Checking size of /dev/nvme0n1p9 Mar 7 00:54:41.216419 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). 
Mar 7 00:54:41.287066 jq[2006]: true Mar 7 00:54:41.253443 ntpd[1999]: ntpd 4.2.8p17@1.4004-o Fri Mar 6 22:14:43 UTC 2026 (1): Starting Mar 7 00:54:41.289171 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: ntpd 4.2.8p17@1.4004-o Fri Mar 6 22:14:43 UTC 2026 (1): Starting Mar 7 00:54:41.289171 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Mar 7 00:54:41.289171 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: ---------------------------------------------------- Mar 7 00:54:41.289171 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: ntp-4 is maintained by Network Time Foundation, Mar 7 00:54:41.289171 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Mar 7 00:54:41.289171 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: corporation. Support and training for ntp-4 are Mar 7 00:54:41.289171 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: available at https://www.nwtime.org/support Mar 7 00:54:41.289171 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: ---------------------------------------------------- Mar 7 00:54:41.289171 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: proto: precision = 0.108 usec (-23) Mar 7 00:54:41.289171 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: basedate set to 2026-02-22 Mar 7 00:54:41.289171 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: gps base set to 2026-02-22 (week 2407) Mar 7 00:54:41.216493 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 7 00:54:41.258889 ntpd[1999]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Mar 7 00:54:41.227551 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 7 00:54:41.258933 ntpd[1999]: ---------------------------------------------------- Mar 7 00:54:41.227605 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. 
Mar 7 00:54:41.258955 ntpd[1999]: ntp-4 is maintained by Network Time Foundation, Mar 7 00:54:41.288331 systemd[1]: motdgen.service: Deactivated successfully. Mar 7 00:54:41.258975 ntpd[1999]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Mar 7 00:54:41.290808 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 7 00:54:41.259004 ntpd[1999]: corporation. Support and training for ntp-4 are Mar 7 00:54:41.329149 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: Listen and drop on 0 v6wildcard [::]:123 Mar 7 00:54:41.329149 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Mar 7 00:54:41.329149 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: Listen normally on 2 lo 127.0.0.1:123 Mar 7 00:54:41.329149 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: Listen normally on 3 eth0 172.31.23.166:123 Mar 7 00:54:41.329149 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: Listen normally on 4 lo [::1]:123 Mar 7 00:54:41.329149 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: bind(21) AF_INET6 fe80::4fe:fdff:fe3f:f905%2#123 flags 0x11 failed: Cannot assign requested address Mar 7 00:54:41.329149 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: unable to create socket on eth0 (5) for fe80::4fe:fdff:fe3f:f905%2#123 Mar 7 00:54:41.329149 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: failed to init interface for address fe80::4fe:fdff:fe3f:f905%2 Mar 7 00:54:41.329149 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: Listening on routing socket on fd #21 for interface updates Mar 7 00:54:41.313976 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... 
Mar 7 00:54:41.259023 ntpd[1999]: available at https://www.nwtime.org/support Mar 7 00:54:41.259042 ntpd[1999]: ---------------------------------------------------- Mar 7 00:54:41.275445 ntpd[1999]: proto: precision = 0.108 usec (-23) Mar 7 00:54:41.277204 ntpd[1999]: basedate set to 2026-02-22 Mar 7 00:54:41.277256 ntpd[1999]: gps base set to 2026-02-22 (week 2407) Mar 7 00:54:41.286362 dbus-daemon[1995]: [system] Successfully activated service 'org.freedesktop.systemd1' Mar 7 00:54:41.309668 ntpd[1999]: Listen and drop on 0 v6wildcard [::]:123 Mar 7 00:54:41.309772 ntpd[1999]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Mar 7 00:54:41.318772 ntpd[1999]: Listen normally on 2 lo 127.0.0.1:123 Mar 7 00:54:41.318891 ntpd[1999]: Listen normally on 3 eth0 172.31.23.166:123 Mar 7 00:54:41.318964 ntpd[1999]: Listen normally on 4 lo [::1]:123 Mar 7 00:54:41.319053 ntpd[1999]: bind(21) AF_INET6 fe80::4fe:fdff:fe3f:f905%2#123 flags 0x11 failed: Cannot assign requested address Mar 7 00:54:41.319094 ntpd[1999]: unable to create socket on eth0 (5) for fe80::4fe:fdff:fe3f:f905%2#123 Mar 7 00:54:41.319124 ntpd[1999]: failed to init interface for address fe80::4fe:fdff:fe3f:f905%2 Mar 7 00:54:41.319184 ntpd[1999]: Listening on routing socket on fd #21 for interface updates Mar 7 00:54:41.346513 extend-filesystems[1997]: Resized partition /dev/nvme0n1p9 Mar 7 00:54:41.351131 jq[2033]: true Mar 7 00:54:41.362786 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 7 00:54:41.362786 ntpd[1999]: 7 Mar 00:54:41 ntpd[1999]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 7 00:54:41.362961 extend-filesystems[2044]: resize2fs 1.47.1 (20-May-2024) Mar 7 00:54:41.355855 ntpd[1999]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 7 00:54:41.355934 ntpd[1999]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 7 00:54:41.394619 tar[2028]: linux-arm64/LICENSE Mar 7 00:54:41.394619 tar[2028]: linux-arm64/helm Mar 7 
00:54:41.399531 systemd[1]: Finished setup-oem.service - Setup OEM. Mar 7 00:54:41.412061 kernel: EXT4-fs (nvme0n1p9): resizing filesystem from 553472 to 3587067 blocks Mar 7 00:54:41.412053 systemd-logind[2004]: Watching system buttons on /dev/input/event0 (Power Button) Mar 7 00:54:41.412090 systemd-logind[2004]: Watching system buttons on /dev/input/event1 (Sleep Button) Mar 7 00:54:41.412561 systemd-logind[2004]: New seat seat0. Mar 7 00:54:41.414783 systemd[1]: Started systemd-logind.service - User Login Management. Mar 7 00:54:41.533703 update_engine[2005]: I20260307 00:54:41.532176 2005 main.cc:92] Flatcar Update Engine starting Mar 7 00:54:41.550718 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 36 scanned by (udev-worker) (1761) Mar 7 00:54:41.547330 systemd[1]: Started update-engine.service - Update Engine. Mar 7 00:54:41.554595 update_engine[2005]: I20260307 00:54:41.553966 2005 update_check_scheduler.cc:74] Next update check in 4m44s Mar 7 00:54:41.584378 kernel: EXT4-fs (nvme0n1p9): resized filesystem to 3587067 Mar 7 00:54:41.618138 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 7 00:54:41.633269 extend-filesystems[2044]: Filesystem at /dev/nvme0n1p9 is mounted on /; on-line resizing required Mar 7 00:54:41.633269 extend-filesystems[2044]: old_desc_blocks = 1, new_desc_blocks = 2 Mar 7 00:54:41.633269 extend-filesystems[2044]: The filesystem on /dev/nvme0n1p9 is now 3587067 (4k) blocks long. Mar 7 00:54:41.649508 extend-filesystems[1997]: Resized filesystem in /dev/nvme0n1p9 Mar 7 00:54:41.652168 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 7 00:54:41.654380 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
Mar 7 00:54:41.680828 coreos-metadata[1994]: Mar 07 00:54:41.680 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Mar 7 00:54:41.689324 coreos-metadata[1994]: Mar 07 00:54:41.687 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-id: Attempt #1 Mar 7 00:54:41.693164 coreos-metadata[1994]: Mar 07 00:54:41.693 INFO Fetch successful Mar 7 00:54:41.693164 coreos-metadata[1994]: Mar 07 00:54:41.693 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-type: Attempt #1 Mar 7 00:54:41.700269 coreos-metadata[1994]: Mar 07 00:54:41.697 INFO Fetch successful Mar 7 00:54:41.700269 coreos-metadata[1994]: Mar 07 00:54:41.697 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/local-ipv4: Attempt #1 Mar 7 00:54:41.700269 coreos-metadata[1994]: Mar 07 00:54:41.698 INFO Fetch successful Mar 7 00:54:41.700269 coreos-metadata[1994]: Mar 07 00:54:41.698 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-ipv4: Attempt #1 Mar 7 00:54:41.700269 coreos-metadata[1994]: Mar 07 00:54:41.699 INFO Fetch successful Mar 7 00:54:41.700269 coreos-metadata[1994]: Mar 07 00:54:41.699 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/ipv6: Attempt #1 Mar 7 00:54:41.701167 coreos-metadata[1994]: Mar 07 00:54:41.700 INFO Fetch failed with 404: resource not found Mar 7 00:54:41.701167 coreos-metadata[1994]: Mar 07 00:54:41.700 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone: Attempt #1 Mar 7 00:54:41.702669 coreos-metadata[1994]: Mar 07 00:54:41.702 INFO Fetch successful Mar 7 00:54:41.702669 coreos-metadata[1994]: Mar 07 00:54:41.702 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone-id: Attempt #1 Mar 7 00:54:41.705744 coreos-metadata[1994]: Mar 07 00:54:41.704 INFO Fetch successful Mar 7 00:54:41.705744 coreos-metadata[1994]: Mar 07 00:54:41.705 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/hostname: Attempt #1 Mar 7 00:54:41.712300 
coreos-metadata[1994]: Mar 07 00:54:41.712 INFO Fetch successful Mar 7 00:54:41.712300 coreos-metadata[1994]: Mar 07 00:54:41.712 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-hostname: Attempt #1 Mar 7 00:54:41.712810 bash[2084]: Updated "/home/core/.ssh/authorized_keys" Mar 7 00:54:41.718961 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 7 00:54:41.722076 coreos-metadata[1994]: Mar 07 00:54:41.719 INFO Fetch successful Mar 7 00:54:41.722076 coreos-metadata[1994]: Mar 07 00:54:41.720 INFO Fetching http://169.254.169.254/2021-01-03/dynamic/instance-identity/document: Attempt #1 Mar 7 00:54:41.730049 coreos-metadata[1994]: Mar 07 00:54:41.729 INFO Fetch successful Mar 7 00:54:41.733273 systemd[1]: Starting sshkeys.service... Mar 7 00:54:41.874072 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Mar 7 00:54:41.880325 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Mar 7 00:54:41.966898 systemd-networkd[1942]: eth0: Gained IPv6LL Mar 7 00:54:41.996392 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 7 00:54:42.001414 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 7 00:54:42.011409 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 7 00:54:42.014659 systemd[1]: Reached target network-online.target - Network is Online. Mar 7 00:54:42.023929 systemd[1]: Started amazon-ssm-agent.service - amazon-ssm-agent. Mar 7 00:54:42.039384 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 00:54:42.046418 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 7 00:54:42.048981 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. 
Mar 7 00:54:42.061941 dbus-daemon[1995]: [system] Successfully activated service 'org.freedesktop.hostname1' Mar 7 00:54:42.067209 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Mar 7 00:54:42.075909 dbus-daemon[1995]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.5' (uid=0 pid=2041 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Mar 7 00:54:42.090392 systemd[1]: Starting polkit.service - Authorization Manager... Mar 7 00:54:42.100990 locksmithd[2064]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 7 00:54:42.231498 polkitd[2147]: Started polkitd version 121 Mar 7 00:54:42.255489 polkitd[2147]: Loading rules from directory /etc/polkit-1/rules.d Mar 7 00:54:42.257911 polkitd[2147]: Loading rules from directory /usr/share/polkit-1/rules.d Mar 7 00:54:42.259670 polkitd[2147]: Finished loading, compiling and executing 2 rules Mar 7 00:54:42.263139 dbus-daemon[1995]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Mar 7 00:54:42.263502 systemd[1]: Started polkit.service - Authorization Manager. Mar 7 00:54:42.267834 polkitd[2147]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Mar 7 00:54:42.354982 systemd-hostnamed[2041]: Hostname set to (transient) Mar 7 00:54:42.359631 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 7 00:54:42.370723 systemd-resolved[1943]: System hostname changed to 'ip-172-31-23-166'. Mar 7 00:54:42.433455 amazon-ssm-agent[2136]: Initializing new seelog logger Mar 7 00:54:42.433455 amazon-ssm-agent[2136]: New Seelog Logger Creation Complete Mar 7 00:54:42.433455 amazon-ssm-agent[2136]: 2026/03/07 00:54:42 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Mar 7 00:54:42.433455 amazon-ssm-agent[2136]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. 
Mar 7 00:54:42.433455 amazon-ssm-agent[2136]: 2026/03/07 00:54:42 processing appconfig overrides Mar 7 00:54:42.446488 amazon-ssm-agent[2136]: 2026/03/07 00:54:42 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Mar 7 00:54:42.446488 amazon-ssm-agent[2136]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Mar 7 00:54:42.446488 amazon-ssm-agent[2136]: 2026/03/07 00:54:42 processing appconfig overrides Mar 7 00:54:42.446488 amazon-ssm-agent[2136]: 2026-03-07 00:54:42 INFO Proxy environment variables: Mar 7 00:54:42.448815 amazon-ssm-agent[2136]: 2026/03/07 00:54:42 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Mar 7 00:54:42.448815 amazon-ssm-agent[2136]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Mar 7 00:54:42.452889 amazon-ssm-agent[2136]: 2026/03/07 00:54:42 processing appconfig overrides Mar 7 00:54:42.467258 amazon-ssm-agent[2136]: 2026/03/07 00:54:42 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Mar 7 00:54:42.467258 amazon-ssm-agent[2136]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. 
Mar 7 00:54:42.467258 amazon-ssm-agent[2136]: 2026/03/07 00:54:42 processing appconfig overrides Mar 7 00:54:42.547817 amazon-ssm-agent[2136]: 2026-03-07 00:54:42 INFO https_proxy: Mar 7 00:54:42.560815 coreos-metadata[2117]: Mar 07 00:54:42.560 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Mar 7 00:54:42.567661 coreos-metadata[2117]: Mar 07 00:54:42.564 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys: Attempt #1 Mar 7 00:54:42.571006 coreos-metadata[2117]: Mar 07 00:54:42.568 INFO Fetch successful Mar 7 00:54:42.571006 coreos-metadata[2117]: Mar 07 00:54:42.568 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys/0/openssh-key: Attempt #1 Mar 7 00:54:42.575720 coreos-metadata[2117]: Mar 07 00:54:42.573 INFO Fetch successful Mar 7 00:54:42.586953 unknown[2117]: wrote ssh authorized keys file for user: core Mar 7 00:54:42.654731 amazon-ssm-agent[2136]: 2026-03-07 00:54:42 INFO http_proxy: Mar 7 00:54:42.664733 update-ssh-keys[2210]: Updated "/home/core/.ssh/authorized_keys" Mar 7 00:54:42.666796 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Mar 7 00:54:42.681339 systemd[1]: Finished sshkeys.service. Mar 7 00:54:42.757232 amazon-ssm-agent[2136]: 2026-03-07 00:54:42 INFO no_proxy: Mar 7 00:54:42.767394 containerd[2022]: time="2026-03-07T00:54:42.767224537Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Mar 7 00:54:42.868791 amazon-ssm-agent[2136]: 2026-03-07 00:54:42 INFO Checking if agent identity type OnPrem can be assumed Mar 7 00:54:42.922417 containerd[2022]: time="2026-03-07T00:54:42.922340126Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Mar 7 00:54:42.928113 containerd[2022]: time="2026-03-07T00:54:42.928014746Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." 
error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.127-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Mar 7 00:54:42.928313 containerd[2022]: time="2026-03-07T00:54:42.928271270Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Mar 7 00:54:42.928454 containerd[2022]: time="2026-03-07T00:54:42.928416398Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Mar 7 00:54:42.931775 containerd[2022]: time="2026-03-07T00:54:42.930548162Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Mar 7 00:54:42.931775 containerd[2022]: time="2026-03-07T00:54:42.930626282Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Mar 7 00:54:42.931775 containerd[2022]: time="2026-03-07T00:54:42.930886526Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Mar 7 00:54:42.931775 containerd[2022]: time="2026-03-07T00:54:42.930933386Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Mar 7 00:54:42.931775 containerd[2022]: time="2026-03-07T00:54:42.931302746Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 7 00:54:42.931775 containerd[2022]: time="2026-03-07T00:54:42.931350074Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." 
type=io.containerd.snapshotter.v1 Mar 7 00:54:42.931775 containerd[2022]: time="2026-03-07T00:54:42.931385138Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Mar 7 00:54:42.931775 containerd[2022]: time="2026-03-07T00:54:42.931411802Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Mar 7 00:54:42.931775 containerd[2022]: time="2026-03-07T00:54:42.931643018Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Mar 7 00:54:42.932986 containerd[2022]: time="2026-03-07T00:54:42.932917418Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Mar 7 00:54:42.934726 containerd[2022]: time="2026-03-07T00:54:42.934297742Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 7 00:54:42.934726 containerd[2022]: time="2026-03-07T00:54:42.934363298Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Mar 7 00:54:42.934726 containerd[2022]: time="2026-03-07T00:54:42.934629950Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Mar 7 00:54:42.935642 containerd[2022]: time="2026-03-07T00:54:42.935282270Z" level=info msg="metadata content store policy set" policy=shared Mar 7 00:54:42.951570 containerd[2022]: time="2026-03-07T00:54:42.950513810Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Mar 7 00:54:42.951570 containerd[2022]: time="2026-03-07T00:54:42.950739830Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." 
type=io.containerd.differ.v1 Mar 7 00:54:42.951570 containerd[2022]: time="2026-03-07T00:54:42.950884238Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Mar 7 00:54:42.951570 containerd[2022]: time="2026-03-07T00:54:42.950934350Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Mar 7 00:54:42.951570 containerd[2022]: time="2026-03-07T00:54:42.950970290Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Mar 7 00:54:42.951570 containerd[2022]: time="2026-03-07T00:54:42.951282338Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Mar 7 00:54:42.955706 containerd[2022]: time="2026-03-07T00:54:42.953392682Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Mar 7 00:54:42.955706 containerd[2022]: time="2026-03-07T00:54:42.953780474Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Mar 7 00:54:42.955706 containerd[2022]: time="2026-03-07T00:54:42.953828258Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Mar 7 00:54:42.955706 containerd[2022]: time="2026-03-07T00:54:42.953862530Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Mar 7 00:54:42.955706 containerd[2022]: time="2026-03-07T00:54:42.953895266Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Mar 7 00:54:42.955706 containerd[2022]: time="2026-03-07T00:54:42.953937482Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." 
type=io.containerd.service.v1 Mar 7 00:54:42.955706 containerd[2022]: time="2026-03-07T00:54:42.953968058Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Mar 7 00:54:42.955706 containerd[2022]: time="2026-03-07T00:54:42.953999894Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Mar 7 00:54:42.955706 containerd[2022]: time="2026-03-07T00:54:42.954031802Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Mar 7 00:54:42.955706 containerd[2022]: time="2026-03-07T00:54:42.954068150Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Mar 7 00:54:42.955706 containerd[2022]: time="2026-03-07T00:54:42.954103550Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Mar 7 00:54:42.955706 containerd[2022]: time="2026-03-07T00:54:42.954132938Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Mar 7 00:54:42.955706 containerd[2022]: time="2026-03-07T00:54:42.954175838Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Mar 7 00:54:42.955706 containerd[2022]: time="2026-03-07T00:54:42.954210422Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Mar 7 00:54:42.956416 containerd[2022]: time="2026-03-07T00:54:42.954240602Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Mar 7 00:54:42.956416 containerd[2022]: time="2026-03-07T00:54:42.954272714Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." 
type=io.containerd.grpc.v1
Mar 7 00:54:42.956416 containerd[2022]: time="2026-03-07T00:54:42.954302714Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Mar 7 00:54:42.956416 containerd[2022]: time="2026-03-07T00:54:42.954334622Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Mar 7 00:54:42.956416 containerd[2022]: time="2026-03-07T00:54:42.954364790Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Mar 7 00:54:42.956416 containerd[2022]: time="2026-03-07T00:54:42.954398678Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Mar 7 00:54:42.956416 containerd[2022]: time="2026-03-07T00:54:42.954433058Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Mar 7 00:54:42.956416 containerd[2022]: time="2026-03-07T00:54:42.954469346Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Mar 7 00:54:42.956416 containerd[2022]: time="2026-03-07T00:54:42.954501926Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Mar 7 00:54:42.956416 containerd[2022]: time="2026-03-07T00:54:42.954535046Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Mar 7 00:54:42.956416 containerd[2022]: time="2026-03-07T00:54:42.954565526Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Mar 7 00:54:42.956416 containerd[2022]: time="2026-03-07T00:54:42.954619262Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Mar 7 00:54:42.962734 containerd[2022]: time="2026-03-07T00:54:42.954670430Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Mar 7 00:54:42.962734 containerd[2022]: time="2026-03-07T00:54:42.959631698Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Mar 7 00:54:42.962734 containerd[2022]: time="2026-03-07T00:54:42.959671274Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Mar 7 00:54:42.962734 containerd[2022]: time="2026-03-07T00:54:42.961078910Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Mar 7 00:54:42.962734 containerd[2022]: time="2026-03-07T00:54:42.961154774Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Mar 7 00:54:42.962734 containerd[2022]: time="2026-03-07T00:54:42.961184930Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Mar 7 00:54:42.962734 containerd[2022]: time="2026-03-07T00:54:42.961233914Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Mar 7 00:54:42.962734 containerd[2022]: time="2026-03-07T00:54:42.961267346Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Mar 7 00:54:42.962734 containerd[2022]: time="2026-03-07T00:54:42.961304294Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Mar 7 00:54:42.962734 containerd[2022]: time="2026-03-07T00:54:42.961330790Z" level=info msg="NRI interface is disabled by configuration."
Mar 7 00:54:42.962734 containerd[2022]: time="2026-03-07T00:54:42.961358822Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Mar 7 00:54:42.963356 containerd[2022]: time="2026-03-07T00:54:42.962086130Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Mar 7 00:54:42.963356 containerd[2022]: time="2026-03-07T00:54:42.962215790Z" level=info msg="Connect containerd service"
Mar 7 00:54:42.963356 containerd[2022]: time="2026-03-07T00:54:42.962296754Z" level=info msg="using legacy CRI server"
Mar 7 00:54:42.963356 containerd[2022]: time="2026-03-07T00:54:42.962316038Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Mar 7 00:54:42.963356 containerd[2022]: time="2026-03-07T00:54:42.962469074Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Mar 7 00:54:42.969970 amazon-ssm-agent[2136]: 2026-03-07 00:54:42 INFO Checking if agent identity type EC2 can be assumed
Mar 7 00:54:42.970111 containerd[2022]: time="2026-03-07T00:54:42.968894150Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 7 00:54:42.972915 containerd[2022]: time="2026-03-07T00:54:42.971927750Z" level=info msg="Start subscribing containerd event"
Mar 7 00:54:42.972915 containerd[2022]: time="2026-03-07T00:54:42.972052310Z" level=info msg="Start recovering state"
Mar 7 00:54:42.972915 containerd[2022]: time="2026-03-07T00:54:42.972199538Z" level=info msg="Start event monitor"
Mar 7 00:54:42.972915 containerd[2022]: time="2026-03-07T00:54:42.972228122Z" level=info msg="Start snapshots syncer"
Mar 7 00:54:42.972915 containerd[2022]: time="2026-03-07T00:54:42.972250286Z" level=info msg="Start cni network conf syncer for default"
Mar 7 00:54:42.972915 containerd[2022]: time="2026-03-07T00:54:42.972269546Z" level=info msg="Start streaming server"
Mar 7 00:54:42.976377 containerd[2022]: time="2026-03-07T00:54:42.976319966Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Mar 7 00:54:42.981234 containerd[2022]: time="2026-03-07T00:54:42.977584574Z" level=info msg=serving... address=/run/containerd/containerd.sock
Mar 7 00:54:42.990873 containerd[2022]: time="2026-03-07T00:54:42.987276770Z" level=info msg="containerd successfully booted in 0.227860s"
Mar 7 00:54:42.987432 systemd[1]: Started containerd.service - containerd container runtime.
Mar 7 00:54:43.065846 amazon-ssm-agent[2136]: 2026-03-07 00:54:42 INFO Agent will take identity from EC2
Mar 7 00:54:43.165200 amazon-ssm-agent[2136]: 2026-03-07 00:54:42 INFO [amazon-ssm-agent] using named pipe channel for IPC
Mar 7 00:54:43.264452 amazon-ssm-agent[2136]: 2026-03-07 00:54:42 INFO [amazon-ssm-agent] using named pipe channel for IPC
Mar 7 00:54:43.364792 amazon-ssm-agent[2136]: 2026-03-07 00:54:42 INFO [amazon-ssm-agent] using named pipe channel for IPC
Mar 7 00:54:43.465899 amazon-ssm-agent[2136]: 2026-03-07 00:54:42 INFO [amazon-ssm-agent] amazon-ssm-agent - v3.2.0.0
Mar 7 00:54:43.566142 amazon-ssm-agent[2136]: 2026-03-07 00:54:42 INFO [amazon-ssm-agent] OS: linux, Arch: arm64
Mar 7 00:54:43.672727 amazon-ssm-agent[2136]: 2026-03-07 00:54:42 INFO [amazon-ssm-agent] Starting Core Agent
Mar 7 00:54:43.770202 amazon-ssm-agent[2136]: 2026-03-07 00:54:42 INFO [amazon-ssm-agent] registrar detected. Attempting registration
Mar 7 00:54:43.878586 amazon-ssm-agent[2136]: 2026-03-07 00:54:42 INFO [Registrar] Starting registrar module
Mar 7 00:54:43.895510 amazon-ssm-agent[2136]: 2026-03-07 00:54:42 INFO [EC2Identity] no registration info found for ec2 instance, attempting registration
Mar 7 00:54:43.895510 amazon-ssm-agent[2136]: 2026-03-07 00:54:43 INFO [EC2Identity] EC2 registration was successful.
Mar 7 00:54:43.895676 amazon-ssm-agent[2136]: 2026-03-07 00:54:43 INFO [CredentialRefresher] credentialRefresher has started
Mar 7 00:54:43.895676 amazon-ssm-agent[2136]: 2026-03-07 00:54:43 INFO [CredentialRefresher] Starting credentials refresher loop
Mar 7 00:54:43.895676 amazon-ssm-agent[2136]: 2026-03-07 00:54:43 INFO EC2RoleProvider Successfully connected with instance profile role credentials
Mar 7 00:54:43.922780 tar[2028]: linux-arm64/README.md
Mar 7 00:54:43.961542 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Mar 7 00:54:43.980920 amazon-ssm-agent[2136]: 2026-03-07 00:54:43 INFO [CredentialRefresher] Next credential rotation will be in 30.041658513166666 minutes
Mar 7 00:54:44.267148 ntpd[1999]: Listen normally on 6 eth0 [fe80::4fe:fdff:fe3f:f905%2]:123
Mar 7 00:54:44.270208 ntpd[1999]: 7 Mar 00:54:44 ntpd[1999]: Listen normally on 6 eth0 [fe80::4fe:fdff:fe3f:f905%2]:123
Mar 7 00:54:44.407261 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 00:54:44.420784 (kubelet)[2226]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 7 00:54:44.960921 amazon-ssm-agent[2136]: 2026-03-07 00:54:44 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker is not running, starting worker process
Mar 7 00:54:45.063738 amazon-ssm-agent[2136]: 2026-03-07 00:54:44 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker (pid:2232) started
Mar 7 00:54:45.163196 amazon-ssm-agent[2136]: 2026-03-07 00:54:44 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] Monitor long running worker health every 60 seconds
Mar 7 00:54:45.277771 kubelet[2226]: E0307 00:54:45.276528 2226 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 7 00:54:45.283101 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 7 00:54:45.283889 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 7 00:54:45.286874 systemd[1]: kubelet.service: Consumed 1.321s CPU time.
Mar 7 00:54:45.583031 sshd_keygen[2035]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 7 00:54:45.621886 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 7 00:54:45.634262 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 7 00:54:45.650870 systemd[1]: Started sshd@0-172.31.23.166:22-20.161.92.111:53370.service - OpenSSH per-connection server daemon (20.161.92.111:53370).
Mar 7 00:54:45.657308 systemd[1]: issuegen.service: Deactivated successfully.
Mar 7 00:54:45.658163 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 7 00:54:45.676214 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 7 00:54:45.702914 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 7 00:54:45.720365 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 7 00:54:45.727289 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Mar 7 00:54:45.733025 systemd[1]: Reached target getty.target - Login Prompts.
Mar 7 00:54:45.737618 systemd[1]: Reached target multi-user.target - Multi-User System.
Mar 7 00:54:45.743784 systemd[1]: Startup finished in 1.244s (kernel) + 9.256s (initrd) + 10.714s (userspace) = 21.216s.
Mar 7 00:54:46.176735 sshd[2253]: Accepted publickey for core from 20.161.92.111 port 53370 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw
Mar 7 00:54:46.179570 sshd[2253]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 00:54:46.196465 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 7 00:54:46.204198 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Mar 7 00:54:46.209505 systemd-logind[2004]: New session 1 of user core.
Mar 7 00:54:46.238271 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Mar 7 00:54:46.247266 systemd[1]: Starting user@500.service - User Manager for UID 500...
Mar 7 00:54:46.264162 (systemd)[2268]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Mar 7 00:54:46.507306 systemd[2268]: Queued start job for default target default.target.
Mar 7 00:54:46.521152 systemd[2268]: Created slice app.slice - User Application Slice.
Mar 7 00:54:46.521249 systemd[2268]: Reached target paths.target - Paths.
Mar 7 00:54:46.521284 systemd[2268]: Reached target timers.target - Timers.
Mar 7 00:54:46.531911 systemd[2268]: Starting dbus.socket - D-Bus User Message Bus Socket...
Mar 7 00:54:46.555361 systemd[2268]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Mar 7 00:54:46.555664 systemd[2268]: Reached target sockets.target - Sockets.
Mar 7 00:54:46.555759 systemd[2268]: Reached target basic.target - Basic System.
Mar 7 00:54:46.555877 systemd[2268]: Reached target default.target - Main User Target.
Mar 7 00:54:46.556027 systemd[2268]: Startup finished in 279ms.
Mar 7 00:54:46.556342 systemd[1]: Started user@500.service - User Manager for UID 500.
Mar 7 00:54:46.564022 systemd[1]: Started session-1.scope - Session 1 of User core.
Mar 7 00:54:46.941112 systemd[1]: Started sshd@1-172.31.23.166:22-20.161.92.111:53382.service - OpenSSH per-connection server daemon (20.161.92.111:53382).
Mar 7 00:54:47.457506 sshd[2279]: Accepted publickey for core from 20.161.92.111 port 53382 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw
Mar 7 00:54:47.459204 sshd[2279]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 00:54:47.466653 systemd-logind[2004]: New session 2 of user core.
Mar 7 00:54:47.473936 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 7 00:54:47.816515 sshd[2279]: pam_unix(sshd:session): session closed for user core
Mar 7 00:54:47.824841 systemd[1]: sshd@1-172.31.23.166:22-20.161.92.111:53382.service: Deactivated successfully.
Mar 7 00:54:47.828128 systemd[1]: session-2.scope: Deactivated successfully.
Mar 7 00:54:47.831074 systemd-logind[2004]: Session 2 logged out. Waiting for processes to exit.
Mar 7 00:54:47.833128 systemd-logind[2004]: Removed session 2.
Mar 7 00:54:47.910215 systemd[1]: Started sshd@2-172.31.23.166:22-20.161.92.111:53386.service - OpenSSH per-connection server daemon (20.161.92.111:53386).
Mar 7 00:54:48.469049 systemd-resolved[1943]: Clock change detected. Flushing caches.
Mar 7 00:54:48.622255 sshd[2286]: Accepted publickey for core from 20.161.92.111 port 53386 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw
Mar 7 00:54:48.623993 sshd[2286]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 00:54:48.631363 systemd-logind[2004]: New session 3 of user core.
Mar 7 00:54:48.643739 systemd[1]: Started session-3.scope - Session 3 of User core.
Mar 7 00:54:48.970247 sshd[2286]: pam_unix(sshd:session): session closed for user core
Mar 7 00:54:48.976679 systemd[1]: sshd@2-172.31.23.166:22-20.161.92.111:53386.service: Deactivated successfully.
Mar 7 00:54:48.976960 systemd-logind[2004]: Session 3 logged out. Waiting for processes to exit.
Mar 7 00:54:48.980268 systemd[1]: session-3.scope: Deactivated successfully.
Mar 7 00:54:48.984013 systemd-logind[2004]: Removed session 3.
Mar 7 00:54:49.059629 systemd[1]: Started sshd@3-172.31.23.166:22-20.161.92.111:53402.service - OpenSSH per-connection server daemon (20.161.92.111:53402).
Mar 7 00:54:49.575442 sshd[2293]: Accepted publickey for core from 20.161.92.111 port 53402 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw
Mar 7 00:54:49.577148 sshd[2293]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 00:54:49.586489 systemd-logind[2004]: New session 4 of user core.
Mar 7 00:54:49.597643 systemd[1]: Started session-4.scope - Session 4 of User core.
Mar 7 00:54:49.934593 sshd[2293]: pam_unix(sshd:session): session closed for user core
Mar 7 00:54:49.939600 systemd[1]: sshd@3-172.31.23.166:22-20.161.92.111:53402.service: Deactivated successfully.
Mar 7 00:54:49.942745 systemd[1]: session-4.scope: Deactivated successfully.
Mar 7 00:54:49.945947 systemd-logind[2004]: Session 4 logged out. Waiting for processes to exit.
Mar 7 00:54:49.948139 systemd-logind[2004]: Removed session 4.
Mar 7 00:54:50.023500 systemd[1]: Started sshd@4-172.31.23.166:22-20.161.92.111:53418.service - OpenSSH per-connection server daemon (20.161.92.111:53418).
Mar 7 00:54:50.528370 sshd[2300]: Accepted publickey for core from 20.161.92.111 port 53418 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw
Mar 7 00:54:50.530043 sshd[2300]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 00:54:50.537478 systemd-logind[2004]: New session 5 of user core.
Mar 7 00:54:50.546665 systemd[1]: Started session-5.scope - Session 5 of User core.
Mar 7 00:54:50.843899 sudo[2303]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 7 00:54:50.844598 sudo[2303]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 7 00:54:50.860465 sudo[2303]: pam_unix(sudo:session): session closed for user root
Mar 7 00:54:50.939247 sshd[2300]: pam_unix(sshd:session): session closed for user core
Mar 7 00:54:50.947079 systemd[1]: sshd@4-172.31.23.166:22-20.161.92.111:53418.service: Deactivated successfully.
Mar 7 00:54:50.947441 systemd-logind[2004]: Session 5 logged out. Waiting for processes to exit.
Mar 7 00:54:50.951858 systemd[1]: session-5.scope: Deactivated successfully.
Mar 7 00:54:50.956200 systemd-logind[2004]: Removed session 5.
Mar 7 00:54:51.035893 systemd[1]: Started sshd@5-172.31.23.166:22-20.161.92.111:55630.service - OpenSSH per-connection server daemon (20.161.92.111:55630).
Mar 7 00:54:51.536317 sshd[2308]: Accepted publickey for core from 20.161.92.111 port 55630 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw
Mar 7 00:54:51.539065 sshd[2308]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 00:54:51.548342 systemd-logind[2004]: New session 6 of user core.
Mar 7 00:54:51.553710 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 7 00:54:51.818590 sudo[2312]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 7 00:54:51.819234 sudo[2312]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 7 00:54:51.825746 sudo[2312]: pam_unix(sudo:session): session closed for user root
Mar 7 00:54:51.835825 sudo[2311]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules
Mar 7 00:54:51.836503 sudo[2311]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 7 00:54:51.863272 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules...
Mar 7 00:54:51.866491 auditctl[2315]: No rules
Mar 7 00:54:51.867542 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 7 00:54:51.867882 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules.
Mar 7 00:54:51.875416 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Mar 7 00:54:51.923733 augenrules[2333]: No rules
Mar 7 00:54:51.928489 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Mar 7 00:54:51.931303 sudo[2311]: pam_unix(sudo:session): session closed for user root
Mar 7 00:54:52.009874 sshd[2308]: pam_unix(sshd:session): session closed for user core
Mar 7 00:54:52.015977 systemd[1]: sshd@5-172.31.23.166:22-20.161.92.111:55630.service: Deactivated successfully.
Mar 7 00:54:52.019147 systemd[1]: session-6.scope: Deactivated successfully.
Mar 7 00:54:52.020288 systemd-logind[2004]: Session 6 logged out. Waiting for processes to exit.
Mar 7 00:54:52.022619 systemd-logind[2004]: Removed session 6.
Mar 7 00:54:52.102591 systemd[1]: Started sshd@6-172.31.23.166:22-20.161.92.111:55640.service - OpenSSH per-connection server daemon (20.161.92.111:55640).
Mar 7 00:54:52.615211 sshd[2341]: Accepted publickey for core from 20.161.92.111 port 55640 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw
Mar 7 00:54:52.617766 sshd[2341]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 00:54:52.624859 systemd-logind[2004]: New session 7 of user core.
Mar 7 00:54:52.637646 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 7 00:54:52.897596 sudo[2344]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 7 00:54:52.898226 sudo[2344]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 7 00:54:53.531900 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 7 00:54:53.546226 (dockerd)[2359]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 7 00:54:54.078721 dockerd[2359]: time="2026-03-07T00:54:54.078598731Z" level=info msg="Starting up"
Mar 7 00:54:54.284842 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport240923505-merged.mount: Deactivated successfully.
Mar 7 00:54:54.323805 dockerd[2359]: time="2026-03-07T00:54:54.323743001Z" level=info msg="Loading containers: start."
Mar 7 00:54:54.554458 kernel: Initializing XFRM netlink socket
Mar 7 00:54:54.644318 (udev-worker)[2382]: Network interface NamePolicy= disabled on kernel command line.
Mar 7 00:54:54.738260 systemd-networkd[1942]: docker0: Link UP
Mar 7 00:54:54.769889 dockerd[2359]: time="2026-03-07T00:54:54.769811647Z" level=info msg="Loading containers: done."
Mar 7 00:54:54.814650 dockerd[2359]: time="2026-03-07T00:54:54.814437103Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 7 00:54:54.814877 dockerd[2359]: time="2026-03-07T00:54:54.814660303Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0
Mar 7 00:54:54.814940 dockerd[2359]: time="2026-03-07T00:54:54.814911775Z" level=info msg="Daemon has completed initialization"
Mar 7 00:54:54.882581 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 7 00:54:54.882954 dockerd[2359]: time="2026-03-07T00:54:54.882173467Z" level=info msg="API listen on /run/docker.sock"
Mar 7 00:54:55.604799 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 7 00:54:55.616822 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 00:54:55.665633 containerd[2022]: time="2026-03-07T00:54:55.665038099Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.2\""
Mar 7 00:54:56.022751 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 00:54:56.026942 (kubelet)[2510]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 7 00:54:56.103948 kubelet[2510]: E0307 00:54:56.103820 2510 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 7 00:54:56.111882 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 7 00:54:56.112504 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 7 00:54:56.417670 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1922669153.mount: Deactivated successfully.
Mar 7 00:54:57.852327 containerd[2022]: time="2026-03-07T00:54:57.852259294Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:54:57.855145 containerd[2022]: time="2026-03-07T00:54:57.855094906Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.35.2: active requests=0, bytes read=24701796"
Mar 7 00:54:57.856738 containerd[2022]: time="2026-03-07T00:54:57.856693930Z" level=info msg="ImageCreate event name:\"sha256:713a7d5fc5ed8383c9ffe550e487150c9818d05f0c4c012688fbb27885fcc7bf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:54:57.862995 containerd[2022]: time="2026-03-07T00:54:57.862915198Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:68cdc586f13b13edb7aa30a18155be530136a39cfd5ef8672aad8ccc98f0a7f7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:54:57.867088 containerd[2022]: time="2026-03-07T00:54:57.867015334Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.35.2\" with image id \"sha256:713a7d5fc5ed8383c9ffe550e487150c9818d05f0c4c012688fbb27885fcc7bf\", repo tag \"registry.k8s.io/kube-apiserver:v1.35.2\", repo digest \"registry.k8s.io/kube-apiserver@sha256:68cdc586f13b13edb7aa30a18155be530136a39cfd5ef8672aad8ccc98f0a7f7\", size \"24698395\" in 2.201875715s"
Mar 7 00:54:57.867329 containerd[2022]: time="2026-03-07T00:54:57.867270670Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.2\" returns image reference \"sha256:713a7d5fc5ed8383c9ffe550e487150c9818d05f0c4c012688fbb27885fcc7bf\""
Mar 7 00:54:57.869206 containerd[2022]: time="2026-03-07T00:54:57.869133694Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.2\""
Mar 7 00:54:59.302211 containerd[2022]: time="2026-03-07T00:54:59.302152605Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:54:59.304600 containerd[2022]: time="2026-03-07T00:54:59.304537701Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.35.2: active requests=0, bytes read=19063039"
Mar 7 00:54:59.305644 containerd[2022]: time="2026-03-07T00:54:59.305602209Z" level=info msg="ImageCreate event name:\"sha256:6137f51959af5f0a4da7fb6c0bd868f615a534c02d42e303ad6fb31345ee4854\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:54:59.311829 containerd[2022]: time="2026-03-07T00:54:59.311742525Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d9784320a41dd1b155c0ad8fdb5823d60c475870f3dd23865edde36b585748f2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:54:59.314228 containerd[2022]: time="2026-03-07T00:54:59.314178021Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.35.2\" with image id \"sha256:6137f51959af5f0a4da7fb6c0bd868f615a534c02d42e303ad6fb31345ee4854\", repo tag \"registry.k8s.io/kube-controller-manager:v1.35.2\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d9784320a41dd1b155c0ad8fdb5823d60c475870f3dd23865edde36b585748f2\", size \"20675140\" in 1.444706071s"
Mar 7 00:54:59.314559 containerd[2022]: time="2026-03-07T00:54:59.314375073Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.2\" returns image reference \"sha256:6137f51959af5f0a4da7fb6c0bd868f615a534c02d42e303ad6fb31345ee4854\""
Mar 7 00:54:59.316415 containerd[2022]: time="2026-03-07T00:54:59.316346541Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.2\""
Mar 7 00:55:00.630422 containerd[2022]: time="2026-03-07T00:55:00.630143376Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:00.635418 containerd[2022]: time="2026-03-07T00:55:00.634922664Z" level=info msg="ImageCreate event name:\"sha256:6ad431b09accba3ccc8ac6df4b239aa11c7adf8ee0a477b9f0b54cf9f083f8c6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:00.636171 containerd[2022]: time="2026-03-07T00:55:00.635596332Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.35.2: active requests=0, bytes read=13797901"
Mar 7 00:55:00.641633 containerd[2022]: time="2026-03-07T00:55:00.641579592Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:5833e2c4b779215efe7a48126c067de199e86aa5a86518693adeef16db0ff943\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:00.645874 containerd[2022]: time="2026-03-07T00:55:00.645816216Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.35.2\" with image id \"sha256:6ad431b09accba3ccc8ac6df4b239aa11c7adf8ee0a477b9f0b54cf9f083f8c6\", repo tag \"registry.k8s.io/kube-scheduler:v1.35.2\", repo digest \"registry.k8s.io/kube-scheduler@sha256:5833e2c4b779215efe7a48126c067de199e86aa5a86518693adeef16db0ff943\", size \"15410020\" in 1.329083911s"
Mar 7 00:55:00.646059 containerd[2022]: time="2026-03-07T00:55:00.646028340Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.2\" returns image reference \"sha256:6ad431b09accba3ccc8ac6df4b239aa11c7adf8ee0a477b9f0b54cf9f083f8c6\""
Mar 7 00:55:00.647019 containerd[2022]: time="2026-03-07T00:55:00.646812108Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.2\""
Mar 7 00:55:02.055537 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3989459085.mount: Deactivated successfully.
Mar 7 00:55:02.435817 containerd[2022]: time="2026-03-07T00:55:02.435722449Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:02.438593 containerd[2022]: time="2026-03-07T00:55:02.438524797Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.35.2: active requests=0, bytes read=22329583"
Mar 7 00:55:02.440699 containerd[2022]: time="2026-03-07T00:55:02.440621053Z" level=info msg="ImageCreate event name:\"sha256:df7dcaf93e84e5dfbe96b2f86588b38a8959748d9c84b2e0532e2b5ae1bc5884\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:02.445426 containerd[2022]: time="2026-03-07T00:55:02.445301173Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:015265214cc874b593a7adccdcfe4ac15d2b8e9ae89881bdcd5bcb99d42e1862\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:02.450865 containerd[2022]: time="2026-03-07T00:55:02.448612117Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.35.2\" with image id \"sha256:df7dcaf93e84e5dfbe96b2f86588b38a8959748d9c84b2e0532e2b5ae1bc5884\", repo tag \"registry.k8s.io/kube-proxy:v1.35.2\", repo digest \"registry.k8s.io/kube-proxy@sha256:015265214cc874b593a7adccdcfe4ac15d2b8e9ae89881bdcd5bcb99d42e1862\", size \"22328602\" in 1.801737093s"
Mar 7 00:55:02.450865 containerd[2022]: time="2026-03-07T00:55:02.448698949Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.2\" returns image reference \"sha256:df7dcaf93e84e5dfbe96b2f86588b38a8959748d9c84b2e0532e2b5ae1bc5884\""
Mar 7 00:55:02.451805 containerd[2022]: time="2026-03-07T00:55:02.451750981Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\""
Mar 7 00:55:03.050435 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3968939286.mount: Deactivated successfully.
Mar 7 00:55:04.353331 containerd[2022]: time="2026-03-07T00:55:04.353233358Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.13.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:04.355713 containerd[2022]: time="2026-03-07T00:55:04.355458770Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.13.1: active requests=0, bytes read=21172211"
Mar 7 00:55:04.357954 containerd[2022]: time="2026-03-07T00:55:04.357879710Z" level=info msg="ImageCreate event name:\"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:04.364110 containerd[2022]: time="2026-03-07T00:55:04.364034678Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:04.367125 containerd[2022]: time="2026-03-07T00:55:04.366859958Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.13.1\" with image id \"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\", repo tag \"registry.k8s.io/coredns/coredns:v1.13.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\", size \"21168808\" in 1.914871053s"
Mar 7 00:55:04.367125 containerd[2022]: time="2026-03-07T00:55:04.366921422Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\" returns image reference \"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\""
Mar 7 00:55:04.368425 containerd[2022]: time="2026-03-07T00:55:04.367702682Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\""
Mar 7 00:55:04.865322 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2674374522.mount: Deactivated successfully.
Mar 7 00:55:04.879495 containerd[2022]: time="2026-03-07T00:55:04.879418169Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:04.881426 containerd[2022]: time="2026-03-07T00:55:04.881238305Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=268709"
Mar 7 00:55:04.883873 containerd[2022]: time="2026-03-07T00:55:04.883798877Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:04.890068 containerd[2022]: time="2026-03-07T00:55:04.889992845Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:04.892207 containerd[2022]: time="2026-03-07T00:55:04.891874001Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 524.109879ms"
Mar 7 00:55:04.892207 containerd[2022]: time="2026-03-07T00:55:04.891925973Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\""
Mar 7 00:55:04.893293 containerd[2022]: time="2026-03-07T00:55:04.893230121Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\""
Mar 7 00:55:05.493148 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2414894848.mount: Deactivated successfully.
Mar 7 00:55:06.354810 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 7 00:55:06.368834 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 00:55:06.766917 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 00:55:06.770864 (kubelet)[2717]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 7 00:55:06.863424 kubelet[2717]: E0307 00:55:06.861650 2717 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 7 00:55:06.871098 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 7 00:55:06.871490 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 7 00:55:07.123361 containerd[2022]: time="2026-03-07T00:55:07.121424272Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.6-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:55:07.126583 containerd[2022]: time="2026-03-07T00:55:07.126533500Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.6-0: active requests=0, bytes read=21738165" Mar 7 00:55:07.129483 containerd[2022]: time="2026-03-07T00:55:07.129435160Z" level=info msg="ImageCreate event name:\"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:55:07.137660 containerd[2022]: time="2026-03-07T00:55:07.137601124Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:55:07.140134 containerd[2022]: time="2026-03-07T00:55:07.139134064Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.6-0\" with image id \"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\", repo tag \"registry.k8s.io/etcd:3.6.6-0\", repo digest \"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\", size \"21749640\" in 2.245844627s" Mar 7 00:55:07.141111 containerd[2022]: time="2026-03-07T00:55:07.141074044Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\" returns image reference \"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\"" Mar 7 00:55:11.579424 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 00:55:11.591926 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 00:55:11.659951 systemd[1]: Reloading requested from client PID 2755 ('systemctl') (unit session-7.scope)... Mar 7 00:55:11.660153 systemd[1]: Reloading... 
Mar 7 00:55:11.931539 zram_generator::config[2802]: No configuration found. Mar 7 00:55:12.160790 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 7 00:55:12.342578 systemd[1]: Reloading finished in 681 ms. Mar 7 00:55:12.436817 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Mar 7 00:55:12.437038 systemd[1]: kubelet.service: Failed with result 'signal'. Mar 7 00:55:12.438682 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 00:55:12.452602 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 00:55:12.592597 systemd[1]: systemd-hostnamed.service: Deactivated successfully. Mar 7 00:55:12.808684 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 00:55:12.825922 (kubelet)[2862]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 7 00:55:12.902153 kubelet[2862]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 7 00:55:13.590648 kubelet[2862]: I0307 00:55:13.590530 2862 server.go:525] "Kubelet version" kubeletVersion="v1.35.1" Mar 7 00:55:13.590648 kubelet[2862]: I0307 00:55:13.590627 2862 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 7 00:55:13.590888 kubelet[2862]: I0307 00:55:13.590704 2862 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 7 00:55:13.590888 kubelet[2862]: I0307 00:55:13.590736 2862 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 7 00:55:13.591371 kubelet[2862]: I0307 00:55:13.591323 2862 server.go:951] "Client rotation is on, will bootstrap in background" Mar 7 00:55:13.604893 kubelet[2862]: E0307 00:55:13.604818 2862 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://172.31.23.166:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 172.31.23.166:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Mar 7 00:55:13.607291 kubelet[2862]: I0307 00:55:13.607230 2862 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 7 00:55:13.616328 kubelet[2862]: E0307 00:55:13.616248 2862 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Mar 7 00:55:13.616541 kubelet[2862]: I0307 00:55:13.616373 2862 server.go:1395] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config." Mar 7 00:55:13.622019 kubelet[2862]: I0307 00:55:13.621970 2862 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Mar 7 00:55:13.622576 kubelet[2862]: I0307 00:55:13.622529 2862 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 7 00:55:13.622885 kubelet[2862]: I0307 00:55:13.622577 2862 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-23-166","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 7 00:55:13.623070 kubelet[2862]: I0307 00:55:13.622894 2862 topology_manager.go:143] "Creating topology manager with none policy" Mar 7 
00:55:13.623070 kubelet[2862]: I0307 00:55:13.622912 2862 container_manager_linux.go:308] "Creating device plugin manager" Mar 7 00:55:13.623070 kubelet[2862]: I0307 00:55:13.623066 2862 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager" Mar 7 00:55:13.627993 kubelet[2862]: I0307 00:55:13.627936 2862 state_mem.go:41] "Initialized" logger="CPUManager state memory" Mar 7 00:55:13.628429 kubelet[2862]: I0307 00:55:13.628376 2862 kubelet.go:482] "Attempting to sync node with API server" Mar 7 00:55:13.628538 kubelet[2862]: I0307 00:55:13.628445 2862 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 7 00:55:13.628538 kubelet[2862]: I0307 00:55:13.628478 2862 kubelet.go:394] "Adding apiserver pod source" Mar 7 00:55:13.628538 kubelet[2862]: I0307 00:55:13.628497 2862 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 7 00:55:13.636141 kubelet[2862]: I0307 00:55:13.636080 2862 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Mar 7 00:55:13.638542 kubelet[2862]: I0307 00:55:13.638459 2862 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 7 00:55:13.638542 kubelet[2862]: I0307 00:55:13.638532 2862 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Mar 7 00:55:13.638744 kubelet[2862]: W0307 00:55:13.638631 2862 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Mar 7 00:55:13.649231 kubelet[2862]: I0307 00:55:13.649195 2862 server.go:1257] "Started kubelet" Mar 7 00:55:13.649668 kubelet[2862]: I0307 00:55:13.649608 2862 server.go:182] "Starting to listen" address="0.0.0.0" port=10250 Mar 7 00:55:13.651318 kubelet[2862]: I0307 00:55:13.651276 2862 server.go:317] "Adding debug handlers to kubelet server" Mar 7 00:55:13.657111 kubelet[2862]: I0307 00:55:13.657048 2862 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 7 00:55:13.657363 kubelet[2862]: I0307 00:55:13.657318 2862 server_v1.go:49] "podresources" method="list" useActivePods=true Mar 7 00:55:13.657977 kubelet[2862]: I0307 00:55:13.657930 2862 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 7 00:55:13.660632 kubelet[2862]: E0307 00:55:13.658444 2862 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.31.23.166:6443/api/v1/namespaces/default/events\": dial tcp 172.31.23.166:6443: connect: connection refused" event="&Event{ObjectMeta:{ip-172-31-23-166.189a690e57ea6c7d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-172-31-23-166,UID:,APIVersion:v1,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-172-31-23-166,},FirstTimestamp:2026-03-07 00:55:13.649151101 +0000 UTC m=+0.817255961,LastTimestamp:2026-03-07 00:55:13.649151101 +0000 UTC m=+0.817255961,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-172-31-23-166,}" Mar 7 00:55:13.663030 kubelet[2862]: I0307 00:55:13.662985 2862 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer" Mar 7 00:55:13.663490 kubelet[2862]: I0307 00:55:13.663447 2862 dynamic_serving_content.go:135] "Starting controller" 
name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 7 00:55:13.668761 kubelet[2862]: E0307 00:55:13.668705 2862 kubelet.go:1656] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 7 00:55:13.669364 kubelet[2862]: I0307 00:55:13.669336 2862 volume_manager.go:311] "Starting Kubelet Volume Manager" Mar 7 00:55:13.669803 kubelet[2862]: E0307 00:55:13.669742 2862 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ip-172-31-23-166\" not found" Mar 7 00:55:13.669991 kubelet[2862]: E0307 00:55:13.669933 2862 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.23.166:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-23-166?timeout=10s\": dial tcp 172.31.23.166:6443: connect: connection refused" interval="200ms" Mar 7 00:55:13.670362 kubelet[2862]: I0307 00:55:13.670253 2862 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 7 00:55:13.671065 kubelet[2862]: I0307 00:55:13.670999 2862 reconciler.go:29] "Reconciler: start to sync state" Mar 7 00:55:13.672438 kubelet[2862]: I0307 00:55:13.671766 2862 factory.go:223] Registration of the systemd container factory successfully Mar 7 00:55:13.672438 kubelet[2862]: I0307 00:55:13.671939 2862 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 7 00:55:13.673913 kubelet[2862]: I0307 00:55:13.673866 2862 factory.go:223] Registration of the containerd container factory successfully Mar 7 00:55:13.712185 kubelet[2862]: I0307 00:55:13.711786 2862 cpu_manager.go:225] "Starting" policy="none" Mar 7 00:55:13.712185 kubelet[2862]: I0307 00:55:13.711813 2862 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 7 00:55:13.712185 
kubelet[2862]: I0307 00:55:13.711844 2862 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory" Mar 7 00:55:13.715981 kubelet[2862]: I0307 00:55:13.715932 2862 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Mar 7 00:55:13.719343 kubelet[2862]: I0307 00:55:13.719291 2862 policy_none.go:50] "Start" Mar 7 00:55:13.719343 kubelet[2862]: I0307 00:55:13.719330 2862 memory_manager.go:187] "Starting memorymanager" policy="None" Mar 7 00:55:13.719699 kubelet[2862]: I0307 00:55:13.719355 2862 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Mar 7 00:55:13.720878 kubelet[2862]: I0307 00:55:13.720282 2862 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Mar 7 00:55:13.720878 kubelet[2862]: I0307 00:55:13.720320 2862 status_manager.go:249] "Starting to sync pod status with apiserver" Mar 7 00:55:13.720878 kubelet[2862]: I0307 00:55:13.720356 2862 kubelet.go:2501] "Starting kubelet main sync loop" Mar 7 00:55:13.722254 kubelet[2862]: E0307 00:55:13.722127 2862 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 7 00:55:13.724303 kubelet[2862]: I0307 00:55:13.724272 2862 policy_none.go:44] "Start" Mar 7 00:55:13.737825 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 7 00:55:13.752005 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 7 00:55:13.758517 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Mar 7 00:55:13.770017 kubelet[2862]: E0307 00:55:13.769955 2862 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 7 00:55:13.770324 kubelet[2862]: I0307 00:55:13.770276 2862 eviction_manager.go:194] "Eviction manager: starting control loop" Mar 7 00:55:13.770428 kubelet[2862]: I0307 00:55:13.770312 2862 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 7 00:55:13.771289 kubelet[2862]: I0307 00:55:13.771237 2862 plugin_manager.go:121] "Starting Kubelet Plugin Manager" Mar 7 00:55:13.771758 kubelet[2862]: E0307 00:55:13.771579 2862 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ip-172-31-23-166\" not found" Mar 7 00:55:13.775177 kubelet[2862]: E0307 00:55:13.775104 2862 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 7 00:55:13.775356 kubelet[2862]: E0307 00:55:13.775322 2862 eviction_manager.go:297] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-172-31-23-166\" not found" Mar 7 00:55:13.843899 systemd[1]: Created slice kubepods-burstable-pod0fe58eb067703b4a76989df7dbb8bb53.slice - libcontainer container kubepods-burstable-pod0fe58eb067703b4a76989df7dbb8bb53.slice. 
Mar 7 00:55:13.865430 kubelet[2862]: E0307 00:55:13.864171 2862 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-23-166\" not found" node="ip-172-31-23-166" Mar 7 00:55:13.870571 kubelet[2862]: E0307 00:55:13.870504 2862 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.23.166:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-23-166?timeout=10s\": dial tcp 172.31.23.166:6443: connect: connection refused" interval="400ms" Mar 7 00:55:13.871504 systemd[1]: Created slice kubepods-burstable-podbdaecd9f1da9889a01cd682cb5db02fc.slice - libcontainer container kubepods-burstable-podbdaecd9f1da9889a01cd682cb5db02fc.slice. Mar 7 00:55:13.874803 kubelet[2862]: I0307 00:55:13.872547 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/bdaecd9f1da9889a01cd682cb5db02fc-ca-certs\") pod \"kube-controller-manager-ip-172-31-23-166\" (UID: \"bdaecd9f1da9889a01cd682cb5db02fc\") " pod="kube-system/kube-controller-manager-ip-172-31-23-166" Mar 7 00:55:13.874803 kubelet[2862]: I0307 00:55:13.872634 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/bdaecd9f1da9889a01cd682cb5db02fc-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-23-166\" (UID: \"bdaecd9f1da9889a01cd682cb5db02fc\") " pod="kube-system/kube-controller-manager-ip-172-31-23-166" Mar 7 00:55:13.874803 kubelet[2862]: I0307 00:55:13.872741 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/bdaecd9f1da9889a01cd682cb5db02fc-k8s-certs\") pod \"kube-controller-manager-ip-172-31-23-166\" (UID: \"bdaecd9f1da9889a01cd682cb5db02fc\") " 
pod="kube-system/kube-controller-manager-ip-172-31-23-166" Mar 7 00:55:13.874803 kubelet[2862]: I0307 00:55:13.872809 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0fe58eb067703b4a76989df7dbb8bb53-ca-certs\") pod \"kube-apiserver-ip-172-31-23-166\" (UID: \"0fe58eb067703b4a76989df7dbb8bb53\") " pod="kube-system/kube-apiserver-ip-172-31-23-166" Mar 7 00:55:13.874803 kubelet[2862]: I0307 00:55:13.872851 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0fe58eb067703b4a76989df7dbb8bb53-k8s-certs\") pod \"kube-apiserver-ip-172-31-23-166\" (UID: \"0fe58eb067703b4a76989df7dbb8bb53\") " pod="kube-system/kube-apiserver-ip-172-31-23-166" Mar 7 00:55:13.875188 kubelet[2862]: I0307 00:55:13.872922 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0fe58eb067703b4a76989df7dbb8bb53-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-23-166\" (UID: \"0fe58eb067703b4a76989df7dbb8bb53\") " pod="kube-system/kube-apiserver-ip-172-31-23-166" Mar 7 00:55:13.875188 kubelet[2862]: I0307 00:55:13.872986 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/bdaecd9f1da9889a01cd682cb5db02fc-kubeconfig\") pod \"kube-controller-manager-ip-172-31-23-166\" (UID: \"bdaecd9f1da9889a01cd682cb5db02fc\") " pod="kube-system/kube-controller-manager-ip-172-31-23-166" Mar 7 00:55:13.875188 kubelet[2862]: I0307 00:55:13.873059 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/bdaecd9f1da9889a01cd682cb5db02fc-usr-share-ca-certificates\") pod 
\"kube-controller-manager-ip-172-31-23-166\" (UID: \"bdaecd9f1da9889a01cd682cb5db02fc\") " pod="kube-system/kube-controller-manager-ip-172-31-23-166" Mar 7 00:55:13.875188 kubelet[2862]: I0307 00:55:13.873102 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d4985758ef145dc95dec95e0dc1196ae-kubeconfig\") pod \"kube-scheduler-ip-172-31-23-166\" (UID: \"d4985758ef145dc95dec95e0dc1196ae\") " pod="kube-system/kube-scheduler-ip-172-31-23-166" Mar 7 00:55:13.878903 kubelet[2862]: I0307 00:55:13.878863 2862 kubelet_node_status.go:74] "Attempting to register node" node="ip-172-31-23-166" Mar 7 00:55:13.881309 kubelet[2862]: E0307 00:55:13.881257 2862 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://172.31.23.166:6443/api/v1/nodes\": dial tcp 172.31.23.166:6443: connect: connection refused" node="ip-172-31-23-166" Mar 7 00:55:13.885173 kubelet[2862]: E0307 00:55:13.884621 2862 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-23-166\" not found" node="ip-172-31-23-166" Mar 7 00:55:13.893329 systemd[1]: Created slice kubepods-burstable-podd4985758ef145dc95dec95e0dc1196ae.slice - libcontainer container kubepods-burstable-podd4985758ef145dc95dec95e0dc1196ae.slice. 
Mar 7 00:55:13.897629 kubelet[2862]: E0307 00:55:13.897570 2862 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-23-166\" not found" node="ip-172-31-23-166" Mar 7 00:55:14.084596 kubelet[2862]: I0307 00:55:14.084535 2862 kubelet_node_status.go:74] "Attempting to register node" node="ip-172-31-23-166" Mar 7 00:55:14.085466 kubelet[2862]: E0307 00:55:14.085418 2862 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://172.31.23.166:6443/api/v1/nodes\": dial tcp 172.31.23.166:6443: connect: connection refused" node="ip-172-31-23-166" Mar 7 00:55:14.171188 containerd[2022]: time="2026-03-07T00:55:14.171117491Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-23-166,Uid:0fe58eb067703b4a76989df7dbb8bb53,Namespace:kube-system,Attempt:0,}" Mar 7 00:55:14.192432 containerd[2022]: time="2026-03-07T00:55:14.192288719Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-23-166,Uid:bdaecd9f1da9889a01cd682cb5db02fc,Namespace:kube-system,Attempt:0,}" Mar 7 00:55:14.204224 containerd[2022]: time="2026-03-07T00:55:14.203041547Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-23-166,Uid:d4985758ef145dc95dec95e0dc1196ae,Namespace:kube-system,Attempt:0,}" Mar 7 00:55:14.271880 kubelet[2862]: E0307 00:55:14.271821 2862 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.23.166:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-23-166?timeout=10s\": dial tcp 172.31.23.166:6443: connect: connection refused" interval="800ms" Mar 7 00:55:14.487595 kubelet[2862]: I0307 00:55:14.487449 2862 kubelet_node_status.go:74] "Attempting to register node" node="ip-172-31-23-166" Mar 7 00:55:14.488059 kubelet[2862]: E0307 00:55:14.488005 2862 kubelet_node_status.go:106] "Unable to register node with API server" err="Post 
\"https://172.31.23.166:6443/api/v1/nodes\": dial tcp 172.31.23.166:6443: connect: connection refused" node="ip-172-31-23-166" Mar 7 00:55:14.715261 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount499068976.mount: Deactivated successfully. Mar 7 00:55:14.730486 containerd[2022]: time="2026-03-07T00:55:14.729625190Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 00:55:14.735753 containerd[2022]: time="2026-03-07T00:55:14.735599066Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269173" Mar 7 00:55:14.739718 containerd[2022]: time="2026-03-07T00:55:14.737776070Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 00:55:14.740025 containerd[2022]: time="2026-03-07T00:55:14.739979078Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 00:55:14.742224 containerd[2022]: time="2026-03-07T00:55:14.742178786Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 7 00:55:14.744243 containerd[2022]: time="2026-03-07T00:55:14.744180914Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 00:55:14.746679 containerd[2022]: time="2026-03-07T00:55:14.746596886Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 7 00:55:14.751610 containerd[2022]: time="2026-03-07T00:55:14.751532810Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 00:55:14.754499 containerd[2022]: time="2026-03-07T00:55:14.754440818Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 583.204371ms" Mar 7 00:55:14.764421 containerd[2022]: time="2026-03-07T00:55:14.763076678Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 570.624819ms" Mar 7 00:55:14.765069 containerd[2022]: time="2026-03-07T00:55:14.765000242Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 561.837675ms" Mar 7 00:55:14.944491 containerd[2022]: time="2026-03-07T00:55:14.943888071Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:55:14.944491 containerd[2022]: time="2026-03-07T00:55:14.943988595Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:55:14.944491 containerd[2022]: time="2026-03-07T00:55:14.944043867Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:55:14.944491 containerd[2022]: time="2026-03-07T00:55:14.944240475Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:55:14.952728 containerd[2022]: time="2026-03-07T00:55:14.952550679Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:55:14.953973 containerd[2022]: time="2026-03-07T00:55:14.953589603Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:55:14.953973 containerd[2022]: time="2026-03-07T00:55:14.953637555Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:55:14.953973 containerd[2022]: time="2026-03-07T00:55:14.953809731Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:55:14.957583 containerd[2022]: time="2026-03-07T00:55:14.957434415Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:55:14.957786 containerd[2022]: time="2026-03-07T00:55:14.957538959Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:55:14.957786 containerd[2022]: time="2026-03-07T00:55:14.957593847Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 00:55:14.957938 containerd[2022]: time="2026-03-07T00:55:14.957757599Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 00:55:15.003759 systemd[1]: Started cri-containerd-4ace87b7b33bca3c5f15f9ea6771e20e9cf08dae6441167cab6b2fca08dc8c06.scope - libcontainer container 4ace87b7b33bca3c5f15f9ea6771e20e9cf08dae6441167cab6b2fca08dc8c06.
Mar 7 00:55:15.013755 systemd[1]: Started cri-containerd-7deeea9ce8d5dcf36b3add3e15adc9678a6f30fe2f90e9abb907abc384a60926.scope - libcontainer container 7deeea9ce8d5dcf36b3add3e15adc9678a6f30fe2f90e9abb907abc384a60926.
Mar 7 00:55:15.029752 systemd[1]: Started cri-containerd-4caff063bba94c5215764aec5a981e57c1745f35b937e6042e3375765add9f8e.scope - libcontainer container 4caff063bba94c5215764aec5a981e57c1745f35b937e6042e3375765add9f8e.
Mar 7 00:55:15.073144 kubelet[2862]: E0307 00:55:15.073071 2862 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.23.166:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-23-166?timeout=10s\": dial tcp 172.31.23.166:6443: connect: connection refused" interval="1.6s"
Mar 7 00:55:15.120759 containerd[2022]: time="2026-03-07T00:55:15.120497136Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-23-166,Uid:0fe58eb067703b4a76989df7dbb8bb53,Namespace:kube-system,Attempt:0,} returns sandbox id \"4ace87b7b33bca3c5f15f9ea6771e20e9cf08dae6441167cab6b2fca08dc8c06\""
Mar 7 00:55:15.142191 containerd[2022]: time="2026-03-07T00:55:15.141998280Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-23-166,Uid:d4985758ef145dc95dec95e0dc1196ae,Namespace:kube-system,Attempt:0,} returns sandbox id \"7deeea9ce8d5dcf36b3add3e15adc9678a6f30fe2f90e9abb907abc384a60926\""
Mar 7 00:55:15.142191 containerd[2022]: time="2026-03-07T00:55:15.142138596Z" level=info msg="CreateContainer within sandbox \"4ace87b7b33bca3c5f15f9ea6771e20e9cf08dae6441167cab6b2fca08dc8c06\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Mar 7 00:55:15.155341 containerd[2022]: time="2026-03-07T00:55:15.155122716Z" level=info msg="CreateContainer within sandbox \"7deeea9ce8d5dcf36b3add3e15adc9678a6f30fe2f90e9abb907abc384a60926\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Mar 7 00:55:15.168233 containerd[2022]: time="2026-03-07T00:55:15.168005916Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-23-166,Uid:bdaecd9f1da9889a01cd682cb5db02fc,Namespace:kube-system,Attempt:0,} returns sandbox id \"4caff063bba94c5215764aec5a981e57c1745f35b937e6042e3375765add9f8e\""
Mar 7 00:55:15.180429 containerd[2022]: time="2026-03-07T00:55:15.180346596Z" level=info msg="CreateContainer within sandbox \"4caff063bba94c5215764aec5a981e57c1745f35b937e6042e3375765add9f8e\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Mar 7 00:55:15.192727 containerd[2022]: time="2026-03-07T00:55:15.192656484Z" level=info msg="CreateContainer within sandbox \"7deeea9ce8d5dcf36b3add3e15adc9678a6f30fe2f90e9abb907abc384a60926\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"4746ee6df25e6d06cf8c1e2a3a3dc29613f0649ed5d1a87132643c9474088575\""
Mar 7 00:55:15.194049 containerd[2022]: time="2026-03-07T00:55:15.193763688Z" level=info msg="StartContainer for \"4746ee6df25e6d06cf8c1e2a3a3dc29613f0649ed5d1a87132643c9474088575\""
Mar 7 00:55:15.196600 containerd[2022]: time="2026-03-07T00:55:15.196545204Z" level=info msg="CreateContainer within sandbox \"4ace87b7b33bca3c5f15f9ea6771e20e9cf08dae6441167cab6b2fca08dc8c06\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"409fed62a4d33fe615db4a479d3481c4c8ef74ad7f068718c9a94bceaae0ee9e\""
Mar 7 00:55:15.199899 containerd[2022]: time="2026-03-07T00:55:15.199804908Z" level=info msg="StartContainer for \"409fed62a4d33fe615db4a479d3481c4c8ef74ad7f068718c9a94bceaae0ee9e\""
Mar 7 00:55:15.234610 containerd[2022]: time="2026-03-07T00:55:15.234418536Z" level=info msg="CreateContainer within sandbox \"4caff063bba94c5215764aec5a981e57c1745f35b937e6042e3375765add9f8e\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"d3201dac517dfed41afccf5253455bfd4bbb94a0f1726216bd50b78f2f6949be\""
Mar 7 00:55:15.236413 containerd[2022]: time="2026-03-07T00:55:15.235920768Z" level=info msg="StartContainer for \"d3201dac517dfed41afccf5253455bfd4bbb94a0f1726216bd50b78f2f6949be\""
Mar 7 00:55:15.264986 systemd[1]: Started cri-containerd-4746ee6df25e6d06cf8c1e2a3a3dc29613f0649ed5d1a87132643c9474088575.scope - libcontainer container 4746ee6df25e6d06cf8c1e2a3a3dc29613f0649ed5d1a87132643c9474088575.
Mar 7 00:55:15.278265 systemd[1]: Started cri-containerd-409fed62a4d33fe615db4a479d3481c4c8ef74ad7f068718c9a94bceaae0ee9e.scope - libcontainer container 409fed62a4d33fe615db4a479d3481c4c8ef74ad7f068718c9a94bceaae0ee9e.
Mar 7 00:55:15.292166 kubelet[2862]: I0307 00:55:15.291731 2862 kubelet_node_status.go:74] "Attempting to register node" node="ip-172-31-23-166"
Mar 7 00:55:15.293185 kubelet[2862]: E0307 00:55:15.292874 2862 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://172.31.23.166:6443/api/v1/nodes\": dial tcp 172.31.23.166:6443: connect: connection refused" node="ip-172-31-23-166"
Mar 7 00:55:15.327692 systemd[1]: Started cri-containerd-d3201dac517dfed41afccf5253455bfd4bbb94a0f1726216bd50b78f2f6949be.scope - libcontainer container d3201dac517dfed41afccf5253455bfd4bbb94a0f1726216bd50b78f2f6949be.
Mar 7 00:55:15.377933 containerd[2022]: time="2026-03-07T00:55:15.377703793Z" level=info msg="StartContainer for \"4746ee6df25e6d06cf8c1e2a3a3dc29613f0649ed5d1a87132643c9474088575\" returns successfully"
Mar 7 00:55:15.436198 containerd[2022]: time="2026-03-07T00:55:15.435887749Z" level=info msg="StartContainer for \"409fed62a4d33fe615db4a479d3481c4c8ef74ad7f068718c9a94bceaae0ee9e\" returns successfully"
Mar 7 00:55:15.496646 containerd[2022]: time="2026-03-07T00:55:15.496533266Z" level=info msg="StartContainer for \"d3201dac517dfed41afccf5253455bfd4bbb94a0f1726216bd50b78f2f6949be\" returns successfully"
Mar 7 00:55:15.739236 kubelet[2862]: E0307 00:55:15.739180 2862 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-23-166\" not found" node="ip-172-31-23-166"
Mar 7 00:55:15.744064 kubelet[2862]: E0307 00:55:15.743791 2862 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-23-166\" not found" node="ip-172-31-23-166"
Mar 7 00:55:15.750295 kubelet[2862]: E0307 00:55:15.749927 2862 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-23-166\" not found" node="ip-172-31-23-166"
Mar 7 00:55:16.753058 kubelet[2862]: E0307 00:55:16.753001 2862 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-23-166\" not found" node="ip-172-31-23-166"
Mar 7 00:55:16.753727 kubelet[2862]: E0307 00:55:16.753679 2862 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-23-166\" not found" node="ip-172-31-23-166"
Mar 7 00:55:16.896726 kubelet[2862]: I0307 00:55:16.896668 2862 kubelet_node_status.go:74] "Attempting to register node" node="ip-172-31-23-166"
Mar 7 00:55:18.464772 kubelet[2862]: E0307 00:55:18.464699 2862 nodelease.go:50] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-172-31-23-166\" not found" node="ip-172-31-23-166"
Mar 7 00:55:18.580541 kubelet[2862]: I0307 00:55:18.580482 2862 kubelet_node_status.go:77] "Successfully registered node" node="ip-172-31-23-166"
Mar 7 00:55:18.636354 kubelet[2862]: I0307 00:55:18.636003 2862 apiserver.go:52] "Watching apiserver"
Mar 7 00:55:18.670839 kubelet[2862]: I0307 00:55:18.670714 2862 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 7 00:55:18.670839 kubelet[2862]: I0307 00:55:18.670804 2862 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ip-172-31-23-166"
Mar 7 00:55:18.684516 kubelet[2862]: E0307 00:55:18.684152 2862 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ip-172-31-23-166\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ip-172-31-23-166"
Mar 7 00:55:18.684516 kubelet[2862]: I0307 00:55:18.684197 2862 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ip-172-31-23-166"
Mar 7 00:55:18.688938 kubelet[2862]: E0307 00:55:18.688788 2862 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-ip-172-31-23-166\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ip-172-31-23-166"
Mar 7 00:55:18.688938 kubelet[2862]: I0307 00:55:18.688832 2862 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-23-166"
Mar 7 00:55:18.692284 kubelet[2862]: E0307 00:55:18.692235 2862 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ip-172-31-23-166\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ip-172-31-23-166"
Mar 7 00:55:18.926815 kubelet[2862]: I0307 00:55:18.926427 2862 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ip-172-31-23-166"
Mar 7 00:55:18.932262 kubelet[2862]: E0307 00:55:18.932209 2862 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-ip-172-31-23-166\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ip-172-31-23-166"
Mar 7 00:55:20.145457 kubelet[2862]: I0307 00:55:20.144273 2862 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-23-166"
Mar 7 00:55:20.817778 systemd[1]: Reloading requested from client PID 3152 ('systemctl') (unit session-7.scope)...
Mar 7 00:55:20.817811 systemd[1]: Reloading...
Mar 7 00:55:21.008548 zram_generator::config[3195]: No configuration found.
Mar 7 00:55:21.313071 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 7 00:55:21.527255 systemd[1]: Reloading finished in 708 ms.
Mar 7 00:55:21.613268 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 00:55:21.627833 systemd[1]: kubelet.service: Deactivated successfully.
Mar 7 00:55:21.628580 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 00:55:21.628832 systemd[1]: kubelet.service: Consumed 1.536s CPU time, 123.1M memory peak, 0B memory swap peak.
Mar 7 00:55:21.639000 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 00:55:21.990753 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 00:55:21.999125 (kubelet)[3252]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 7 00:55:22.106902 kubelet[3252]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 7 00:55:22.121980 kubelet[3252]: I0307 00:55:22.121878 3252 server.go:525] "Kubelet version" kubeletVersion="v1.35.1"
Mar 7 00:55:22.121980 kubelet[3252]: I0307 00:55:22.121954 3252 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 7 00:55:22.122180 kubelet[3252]: I0307 00:55:22.122000 3252 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Mar 7 00:55:22.122180 kubelet[3252]: I0307 00:55:22.122013 3252 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 7 00:55:22.123489 kubelet[3252]: I0307 00:55:22.122928 3252 server.go:951] "Client rotation is on, will bootstrap in background"
Mar 7 00:55:22.131815 kubelet[3252]: I0307 00:55:22.128360 3252 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Mar 7 00:55:22.139636 kubelet[3252]: I0307 00:55:22.139576 3252 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 7 00:55:22.149343 kubelet[3252]: E0307 00:55:22.149279 3252 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Mar 7 00:55:22.149556 kubelet[3252]: I0307 00:55:22.149461 3252 server.go:1395] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config."
Mar 7 00:55:22.155669 kubelet[3252]: I0307 00:55:22.155603 3252 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Mar 7 00:55:22.156345 kubelet[3252]: I0307 00:55:22.156289 3252 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 7 00:55:22.157064 kubelet[3252]: I0307 00:55:22.156615 3252 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-23-166","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 7 00:55:22.157378 kubelet[3252]: I0307 00:55:22.157350 3252 topology_manager.go:143] "Creating topology manager with none policy"
Mar 7 00:55:22.157525 kubelet[3252]: I0307 00:55:22.157506 3252 container_manager_linux.go:308] "Creating device plugin manager"
Mar 7 00:55:22.157748 kubelet[3252]: I0307 00:55:22.157713 3252 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager"
Mar 7 00:55:22.158283 kubelet[3252]: I0307 00:55:22.158259 3252 state_mem.go:41] "Initialized" logger="CPUManager state memory"
Mar 7 00:55:22.158931 kubelet[3252]: I0307 00:55:22.158751 3252 kubelet.go:482] "Attempting to sync node with API server"
Mar 7 00:55:22.158931 kubelet[3252]: I0307 00:55:22.158794 3252 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 7 00:55:22.158931 kubelet[3252]: I0307 00:55:22.158827 3252 kubelet.go:394] "Adding apiserver pod source"
Mar 7 00:55:22.158931 kubelet[3252]: I0307 00:55:22.158846 3252 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 7 00:55:22.171421 kubelet[3252]: I0307 00:55:22.168750 3252 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Mar 7 00:55:22.174214 kubelet[3252]: I0307 00:55:22.172284 3252 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 7 00:55:22.174214 kubelet[3252]: I0307 00:55:22.172365 3252 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Mar 7 00:55:22.187428 kubelet[3252]: I0307 00:55:22.186533 3252 server.go:1257] "Started kubelet"
Mar 7 00:55:22.194804 kubelet[3252]: I0307 00:55:22.194757 3252 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer"
Mar 7 00:55:22.235467 kubelet[3252]: I0307 00:55:22.196188 3252 server.go:182] "Starting to listen" address="0.0.0.0" port=10250
Mar 7 00:55:22.245562 kubelet[3252]: I0307 00:55:22.196292 3252 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 7 00:55:22.245845 kubelet[3252]: I0307 00:55:22.245809 3252 server_v1.go:49] "podresources" method="list" useActivePods=true
Mar 7 00:55:22.246230 kubelet[3252]: I0307 00:55:22.246201 3252 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 7 00:55:22.268424 kubelet[3252]: I0307 00:55:22.235290 3252 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 7 00:55:22.268424 kubelet[3252]: I0307 00:55:22.235231 3252 volume_manager.go:311] "Starting Kubelet Volume Manager"
Mar 7 00:55:22.269109 kubelet[3252]: I0307 00:55:22.208977 3252 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 7 00:55:22.270526 kubelet[3252]: I0307 00:55:22.270472 3252 reconciler.go:29] "Reconciler: start to sync state"
Mar 7 00:55:22.270801 kubelet[3252]: I0307 00:55:22.253464 3252 server.go:317] "Adding debug handlers to kubelet server"
Mar 7 00:55:22.274627 kubelet[3252]: I0307 00:55:22.274560 3252 factory.go:223] Registration of the systemd container factory successfully
Mar 7 00:55:22.274825 kubelet[3252]: I0307 00:55:22.274798 3252 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 7 00:55:22.297346 kubelet[3252]: I0307 00:55:22.295464 3252 factory.go:223] Registration of the containerd container factory successfully
Mar 7 00:55:22.314426 kubelet[3252]: E0307 00:55:22.313339 3252 kubelet.go:1656] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 7 00:55:22.329292 kubelet[3252]: I0307 00:55:22.329211 3252 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Mar 7 00:55:22.340822 kubelet[3252]: I0307 00:55:22.340186 3252 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Mar 7 00:55:22.340822 kubelet[3252]: I0307 00:55:22.340235 3252 status_manager.go:249] "Starting to sync pod status with apiserver"
Mar 7 00:55:22.340822 kubelet[3252]: I0307 00:55:22.340270 3252 kubelet.go:2501] "Starting kubelet main sync loop"
Mar 7 00:55:22.340822 kubelet[3252]: E0307 00:55:22.340332 3252 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 7 00:55:22.440264 kubelet[3252]: I0307 00:55:22.440202 3252 cpu_manager.go:225] "Starting" policy="none"
Mar 7 00:55:22.440264 kubelet[3252]: I0307 00:55:22.440237 3252 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 7 00:55:22.440705 kubelet[3252]: I0307 00:55:22.440276 3252 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory"
Mar 7 00:55:22.440705 kubelet[3252]: I0307 00:55:22.440541 3252 state_mem.go:94] "Updated default CPUSet" logger="CPUManager state checkpoint.CPUManager state memory" cpuSet=""
Mar 7 00:55:22.440705 kubelet[3252]: I0307 00:55:22.440565 3252 state_mem.go:102] "Updated CPUSet assignments" logger="CPUManager state checkpoint.CPUManager state memory" assignments={}
Mar 7 00:55:22.440705 kubelet[3252]: I0307 00:55:22.440598 3252 policy_none.go:50] "Start"
Mar 7 00:55:22.440705 kubelet[3252]: I0307 00:55:22.440614 3252 memory_manager.go:187] "Starting memorymanager" policy="None"
Mar 7 00:55:22.440705 kubelet[3252]: I0307 00:55:22.440635 3252 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Mar 7 00:55:22.441705 kubelet[3252]: I0307 00:55:22.440815 3252 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint"
Mar 7 00:55:22.441705 kubelet[3252]: I0307 00:55:22.440842 3252 policy_none.go:44] "Start"
Mar 7 00:55:22.441705 kubelet[3252]: E0307 00:55:22.441523 3252 kubelet.go:2525] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 7 00:55:22.457513 kubelet[3252]: E0307 00:55:22.457223 3252 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 7 00:55:22.458083 kubelet[3252]: I0307 00:55:22.457564 3252 eviction_manager.go:194] "Eviction manager: starting control loop"
Mar 7 00:55:22.458083 kubelet[3252]: I0307 00:55:22.457584 3252 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 7 00:55:22.462308 kubelet[3252]: I0307 00:55:22.462259 3252 plugin_manager.go:121] "Starting Kubelet Plugin Manager"
Mar 7 00:55:22.465137 kubelet[3252]: E0307 00:55:22.465068 3252 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 7 00:55:22.590926 kubelet[3252]: I0307 00:55:22.590781 3252 kubelet_node_status.go:74] "Attempting to register node" node="ip-172-31-23-166"
Mar 7 00:55:22.609771 kubelet[3252]: I0307 00:55:22.609717 3252 kubelet_node_status.go:123] "Node was previously registered" node="ip-172-31-23-166"
Mar 7 00:55:22.610042 kubelet[3252]: I0307 00:55:22.609838 3252 kubelet_node_status.go:77] "Successfully registered node" node="ip-172-31-23-166"
Mar 7 00:55:22.643602 kubelet[3252]: I0307 00:55:22.643536 3252 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ip-172-31-23-166"
Mar 7 00:55:22.644605 kubelet[3252]: I0307 00:55:22.644235 3252 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-23-166"
Mar 7 00:55:22.646999 kubelet[3252]: I0307 00:55:22.646890 3252 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ip-172-31-23-166"
Mar 7 00:55:22.671791 kubelet[3252]: E0307 00:55:22.671693 3252 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ip-172-31-23-166\" already exists" pod="kube-system/kube-apiserver-ip-172-31-23-166"
Mar 7 00:55:22.673272 kubelet[3252]: I0307 00:55:22.673163 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0fe58eb067703b4a76989df7dbb8bb53-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-23-166\" (UID: \"0fe58eb067703b4a76989df7dbb8bb53\") " pod="kube-system/kube-apiserver-ip-172-31-23-166"
Mar 7 00:55:22.673272 kubelet[3252]: I0307 00:55:22.673248 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/bdaecd9f1da9889a01cd682cb5db02fc-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-23-166\" (UID: \"bdaecd9f1da9889a01cd682cb5db02fc\") " pod="kube-system/kube-controller-manager-ip-172-31-23-166"
Mar 7 00:55:22.673541 kubelet[3252]: I0307 00:55:22.673294 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/bdaecd9f1da9889a01cd682cb5db02fc-k8s-certs\") pod \"kube-controller-manager-ip-172-31-23-166\" (UID: \"bdaecd9f1da9889a01cd682cb5db02fc\") " pod="kube-system/kube-controller-manager-ip-172-31-23-166"
Mar 7 00:55:22.673541 kubelet[3252]: I0307 00:55:22.673333 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/bdaecd9f1da9889a01cd682cb5db02fc-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-23-166\" (UID: \"bdaecd9f1da9889a01cd682cb5db02fc\") " pod="kube-system/kube-controller-manager-ip-172-31-23-166"
Mar 7 00:55:22.673541 kubelet[3252]: I0307 00:55:22.673373 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d4985758ef145dc95dec95e0dc1196ae-kubeconfig\") pod \"kube-scheduler-ip-172-31-23-166\" (UID: \"d4985758ef145dc95dec95e0dc1196ae\") " pod="kube-system/kube-scheduler-ip-172-31-23-166"
Mar 7 00:55:22.673541 kubelet[3252]: I0307 00:55:22.673481 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0fe58eb067703b4a76989df7dbb8bb53-ca-certs\") pod \"kube-apiserver-ip-172-31-23-166\" (UID: \"0fe58eb067703b4a76989df7dbb8bb53\") " pod="kube-system/kube-apiserver-ip-172-31-23-166"
Mar 7 00:55:22.673762 kubelet[3252]: I0307 00:55:22.673541 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0fe58eb067703b4a76989df7dbb8bb53-k8s-certs\") pod \"kube-apiserver-ip-172-31-23-166\" (UID: \"0fe58eb067703b4a76989df7dbb8bb53\") " pod="kube-system/kube-apiserver-ip-172-31-23-166"
Mar 7 00:55:22.673762 kubelet[3252]: I0307 00:55:22.673586 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/bdaecd9f1da9889a01cd682cb5db02fc-ca-certs\") pod \"kube-controller-manager-ip-172-31-23-166\" (UID: \"bdaecd9f1da9889a01cd682cb5db02fc\") " pod="kube-system/kube-controller-manager-ip-172-31-23-166"
Mar 7 00:55:22.673762 kubelet[3252]: I0307 00:55:22.673627 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/bdaecd9f1da9889a01cd682cb5db02fc-kubeconfig\") pod \"kube-controller-manager-ip-172-31-23-166\" (UID: \"bdaecd9f1da9889a01cd682cb5db02fc\") " pod="kube-system/kube-controller-manager-ip-172-31-23-166"
Mar 7 00:55:23.164880 kubelet[3252]: I0307 00:55:23.164752 3252 apiserver.go:52] "Watching apiserver"
Mar 7 00:55:23.227873 kubelet[3252]: I0307 00:55:23.227766 3252 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-scheduler-ip-172-31-23-166" podStartSLOduration=1.227722988 podStartE2EDuration="1.227722988s" podCreationTimestamp="2026-03-07 00:55:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:55:23.22645328 +0000 UTC m=+1.219667947" watchObservedRunningTime="2026-03-07 00:55:23.227722988 +0000 UTC m=+1.220937619"
Mar 7 00:55:23.262092 kubelet[3252]: I0307 00:55:23.261987 3252 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-apiserver-ip-172-31-23-166" podStartSLOduration=3.261967628 podStartE2EDuration="3.261967628s" podCreationTimestamp="2026-03-07 00:55:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:55:23.243650204 +0000 UTC m=+1.236864847" watchObservedRunningTime="2026-03-07 00:55:23.261967628 +0000 UTC m=+1.255182271"
Mar 7 00:55:23.268585 kubelet[3252]: I0307 00:55:23.268507 3252 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 7 00:55:23.399075 kubelet[3252]: I0307 00:55:23.398176 3252 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-23-166"
Mar 7 00:55:23.412902 kubelet[3252]: E0307 00:55:23.412123 3252 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ip-172-31-23-166\" already exists" pod="kube-system/kube-apiserver-ip-172-31-23-166"
Mar 7 00:55:23.418138 kubelet[3252]: I0307 00:55:23.417908 3252 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ip-172-31-23-166" podStartSLOduration=1.4178886689999999 podStartE2EDuration="1.417888669s" podCreationTimestamp="2026-03-07 00:55:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:55:23.263015612 +0000 UTC m=+1.256230291" watchObservedRunningTime="2026-03-07 00:55:23.417888669 +0000 UTC m=+1.411103336"
Mar 7 00:55:26.109268 kubelet[3252]: I0307 00:55:26.109006 3252 kuberuntime_manager.go:2062] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Mar 7 00:55:26.112135 containerd[2022]: time="2026-03-07T00:55:26.110273170Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Mar 7 00:55:26.113584 kubelet[3252]: I0307 00:55:26.110643 3252 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Mar 7 00:55:26.735364 update_engine[2005]: I20260307 00:55:26.733516 2005 update_attempter.cc:509] Updating boot flags...
Mar 7 00:55:26.854592 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 36 scanned by (udev-worker) (3320)
Mar 7 00:55:27.209522 kubelet[3252]: I0307 00:55:27.209432 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/294479e9-f041-406a-b782-9d18052c5cb3-kube-proxy\") pod \"kube-proxy-mh6k7\" (UID: \"294479e9-f041-406a-b782-9d18052c5cb3\") " pod="kube-system/kube-proxy-mh6k7"
Mar 7 00:55:27.212092 kubelet[3252]: I0307 00:55:27.209535 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/294479e9-f041-406a-b782-9d18052c5cb3-xtables-lock\") pod \"kube-proxy-mh6k7\" (UID: \"294479e9-f041-406a-b782-9d18052c5cb3\") " pod="kube-system/kube-proxy-mh6k7"
Mar 7 00:55:27.212092 kubelet[3252]: I0307 00:55:27.209588 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/294479e9-f041-406a-b782-9d18052c5cb3-lib-modules\") pod \"kube-proxy-mh6k7\" (UID: \"294479e9-f041-406a-b782-9d18052c5cb3\") " pod="kube-system/kube-proxy-mh6k7"
Mar 7 00:55:27.212092 kubelet[3252]: I0307 00:55:27.209626 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm6sl\" (UniqueName: \"kubernetes.io/projected/294479e9-f041-406a-b782-9d18052c5cb3-kube-api-access-pm6sl\") pod \"kube-proxy-mh6k7\" (UID: \"294479e9-f041-406a-b782-9d18052c5cb3\") " pod="kube-system/kube-proxy-mh6k7"
Mar 7 00:55:27.253899 systemd[1]: Created slice kubepods-besteffort-pod294479e9_f041_406a_b782_9d18052c5cb3.slice - libcontainer container kubepods-besteffort-pod294479e9_f041_406a_b782_9d18052c5cb3.slice.
Mar 7 00:55:27.404435 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 36 scanned by (udev-worker) (3324)
Mar 7 00:55:27.527462 systemd[1]: Created slice kubepods-besteffort-poddea686a5_e68f_4a2a_8c37_d728beae3dc9.slice - libcontainer container kubepods-besteffort-poddea686a5_e68f_4a2a_8c37_d728beae3dc9.slice.
Mar 7 00:55:27.594474 containerd[2022]: time="2026-03-07T00:55:27.593929322Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-mh6k7,Uid:294479e9-f041-406a-b782-9d18052c5cb3,Namespace:kube-system,Attempt:0,}"
Mar 7 00:55:27.615961 kubelet[3252]: I0307 00:55:27.613308 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/dea686a5-e68f-4a2a-8c37-d728beae3dc9-var-lib-calico\") pod \"tigera-operator-6cf4cccc57-wjjbs\" (UID: \"dea686a5-e68f-4a2a-8c37-d728beae3dc9\") " pod="tigera-operator/tigera-operator-6cf4cccc57-wjjbs"
Mar 7 00:55:27.615961 kubelet[3252]: I0307 00:55:27.613464 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsphz\" (UniqueName: \"kubernetes.io/projected/dea686a5-e68f-4a2a-8c37-d728beae3dc9-kube-api-access-jsphz\") pod \"tigera-operator-6cf4cccc57-wjjbs\" (UID: \"dea686a5-e68f-4a2a-8c37-d728beae3dc9\") " pod="tigera-operator/tigera-operator-6cf4cccc57-wjjbs"
Mar 7 00:55:27.709188 containerd[2022]: time="2026-03-07T00:55:27.708111818Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 7 00:55:27.709816 containerd[2022]: time="2026-03-07T00:55:27.709164770Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 7 00:55:27.709816 containerd[2022]: time="2026-03-07T00:55:27.709219070Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 00:55:27.709816 containerd[2022]: time="2026-03-07T00:55:27.709420958Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 00:55:27.837650 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 36 scanned by (udev-worker) (3324)
Mar 7 00:55:27.842046 systemd[1]: Started cri-containerd-520bddabd3a257aabaf867b606cd0ce2ba6690bb67a9c1d3940871d5e2a0d1f6.scope - libcontainer container 520bddabd3a257aabaf867b606cd0ce2ba6690bb67a9c1d3940871d5e2a0d1f6.
Mar 7 00:55:27.845352 containerd[2022]: time="2026-03-07T00:55:27.845296539Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-wjjbs,Uid:dea686a5-e68f-4a2a-8c37-d728beae3dc9,Namespace:tigera-operator,Attempt:0,}"
Mar 7 00:55:27.955668 containerd[2022]: time="2026-03-07T00:55:27.955602400Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-mh6k7,Uid:294479e9-f041-406a-b782-9d18052c5cb3,Namespace:kube-system,Attempt:0,} returns sandbox id \"520bddabd3a257aabaf867b606cd0ce2ba6690bb67a9c1d3940871d5e2a0d1f6\""
Mar 7 00:55:27.975924 containerd[2022]: time="2026-03-07T00:55:27.975862588Z" level=info msg="CreateContainer within sandbox \"520bddabd3a257aabaf867b606cd0ce2ba6690bb67a9c1d3940871d5e2a0d1f6\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Mar 7 00:55:27.997294 containerd[2022]: time="2026-03-07T00:55:27.997083496Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 7 00:55:27.997294 containerd[2022]: time="2026-03-07T00:55:27.997204252Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 7 00:55:27.997294 containerd[2022]: time="2026-03-07T00:55:27.997242604Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 00:55:27.997789 containerd[2022]: time="2026-03-07T00:55:27.997415680Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 00:55:28.042024 containerd[2022]: time="2026-03-07T00:55:28.040586844Z" level=info msg="CreateContainer within sandbox \"520bddabd3a257aabaf867b606cd0ce2ba6690bb67a9c1d3940871d5e2a0d1f6\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"e46d261705aee9f1274b1ba557d1d09298cd2050b45076963941f104e477817c\""
Mar 7 00:55:28.046102 containerd[2022]: time="2026-03-07T00:55:28.043199700Z" level=info msg="StartContainer for \"e46d261705aee9f1274b1ba557d1d09298cd2050b45076963941f104e477817c\""
Mar 7 00:55:28.082418 systemd[1]: Started cri-containerd-e8388d99e56bb84c87e8e9ca2d6a0a97ad3396425723b530d421c5a1bdc14310.scope - libcontainer container e8388d99e56bb84c87e8e9ca2d6a0a97ad3396425723b530d421c5a1bdc14310.
Mar 7 00:55:28.254897 systemd[1]: Started cri-containerd-e46d261705aee9f1274b1ba557d1d09298cd2050b45076963941f104e477817c.scope - libcontainer container e46d261705aee9f1274b1ba557d1d09298cd2050b45076963941f104e477817c.
Mar 7 00:55:28.319100 containerd[2022]: time="2026-03-07T00:55:28.318672397Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-wjjbs,Uid:dea686a5-e68f-4a2a-8c37-d728beae3dc9,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"e8388d99e56bb84c87e8e9ca2d6a0a97ad3396425723b530d421c5a1bdc14310\"" Mar 7 00:55:28.325292 containerd[2022]: time="2026-03-07T00:55:28.324022189Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Mar 7 00:55:28.344550 containerd[2022]: time="2026-03-07T00:55:28.344432294Z" level=info msg="StartContainer for \"e46d261705aee9f1274b1ba557d1d09298cd2050b45076963941f104e477817c\" returns successfully" Mar 7 00:55:29.543445 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2096993092.mount: Deactivated successfully. Mar 7 00:55:30.210942 kubelet[3252]: I0307 00:55:30.210804 3252 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-proxy-mh6k7" podStartSLOduration=3.210778899 podStartE2EDuration="3.210778899s" podCreationTimestamp="2026-03-07 00:55:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:55:28.445287062 +0000 UTC m=+6.438501729" watchObservedRunningTime="2026-03-07 00:55:30.210778899 +0000 UTC m=+8.203993566" Mar 7 00:55:31.109722 containerd[2022]: time="2026-03-07T00:55:31.109634007Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:55:31.111914 containerd[2022]: time="2026-03-07T00:55:31.111502971Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=25071565" Mar 7 00:55:31.114561 containerd[2022]: time="2026-03-07T00:55:31.113979579Z" level=info msg="ImageCreate event name:\"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Mar 7 00:55:31.121104 containerd[2022]: time="2026-03-07T00:55:31.120439863Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:55:31.123401 containerd[2022]: time="2026-03-07T00:55:31.122222955Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"25067560\" in 2.798128742s" Mar 7 00:55:31.123401 containerd[2022]: time="2026-03-07T00:55:31.122291979Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\"" Mar 7 00:55:31.134869 containerd[2022]: time="2026-03-07T00:55:31.134815851Z" level=info msg="CreateContainer within sandbox \"e8388d99e56bb84c87e8e9ca2d6a0a97ad3396425723b530d421c5a1bdc14310\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 7 00:55:31.163117 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4154065865.mount: Deactivated successfully. 
Mar 7 00:55:31.164639 containerd[2022]: time="2026-03-07T00:55:31.164095432Z" level=info msg="CreateContainer within sandbox \"e8388d99e56bb84c87e8e9ca2d6a0a97ad3396425723b530d421c5a1bdc14310\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"802b2347603ade21a8f0ec3200724d906e8e7425ef821d89c12f146781eacaf2\"" Mar 7 00:55:31.166884 containerd[2022]: time="2026-03-07T00:55:31.165529552Z" level=info msg="StartContainer for \"802b2347603ade21a8f0ec3200724d906e8e7425ef821d89c12f146781eacaf2\"" Mar 7 00:55:31.222349 systemd[1]: run-containerd-runc-k8s.io-802b2347603ade21a8f0ec3200724d906e8e7425ef821d89c12f146781eacaf2-runc.pZh7UA.mount: Deactivated successfully. Mar 7 00:55:31.233737 systemd[1]: Started cri-containerd-802b2347603ade21a8f0ec3200724d906e8e7425ef821d89c12f146781eacaf2.scope - libcontainer container 802b2347603ade21a8f0ec3200724d906e8e7425ef821d89c12f146781eacaf2. Mar 7 00:55:31.289413 containerd[2022]: time="2026-03-07T00:55:31.289191688Z" level=info msg="StartContainer for \"802b2347603ade21a8f0ec3200724d906e8e7425ef821d89c12f146781eacaf2\" returns successfully" Mar 7 00:55:31.515949 systemd[1]: Started sshd@7-172.31.23.166:22-180.76.245.131:34064.service - OpenSSH per-connection server daemon (180.76.245.131:34064). Mar 7 00:55:31.546681 sshd[3874]: Connection closed by 180.76.245.131 port 34064 Mar 7 00:55:31.547942 systemd[1]: sshd@7-172.31.23.166:22-180.76.245.131:34064.service: Deactivated successfully. 
Mar 7 00:55:34.487223 kubelet[3252]: I0307 00:55:34.487122 3252 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6cf4cccc57-wjjbs" podStartSLOduration=4.685009506 podStartE2EDuration="7.487103108s" podCreationTimestamp="2026-03-07 00:55:27 +0000 UTC" firstStartedPulling="2026-03-07 00:55:28.322604809 +0000 UTC m=+6.315819440" lastFinishedPulling="2026-03-07 00:55:31.124698399 +0000 UTC m=+9.117913042" observedRunningTime="2026-03-07 00:55:31.451588433 +0000 UTC m=+9.444803088" watchObservedRunningTime="2026-03-07 00:55:34.487103108 +0000 UTC m=+12.480317763" Mar 7 00:55:39.819714 sudo[2344]: pam_unix(sudo:session): session closed for user root Mar 7 00:55:39.900770 sshd[2341]: pam_unix(sshd:session): session closed for user core Mar 7 00:55:39.910059 systemd[1]: sshd@6-172.31.23.166:22-20.161.92.111:55640.service: Deactivated successfully. Mar 7 00:55:39.918175 systemd[1]: session-7.scope: Deactivated successfully. Mar 7 00:55:39.921233 systemd[1]: session-7.scope: Consumed 8.406s CPU time, 148.0M memory peak, 0B memory swap peak. Mar 7 00:55:39.928712 systemd-logind[2004]: Session 7 logged out. Waiting for processes to exit. Mar 7 00:55:39.934926 systemd-logind[2004]: Removed session 7. Mar 7 00:55:50.204566 systemd[1]: Created slice kubepods-besteffort-podd55b9b8d_593f_49da_8710_4be00328664e.slice - libcontainer container kubepods-besteffort-podd55b9b8d_593f_49da_8710_4be00328664e.slice. 
Mar 7 00:55:50.263710 kubelet[3252]: I0307 00:55:50.263640 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/d55b9b8d-593f-49da-8710-4be00328664e-typha-certs\") pod \"calico-typha-77867d5cb5-xgfn5\" (UID: \"d55b9b8d-593f-49da-8710-4be00328664e\") " pod="calico-system/calico-typha-77867d5cb5-xgfn5" Mar 7 00:55:50.265675 kubelet[3252]: I0307 00:55:50.265512 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d55b9b8d-593f-49da-8710-4be00328664e-tigera-ca-bundle\") pod \"calico-typha-77867d5cb5-xgfn5\" (UID: \"d55b9b8d-593f-49da-8710-4be00328664e\") " pod="calico-system/calico-typha-77867d5cb5-xgfn5" Mar 7 00:55:50.265675 kubelet[3252]: I0307 00:55:50.265583 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lqn6\" (UniqueName: \"kubernetes.io/projected/d55b9b8d-593f-49da-8710-4be00328664e-kube-api-access-9lqn6\") pod \"calico-typha-77867d5cb5-xgfn5\" (UID: \"d55b9b8d-593f-49da-8710-4be00328664e\") " pod="calico-system/calico-typha-77867d5cb5-xgfn5" Mar 7 00:55:50.539986 systemd[1]: Created slice kubepods-besteffort-pod7c9835cd_e1c2_42a0_987e_4dae6163ebe2.slice - libcontainer container kubepods-besteffort-pod7c9835cd_e1c2_42a0_987e_4dae6163ebe2.slice. 
Mar 7 00:55:50.554769 containerd[2022]: time="2026-03-07T00:55:50.554101032Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-77867d5cb5-xgfn5,Uid:d55b9b8d-593f-49da-8710-4be00328664e,Namespace:calico-system,Attempt:0,}" Mar 7 00:55:50.568743 kubelet[3252]: I0307 00:55:50.568416 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/7c9835cd-e1c2-42a0-987e-4dae6163ebe2-bpffs\") pod \"calico-node-kv87x\" (UID: \"7c9835cd-e1c2-42a0-987e-4dae6163ebe2\") " pod="calico-system/calico-node-kv87x" Mar 7 00:55:50.568743 kubelet[3252]: I0307 00:55:50.568499 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/7c9835cd-e1c2-42a0-987e-4dae6163ebe2-cni-net-dir\") pod \"calico-node-kv87x\" (UID: \"7c9835cd-e1c2-42a0-987e-4dae6163ebe2\") " pod="calico-system/calico-node-kv87x" Mar 7 00:55:50.568743 kubelet[3252]: I0307 00:55:50.568543 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/7c9835cd-e1c2-42a0-987e-4dae6163ebe2-var-lib-calico\") pod \"calico-node-kv87x\" (UID: \"7c9835cd-e1c2-42a0-987e-4dae6163ebe2\") " pod="calico-system/calico-node-kv87x" Mar 7 00:55:50.568743 kubelet[3252]: I0307 00:55:50.568601 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2p26\" (UniqueName: \"kubernetes.io/projected/7c9835cd-e1c2-42a0-987e-4dae6163ebe2-kube-api-access-g2p26\") pod \"calico-node-kv87x\" (UID: \"7c9835cd-e1c2-42a0-987e-4dae6163ebe2\") " pod="calico-system/calico-node-kv87x" Mar 7 00:55:50.568743 kubelet[3252]: I0307 00:55:50.568645 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: 
\"kubernetes.io/host-path/7c9835cd-e1c2-42a0-987e-4dae6163ebe2-flexvol-driver-host\") pod \"calico-node-kv87x\" (UID: \"7c9835cd-e1c2-42a0-987e-4dae6163ebe2\") " pod="calico-system/calico-node-kv87x" Mar 7 00:55:50.569135 kubelet[3252]: I0307 00:55:50.568686 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/7c9835cd-e1c2-42a0-987e-4dae6163ebe2-node-certs\") pod \"calico-node-kv87x\" (UID: \"7c9835cd-e1c2-42a0-987e-4dae6163ebe2\") " pod="calico-system/calico-node-kv87x" Mar 7 00:55:50.569135 kubelet[3252]: I0307 00:55:50.568737 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/7c9835cd-e1c2-42a0-987e-4dae6163ebe2-xtables-lock\") pod \"calico-node-kv87x\" (UID: \"7c9835cd-e1c2-42a0-987e-4dae6163ebe2\") " pod="calico-system/calico-node-kv87x" Mar 7 00:55:50.569135 kubelet[3252]: I0307 00:55:50.568810 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/7c9835cd-e1c2-42a0-987e-4dae6163ebe2-cni-log-dir\") pod \"calico-node-kv87x\" (UID: \"7c9835cd-e1c2-42a0-987e-4dae6163ebe2\") " pod="calico-system/calico-node-kv87x" Mar 7 00:55:50.569135 kubelet[3252]: I0307 00:55:50.568868 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/7c9835cd-e1c2-42a0-987e-4dae6163ebe2-nodeproc\") pod \"calico-node-kv87x\" (UID: \"7c9835cd-e1c2-42a0-987e-4dae6163ebe2\") " pod="calico-system/calico-node-kv87x" Mar 7 00:55:50.569135 kubelet[3252]: I0307 00:55:50.568924 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7c9835cd-e1c2-42a0-987e-4dae6163ebe2-lib-modules\") pod 
\"calico-node-kv87x\" (UID: \"7c9835cd-e1c2-42a0-987e-4dae6163ebe2\") " pod="calico-system/calico-node-kv87x" Mar 7 00:55:50.569431 kubelet[3252]: I0307 00:55:50.568963 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/7c9835cd-e1c2-42a0-987e-4dae6163ebe2-policysync\") pod \"calico-node-kv87x\" (UID: \"7c9835cd-e1c2-42a0-987e-4dae6163ebe2\") " pod="calico-system/calico-node-kv87x" Mar 7 00:55:50.569431 kubelet[3252]: I0307 00:55:50.569015 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7c9835cd-e1c2-42a0-987e-4dae6163ebe2-sys-fs\") pod \"calico-node-kv87x\" (UID: \"7c9835cd-e1c2-42a0-987e-4dae6163ebe2\") " pod="calico-system/calico-node-kv87x" Mar 7 00:55:50.569431 kubelet[3252]: I0307 00:55:50.569063 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/7c9835cd-e1c2-42a0-987e-4dae6163ebe2-cni-bin-dir\") pod \"calico-node-kv87x\" (UID: \"7c9835cd-e1c2-42a0-987e-4dae6163ebe2\") " pod="calico-system/calico-node-kv87x" Mar 7 00:55:50.569431 kubelet[3252]: I0307 00:55:50.569138 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c9835cd-e1c2-42a0-987e-4dae6163ebe2-tigera-ca-bundle\") pod \"calico-node-kv87x\" (UID: \"7c9835cd-e1c2-42a0-987e-4dae6163ebe2\") " pod="calico-system/calico-node-kv87x" Mar 7 00:55:50.569431 kubelet[3252]: I0307 00:55:50.569181 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/7c9835cd-e1c2-42a0-987e-4dae6163ebe2-var-run-calico\") pod \"calico-node-kv87x\" (UID: \"7c9835cd-e1c2-42a0-987e-4dae6163ebe2\") " 
pod="calico-system/calico-node-kv87x" Mar 7 00:55:50.628071 containerd[2022]: time="2026-03-07T00:55:50.627883320Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:55:50.628071 containerd[2022]: time="2026-03-07T00:55:50.627979176Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:55:50.628071 containerd[2022]: time="2026-03-07T00:55:50.628019292Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:55:50.628586 containerd[2022]: time="2026-03-07T00:55:50.628193028Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:55:50.704230 systemd[1]: Started cri-containerd-c2ac0bdca421ee5b8dea25364f536b6bee46df826ad5bf81ec67dcc04eba50d4.scope - libcontainer container c2ac0bdca421ee5b8dea25364f536b6bee46df826ad5bf81ec67dcc04eba50d4. Mar 7 00:55:50.708710 kubelet[3252]: E0307 00:55:50.708655 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.708925 kubelet[3252]: W0307 00:55:50.708890 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.709103 kubelet[3252]: E0307 00:55:50.709076 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.720871 kubelet[3252]: E0307 00:55:50.720811 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.720871 kubelet[3252]: W0307 00:55:50.720853 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.721094 kubelet[3252]: E0307 00:55:50.720891 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.753343 kubelet[3252]: E0307 00:55:50.752421 3252 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tgsjl" podUID="c78279c1-73d2-4b1a-bfd7-356a67561cbc" Mar 7 00:55:50.754696 kubelet[3252]: E0307 00:55:50.754515 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.754696 kubelet[3252]: W0307 00:55:50.754550 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.755562 kubelet[3252]: E0307 00:55:50.754974 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.756958 kubelet[3252]: E0307 00:55:50.756821 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.756958 kubelet[3252]: W0307 00:55:50.756890 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.757688 kubelet[3252]: E0307 00:55:50.756928 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.760292 kubelet[3252]: E0307 00:55:50.760023 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.760292 kubelet[3252]: W0307 00:55:50.760057 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.760292 kubelet[3252]: E0307 00:55:50.760089 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.761400 kubelet[3252]: E0307 00:55:50.761325 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.762479 kubelet[3252]: W0307 00:55:50.761357 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.762479 kubelet[3252]: E0307 00:55:50.761703 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.763584 kubelet[3252]: E0307 00:55:50.763261 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.763584 kubelet[3252]: W0307 00:55:50.763298 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.763584 kubelet[3252]: E0307 00:55:50.763332 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.765850 kubelet[3252]: E0307 00:55:50.765640 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.765850 kubelet[3252]: W0307 00:55:50.765699 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.765850 kubelet[3252]: E0307 00:55:50.765734 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.766904 kubelet[3252]: E0307 00:55:50.766655 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.766904 kubelet[3252]: W0307 00:55:50.766688 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.766904 kubelet[3252]: E0307 00:55:50.766718 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.767359 kubelet[3252]: E0307 00:55:50.767276 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.767359 kubelet[3252]: W0307 00:55:50.767302 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.767359 kubelet[3252]: E0307 00:55:50.767328 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.768895 kubelet[3252]: E0307 00:55:50.768563 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.768895 kubelet[3252]: W0307 00:55:50.768597 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.768895 kubelet[3252]: E0307 00:55:50.768629 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.770890 kubelet[3252]: E0307 00:55:50.770669 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.770890 kubelet[3252]: W0307 00:55:50.770726 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.770890 kubelet[3252]: E0307 00:55:50.770764 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.772609 kubelet[3252]: E0307 00:55:50.772013 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.772609 kubelet[3252]: W0307 00:55:50.772047 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.772609 kubelet[3252]: E0307 00:55:50.772081 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.774523 kubelet[3252]: E0307 00:55:50.773983 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.774523 kubelet[3252]: W0307 00:55:50.774052 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.774523 kubelet[3252]: E0307 00:55:50.774088 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.776902 kubelet[3252]: E0307 00:55:50.775768 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.776902 kubelet[3252]: W0307 00:55:50.775802 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.776902 kubelet[3252]: E0307 00:55:50.775836 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.777590 kubelet[3252]: E0307 00:55:50.777299 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.777590 kubelet[3252]: W0307 00:55:50.777333 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.777590 kubelet[3252]: E0307 00:55:50.777367 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.778421 kubelet[3252]: E0307 00:55:50.778230 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.779350 kubelet[3252]: W0307 00:55:50.778734 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.779350 kubelet[3252]: E0307 00:55:50.778786 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.780243 kubelet[3252]: E0307 00:55:50.780123 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.781622 kubelet[3252]: W0307 00:55:50.781337 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.781622 kubelet[3252]: E0307 00:55:50.781412 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.783722 kubelet[3252]: E0307 00:55:50.783015 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.783722 kubelet[3252]: W0307 00:55:50.783049 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.783722 kubelet[3252]: E0307 00:55:50.783165 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.785892 kubelet[3252]: E0307 00:55:50.785155 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.785892 kubelet[3252]: W0307 00:55:50.785189 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.785892 kubelet[3252]: E0307 00:55:50.785224 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.787984 kubelet[3252]: E0307 00:55:50.787733 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.787984 kubelet[3252]: W0307 00:55:50.787766 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.787984 kubelet[3252]: E0307 00:55:50.787825 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.789196 kubelet[3252]: E0307 00:55:50.788913 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.789196 kubelet[3252]: W0307 00:55:50.788971 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.789196 kubelet[3252]: E0307 00:55:50.789007 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.790174 kubelet[3252]: E0307 00:55:50.790085 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.790174 kubelet[3252]: W0307 00:55:50.790121 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.791935 kubelet[3252]: E0307 00:55:50.790198 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.791935 kubelet[3252]: I0307 00:55:50.790255 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/c78279c1-73d2-4b1a-bfd7-356a67561cbc-varrun\") pod \"csi-node-driver-tgsjl\" (UID: \"c78279c1-73d2-4b1a-bfd7-356a67561cbc\") " pod="calico-system/csi-node-driver-tgsjl" Mar 7 00:55:50.791935 kubelet[3252]: E0307 00:55:50.791330 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.791935 kubelet[3252]: W0307 00:55:50.791360 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.791935 kubelet[3252]: E0307 00:55:50.791613 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.791935 kubelet[3252]: I0307 00:55:50.791691 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c78279c1-73d2-4b1a-bfd7-356a67561cbc-kubelet-dir\") pod \"csi-node-driver-tgsjl\" (UID: \"c78279c1-73d2-4b1a-bfd7-356a67561cbc\") " pod="calico-system/csi-node-driver-tgsjl" Mar 7 00:55:50.795242 kubelet[3252]: E0307 00:55:50.795035 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.795242 kubelet[3252]: W0307 00:55:50.795091 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.795242 kubelet[3252]: E0307 00:55:50.795127 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.799356 kubelet[3252]: E0307 00:55:50.799029 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.799356 kubelet[3252]: W0307 00:55:50.799086 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.799356 kubelet[3252]: E0307 00:55:50.799129 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.801731 kubelet[3252]: E0307 00:55:50.800415 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.801731 kubelet[3252]: W0307 00:55:50.800448 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.801731 kubelet[3252]: E0307 00:55:50.800483 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.804144 kubelet[3252]: E0307 00:55:50.803773 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.804144 kubelet[3252]: W0307 00:55:50.803812 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.804144 kubelet[3252]: E0307 00:55:50.803846 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.804798 kubelet[3252]: E0307 00:55:50.804739 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.804798 kubelet[3252]: W0307 00:55:50.804781 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.805124 kubelet[3252]: E0307 00:55:50.804817 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.806698 kubelet[3252]: E0307 00:55:50.805400 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.806698 kubelet[3252]: W0307 00:55:50.805434 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.806698 kubelet[3252]: E0307 00:55:50.805464 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.806698 kubelet[3252]: I0307 00:55:50.806453 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c78279c1-73d2-4b1a-bfd7-356a67561cbc-socket-dir\") pod \"csi-node-driver-tgsjl\" (UID: \"c78279c1-73d2-4b1a-bfd7-356a67561cbc\") " pod="calico-system/csi-node-driver-tgsjl" Mar 7 00:55:50.807562 kubelet[3252]: E0307 00:55:50.806708 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.807562 kubelet[3252]: W0307 00:55:50.807251 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.807562 kubelet[3252]: E0307 00:55:50.807288 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.809614 kubelet[3252]: E0307 00:55:50.807792 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.809614 kubelet[3252]: W0307 00:55:50.807814 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.809614 kubelet[3252]: E0307 00:55:50.807840 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.809614 kubelet[3252]: I0307 00:55:50.807875 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c78279c1-73d2-4b1a-bfd7-356a67561cbc-registration-dir\") pod \"csi-node-driver-tgsjl\" (UID: \"c78279c1-73d2-4b1a-bfd7-356a67561cbc\") " pod="calico-system/csi-node-driver-tgsjl" Mar 7 00:55:50.810334 kubelet[3252]: E0307 00:55:50.809967 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.810334 kubelet[3252]: W0307 00:55:50.810002 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.810334 kubelet[3252]: E0307 00:55:50.810056 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.810334 kubelet[3252]: I0307 00:55:50.810100 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkm4z\" (UniqueName: \"kubernetes.io/projected/c78279c1-73d2-4b1a-bfd7-356a67561cbc-kube-api-access-zkm4z\") pod \"csi-node-driver-tgsjl\" (UID: \"c78279c1-73d2-4b1a-bfd7-356a67561cbc\") " pod="calico-system/csi-node-driver-tgsjl" Mar 7 00:55:50.811953 kubelet[3252]: E0307 00:55:50.811044 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.811953 kubelet[3252]: W0307 00:55:50.811079 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.811953 kubelet[3252]: E0307 00:55:50.811114 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.815611 kubelet[3252]: E0307 00:55:50.815555 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.815611 kubelet[3252]: W0307 00:55:50.815597 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.815772 kubelet[3252]: E0307 00:55:50.815635 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.817171 kubelet[3252]: E0307 00:55:50.816094 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.817171 kubelet[3252]: W0307 00:55:50.816130 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.817171 kubelet[3252]: E0307 00:55:50.816159 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.819813 kubelet[3252]: E0307 00:55:50.818325 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.819813 kubelet[3252]: W0307 00:55:50.818411 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.819813 kubelet[3252]: E0307 00:55:50.818492 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.854189 containerd[2022]: time="2026-03-07T00:55:50.853664917Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-kv87x,Uid:7c9835cd-e1c2-42a0-987e-4dae6163ebe2,Namespace:calico-system,Attempt:0,}" Mar 7 00:55:50.911520 kubelet[3252]: E0307 00:55:50.911461 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.911520 kubelet[3252]: W0307 00:55:50.911502 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.911761 kubelet[3252]: E0307 00:55:50.911539 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.912967 kubelet[3252]: E0307 00:55:50.912886 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.912967 kubelet[3252]: W0307 00:55:50.912930 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.912967 kubelet[3252]: E0307 00:55:50.912967 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.915658 kubelet[3252]: E0307 00:55:50.915012 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.915658 kubelet[3252]: W0307 00:55:50.915072 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.915658 kubelet[3252]: E0307 00:55:50.915109 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.917430 kubelet[3252]: E0307 00:55:50.916658 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.917430 kubelet[3252]: W0307 00:55:50.916692 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.917430 kubelet[3252]: E0307 00:55:50.916726 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.919644 kubelet[3252]: E0307 00:55:50.918419 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.919644 kubelet[3252]: W0307 00:55:50.918602 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.919644 kubelet[3252]: E0307 00:55:50.918641 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.921699 kubelet[3252]: E0307 00:55:50.921185 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.921699 kubelet[3252]: W0307 00:55:50.921269 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.921699 kubelet[3252]: E0307 00:55:50.921330 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.925127 kubelet[3252]: E0307 00:55:50.924800 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.925127 kubelet[3252]: W0307 00:55:50.924837 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.925127 kubelet[3252]: E0307 00:55:50.924983 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.928641 kubelet[3252]: E0307 00:55:50.928108 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.928641 kubelet[3252]: W0307 00:55:50.928140 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.928641 kubelet[3252]: E0307 00:55:50.928174 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.931395 kubelet[3252]: E0307 00:55:50.931079 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.931395 kubelet[3252]: W0307 00:55:50.931138 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.931395 kubelet[3252]: E0307 00:55:50.931269 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.933200 kubelet[3252]: E0307 00:55:50.932777 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.933200 kubelet[3252]: W0307 00:55:50.933011 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.933200 kubelet[3252]: E0307 00:55:50.933054 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.937704 kubelet[3252]: E0307 00:55:50.936095 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.937704 kubelet[3252]: W0307 00:55:50.936143 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.937704 kubelet[3252]: E0307 00:55:50.936181 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.937994 containerd[2022]: time="2026-03-07T00:55:50.936753758Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:55:50.937994 containerd[2022]: time="2026-03-07T00:55:50.936873614Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:55:50.937994 containerd[2022]: time="2026-03-07T00:55:50.936901166Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:55:50.940058 containerd[2022]: time="2026-03-07T00:55:50.938849138Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:55:50.940248 kubelet[3252]: E0307 00:55:50.939060 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.940248 kubelet[3252]: W0307 00:55:50.939086 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.940248 kubelet[3252]: E0307 00:55:50.939189 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.942417 kubelet[3252]: E0307 00:55:50.941538 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.942417 kubelet[3252]: W0307 00:55:50.941571 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.942417 kubelet[3252]: E0307 00:55:50.941634 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.943276 kubelet[3252]: E0307 00:55:50.942949 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.943276 kubelet[3252]: W0307 00:55:50.943002 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.943276 kubelet[3252]: E0307 00:55:50.943036 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.947439 kubelet[3252]: E0307 00:55:50.946652 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.947439 kubelet[3252]: W0307 00:55:50.946689 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.947439 kubelet[3252]: E0307 00:55:50.946726 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.949848 kubelet[3252]: E0307 00:55:50.949552 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.949848 kubelet[3252]: W0307 00:55:50.949586 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.949848 kubelet[3252]: E0307 00:55:50.949621 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.950361 kubelet[3252]: E0307 00:55:50.950322 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.950934 kubelet[3252]: W0307 00:55:50.950662 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.950934 kubelet[3252]: E0307 00:55:50.950705 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.953734 kubelet[3252]: E0307 00:55:50.953615 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.954000 kubelet[3252]: W0307 00:55:50.953966 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.955667 kubelet[3252]: E0307 00:55:50.954114 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.955861 kubelet[3252]: E0307 00:55:50.955820 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.955926 kubelet[3252]: W0307 00:55:50.955855 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.955926 kubelet[3252]: E0307 00:55:50.955891 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.957795 kubelet[3252]: E0307 00:55:50.957742 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.957795 kubelet[3252]: W0307 00:55:50.957781 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.958022 kubelet[3252]: E0307 00:55:50.957819 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.958292 kubelet[3252]: E0307 00:55:50.958257 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.958292 kubelet[3252]: W0307 00:55:50.958285 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.958448 kubelet[3252]: E0307 00:55:50.958311 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.958890 kubelet[3252]: E0307 00:55:50.958850 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.958890 kubelet[3252]: W0307 00:55:50.958881 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.959037 kubelet[3252]: E0307 00:55:50.958910 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.963841 kubelet[3252]: E0307 00:55:50.963773 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.963841 kubelet[3252]: W0307 00:55:50.963813 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.964069 kubelet[3252]: E0307 00:55:50.963851 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:50.964862 kubelet[3252]: E0307 00:55:50.964800 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.964862 kubelet[3252]: W0307 00:55:50.964840 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.965066 kubelet[3252]: E0307 00:55:50.964877 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:50.966940 kubelet[3252]: E0307 00:55:50.966888 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:50.966940 kubelet[3252]: W0307 00:55:50.966926 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:50.967160 kubelet[3252]: E0307 00:55:50.966966 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:51.005727 systemd[1]: Started cri-containerd-8a5fb3f59cf248c7f04f4cf75b8542190b7df07b0a50b2ac8991370be569f490.scope - libcontainer container 8a5fb3f59cf248c7f04f4cf75b8542190b7df07b0a50b2ac8991370be569f490. 
Mar 7 00:55:51.013930 kubelet[3252]: E0307 00:55:51.011346 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:51.013930 kubelet[3252]: W0307 00:55:51.011879 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:51.013930 kubelet[3252]: E0307 00:55:51.011948 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:51.041545 containerd[2022]: time="2026-03-07T00:55:51.041094370Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-77867d5cb5-xgfn5,Uid:d55b9b8d-593f-49da-8710-4be00328664e,Namespace:calico-system,Attempt:0,} returns sandbox id \"c2ac0bdca421ee5b8dea25364f536b6bee46df826ad5bf81ec67dcc04eba50d4\"" Mar 7 00:55:51.046679 containerd[2022]: time="2026-03-07T00:55:51.046503058Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Mar 7 00:55:51.094133 containerd[2022]: time="2026-03-07T00:55:51.094073531Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-kv87x,Uid:7c9835cd-e1c2-42a0-987e-4dae6163ebe2,Namespace:calico-system,Attempt:0,} returns sandbox id \"8a5fb3f59cf248c7f04f4cf75b8542190b7df07b0a50b2ac8991370be569f490\"" Mar 7 00:55:51.388938 systemd[1]: run-containerd-runc-k8s.io-c2ac0bdca421ee5b8dea25364f536b6bee46df826ad5bf81ec67dcc04eba50d4-runc.Q91DZw.mount: Deactivated successfully. 
Mar 7 00:55:52.344901 kubelet[3252]: E0307 00:55:52.344822 3252 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tgsjl" podUID="c78279c1-73d2-4b1a-bfd7-356a67561cbc" Mar 7 00:55:52.384081 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3231451999.mount: Deactivated successfully. Mar 7 00:55:53.442481 containerd[2022]: time="2026-03-07T00:55:53.441437354Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:55:53.444681 containerd[2022]: time="2026-03-07T00:55:53.444327818Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=33865174" Mar 7 00:55:53.447142 containerd[2022]: time="2026-03-07T00:55:53.446853362Z" level=info msg="ImageCreate event name:\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:55:53.452507 containerd[2022]: time="2026-03-07T00:55:53.451954754Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:55:53.453503 containerd[2022]: time="2026-03-07T00:55:53.453440114Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"33865028\" in 2.40684246s" Mar 7 00:55:53.453631 containerd[2022]: time="2026-03-07T00:55:53.453502934Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\"" Mar 7 00:55:53.458676 containerd[2022]: time="2026-03-07T00:55:53.458606030Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Mar 7 00:55:53.503365 containerd[2022]: time="2026-03-07T00:55:53.503298615Z" level=info msg="CreateContainer within sandbox \"c2ac0bdca421ee5b8dea25364f536b6bee46df826ad5bf81ec67dcc04eba50d4\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 7 00:55:53.533271 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3586462659.mount: Deactivated successfully. Mar 7 00:55:53.538894 containerd[2022]: time="2026-03-07T00:55:53.538807959Z" level=info msg="CreateContainer within sandbox \"c2ac0bdca421ee5b8dea25364f536b6bee46df826ad5bf81ec67dcc04eba50d4\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"0a77b9cd42664656e83c2e32e52c19e4eb3562c5182ffe52609c8bc179b94511\"" Mar 7 00:55:53.540884 containerd[2022]: time="2026-03-07T00:55:53.540702447Z" level=info msg="StartContainer for \"0a77b9cd42664656e83c2e32e52c19e4eb3562c5182ffe52609c8bc179b94511\"" Mar 7 00:55:53.605782 systemd[1]: Started cri-containerd-0a77b9cd42664656e83c2e32e52c19e4eb3562c5182ffe52609c8bc179b94511.scope - libcontainer container 0a77b9cd42664656e83c2e32e52c19e4eb3562c5182ffe52609c8bc179b94511. 
Mar 7 00:55:53.680762 containerd[2022]: time="2026-03-07T00:55:53.680586579Z" level=info msg="StartContainer for \"0a77b9cd42664656e83c2e32e52c19e4eb3562c5182ffe52609c8bc179b94511\" returns successfully" Mar 7 00:55:54.342971 kubelet[3252]: E0307 00:55:54.342903 3252 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tgsjl" podUID="c78279c1-73d2-4b1a-bfd7-356a67561cbc" Mar 7 00:55:54.519186 kubelet[3252]: E0307 00:55:54.519116 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:54.519529 kubelet[3252]: W0307 00:55:54.519450 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:54.519662 kubelet[3252]: E0307 00:55:54.519636 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:54.520207 kubelet[3252]: E0307 00:55:54.520180 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:54.520339 kubelet[3252]: W0307 00:55:54.520313 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:54.520472 kubelet[3252]: E0307 00:55:54.520449 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:54.536407 kubelet[3252]: E0307 00:55:54.535240 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:54.536407 kubelet[3252]: W0307 00:55:54.535286 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:54.536407 kubelet[3252]: E0307 00:55:54.535322 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:54.537299 kubelet[3252]: I0307 00:55:54.535903 3252 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-typha-77867d5cb5-xgfn5" podStartSLOduration=2.125684964 podStartE2EDuration="4.535887052s" podCreationTimestamp="2026-03-07 00:55:50 +0000 UTC" firstStartedPulling="2026-03-07 00:55:51.045843694 +0000 UTC m=+29.039058337" lastFinishedPulling="2026-03-07 00:55:53.456045782 +0000 UTC m=+31.449260425" observedRunningTime="2026-03-07 00:55:54.531795292 +0000 UTC m=+32.525009971" watchObservedRunningTime="2026-03-07 00:55:54.535887052 +0000 UTC m=+32.529101695" Mar 7 00:55:54.575506 kubelet[3252]: E0307 00:55:54.575462 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:54.575506 kubelet[3252]: W0307 00:55:54.575499 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:54.575817 kubelet[3252]: E0307 00:55:54.575535 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:55:54.587269 kubelet[3252]: E0307 00:55:54.587181 3252 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:55:54.587269 kubelet[3252]: W0307 00:55:54.587205 3252 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:55:54.587269 kubelet[3252]: E0307 00:55:54.587228 3252 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:55:54.745456 containerd[2022]: time="2026-03-07T00:55:54.744547217Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:55:54.746580 containerd[2022]: time="2026-03-07T00:55:54.746230265Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4457682" Mar 7 00:55:54.749445 containerd[2022]: time="2026-03-07T00:55:54.748580117Z" level=info msg="ImageCreate event name:\"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:55:54.754286 containerd[2022]: time="2026-03-07T00:55:54.754224473Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:55:54.755514 containerd[2022]: time="2026-03-07T00:55:54.755459465Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"5855167\" in 1.296718555s" Mar 7 00:55:54.755609 containerd[2022]: time="2026-03-07T00:55:54.755518361Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\"" Mar 7 00:55:54.765520 containerd[2022]: time="2026-03-07T00:55:54.765448793Z" level=info msg="CreateContainer within sandbox \"8a5fb3f59cf248c7f04f4cf75b8542190b7df07b0a50b2ac8991370be569f490\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 7 00:55:54.796557 containerd[2022]: time="2026-03-07T00:55:54.796476773Z" level=info msg="CreateContainer within sandbox \"8a5fb3f59cf248c7f04f4cf75b8542190b7df07b0a50b2ac8991370be569f490\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"4eb775629a3f557be9020f0dce3192356c23218895e1ddb151ecb10f353e4c1e\"" Mar 7 00:55:54.800283 containerd[2022]: time="2026-03-07T00:55:54.797718461Z" level=info msg="StartContainer for \"4eb775629a3f557be9020f0dce3192356c23218895e1ddb151ecb10f353e4c1e\"" Mar 7 00:55:54.862764 systemd[1]: Started cri-containerd-4eb775629a3f557be9020f0dce3192356c23218895e1ddb151ecb10f353e4c1e.scope - libcontainer container 4eb775629a3f557be9020f0dce3192356c23218895e1ddb151ecb10f353e4c1e. Mar 7 00:55:54.921424 containerd[2022]: time="2026-03-07T00:55:54.921331914Z" level=info msg="StartContainer for \"4eb775629a3f557be9020f0dce3192356c23218895e1ddb151ecb10f353e4c1e\" returns successfully" Mar 7 00:55:54.955801 systemd[1]: cri-containerd-4eb775629a3f557be9020f0dce3192356c23218895e1ddb151ecb10f353e4c1e.scope: Deactivated successfully. 
Mar 7 00:55:55.128525 containerd[2022]: time="2026-03-07T00:55:55.128363367Z" level=info msg="shim disconnected" id=4eb775629a3f557be9020f0dce3192356c23218895e1ddb151ecb10f353e4c1e namespace=k8s.io Mar 7 00:55:55.128525 containerd[2022]: time="2026-03-07T00:55:55.128476647Z" level=warning msg="cleaning up after shim disconnected" id=4eb775629a3f557be9020f0dce3192356c23218895e1ddb151ecb10f353e4c1e namespace=k8s.io Mar 7 00:55:55.128525 containerd[2022]: time="2026-03-07T00:55:55.128499339Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 00:55:55.475966 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4eb775629a3f557be9020f0dce3192356c23218895e1ddb151ecb10f353e4c1e-rootfs.mount: Deactivated successfully. Mar 7 00:55:55.515865 kubelet[3252]: I0307 00:55:55.515798 3252 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 7 00:55:55.520607 containerd[2022]: time="2026-03-07T00:55:55.520472873Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 7 00:55:56.344724 kubelet[3252]: E0307 00:55:56.343443 3252 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tgsjl" podUID="c78279c1-73d2-4b1a-bfd7-356a67561cbc" Mar 7 00:55:58.340839 kubelet[3252]: E0307 00:55:58.340777 3252 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tgsjl" podUID="c78279c1-73d2-4b1a-bfd7-356a67561cbc" Mar 7 00:56:00.344000 kubelet[3252]: E0307 00:56:00.343933 3252 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tgsjl" podUID="c78279c1-73d2-4b1a-bfd7-356a67561cbc" Mar 7 00:56:01.812921 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1824849440.mount: Deactivated successfully. Mar 7 00:56:01.877248 containerd[2022]: time="2026-03-07T00:56:01.877185132Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:01.879407 containerd[2022]: time="2026-03-07T00:56:01.879319212Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=153921674" Mar 7 00:56:01.882623 containerd[2022]: time="2026-03-07T00:56:01.882556560Z" level=info msg="ImageCreate event name:\"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:01.889566 containerd[2022]: time="2026-03-07T00:56:01.889472184Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:01.891320 containerd[2022]: time="2026-03-07T00:56:01.891270492Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"153921536\" in 6.370688467s" Mar 7 00:56:01.891520 containerd[2022]: time="2026-03-07T00:56:01.891488460Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\"" Mar 7 00:56:01.902024 containerd[2022]: 
time="2026-03-07T00:56:01.901955412Z" level=info msg="CreateContainer within sandbox \"8a5fb3f59cf248c7f04f4cf75b8542190b7df07b0a50b2ac8991370be569f490\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 7 00:56:01.935512 containerd[2022]: time="2026-03-07T00:56:01.935323740Z" level=info msg="CreateContainer within sandbox \"8a5fb3f59cf248c7f04f4cf75b8542190b7df07b0a50b2ac8991370be569f490\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"4b0769bdfa57fefa2b5ec96e6e5b656effb9855fe5d560baf07fab7de58f6bff\"" Mar 7 00:56:01.938417 containerd[2022]: time="2026-03-07T00:56:01.938306700Z" level=info msg="StartContainer for \"4b0769bdfa57fefa2b5ec96e6e5b656effb9855fe5d560baf07fab7de58f6bff\"" Mar 7 00:56:02.000855 systemd[1]: Started cri-containerd-4b0769bdfa57fefa2b5ec96e6e5b656effb9855fe5d560baf07fab7de58f6bff.scope - libcontainer container 4b0769bdfa57fefa2b5ec96e6e5b656effb9855fe5d560baf07fab7de58f6bff. Mar 7 00:56:02.055076 containerd[2022]: time="2026-03-07T00:56:02.054844893Z" level=info msg="StartContainer for \"4b0769bdfa57fefa2b5ec96e6e5b656effb9855fe5d560baf07fab7de58f6bff\" returns successfully" Mar 7 00:56:02.253935 systemd[1]: cri-containerd-4b0769bdfa57fefa2b5ec96e6e5b656effb9855fe5d560baf07fab7de58f6bff.scope: Deactivated successfully. 
Mar 7 00:56:02.344243 kubelet[3252]: E0307 00:56:02.342610 3252 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tgsjl" podUID="c78279c1-73d2-4b1a-bfd7-356a67561cbc" Mar 7 00:56:02.779242 containerd[2022]: time="2026-03-07T00:56:02.779090017Z" level=info msg="shim disconnected" id=4b0769bdfa57fefa2b5ec96e6e5b656effb9855fe5d560baf07fab7de58f6bff namespace=k8s.io Mar 7 00:56:02.779242 containerd[2022]: time="2026-03-07T00:56:02.779189269Z" level=warning msg="cleaning up after shim disconnected" id=4b0769bdfa57fefa2b5ec96e6e5b656effb9855fe5d560baf07fab7de58f6bff namespace=k8s.io Mar 7 00:56:02.779854 containerd[2022]: time="2026-03-07T00:56:02.779211877Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 00:56:02.811623 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4b0769bdfa57fefa2b5ec96e6e5b656effb9855fe5d560baf07fab7de58f6bff-rootfs.mount: Deactivated successfully. 
Mar 7 00:56:03.553181 containerd[2022]: time="2026-03-07T00:56:03.553003668Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Mar 7 00:56:04.341009 kubelet[3252]: E0307 00:56:04.340946 3252 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tgsjl" podUID="c78279c1-73d2-4b1a-bfd7-356a67561cbc" Mar 7 00:56:06.341209 kubelet[3252]: E0307 00:56:06.341123 3252 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tgsjl" podUID="c78279c1-73d2-4b1a-bfd7-356a67561cbc" Mar 7 00:56:06.550069 containerd[2022]: time="2026-03-07T00:56:06.549088383Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:06.551438 containerd[2022]: time="2026-03-07T00:56:06.551136711Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=66009216" Mar 7 00:56:06.553633 containerd[2022]: time="2026-03-07T00:56:06.553548099Z" level=info msg="ImageCreate event name:\"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:06.559731 containerd[2022]: time="2026-03-07T00:56:06.559637295Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:06.561559 containerd[2022]: time="2026-03-07T00:56:06.561345531Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with 
image id \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"67406741\" in 3.008057523s" Mar 7 00:56:06.561559 containerd[2022]: time="2026-03-07T00:56:06.561420687Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\"" Mar 7 00:56:06.571363 containerd[2022]: time="2026-03-07T00:56:06.571309695Z" level=info msg="CreateContainer within sandbox \"8a5fb3f59cf248c7f04f4cf75b8542190b7df07b0a50b2ac8991370be569f490\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 7 00:56:06.609771 containerd[2022]: time="2026-03-07T00:56:06.609689872Z" level=info msg="CreateContainer within sandbox \"8a5fb3f59cf248c7f04f4cf75b8542190b7df07b0a50b2ac8991370be569f490\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"5be4b7d05cd9200525d8fa7f56a5f32facc8d2de9ec5f57ad9f64bb2e6aa7f9d\"" Mar 7 00:56:06.613137 containerd[2022]: time="2026-03-07T00:56:06.610814476Z" level=info msg="StartContainer for \"5be4b7d05cd9200525d8fa7f56a5f32facc8d2de9ec5f57ad9f64bb2e6aa7f9d\"" Mar 7 00:56:06.672727 systemd[1]: Started cri-containerd-5be4b7d05cd9200525d8fa7f56a5f32facc8d2de9ec5f57ad9f64bb2e6aa7f9d.scope - libcontainer container 5be4b7d05cd9200525d8fa7f56a5f32facc8d2de9ec5f57ad9f64bb2e6aa7f9d. 
Mar 7 00:56:06.735039 containerd[2022]: time="2026-03-07T00:56:06.734873176Z" level=info msg="StartContainer for \"5be4b7d05cd9200525d8fa7f56a5f32facc8d2de9ec5f57ad9f64bb2e6aa7f9d\" returns successfully" Mar 7 00:56:08.341989 kubelet[3252]: E0307 00:56:08.341829 3252 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tgsjl" podUID="c78279c1-73d2-4b1a-bfd7-356a67561cbc" Mar 7 00:56:08.633042 systemd[1]: cri-containerd-5be4b7d05cd9200525d8fa7f56a5f32facc8d2de9ec5f57ad9f64bb2e6aa7f9d.scope: Deactivated successfully. Mar 7 00:56:08.633559 systemd[1]: cri-containerd-5be4b7d05cd9200525d8fa7f56a5f32facc8d2de9ec5f57ad9f64bb2e6aa7f9d.scope: Consumed 1.041s CPU time. Mar 7 00:56:08.680797 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5be4b7d05cd9200525d8fa7f56a5f32facc8d2de9ec5f57ad9f64bb2e6aa7f9d-rootfs.mount: Deactivated successfully. 
Mar 7 00:56:08.689838 containerd[2022]: time="2026-03-07T00:56:08.689734962Z" level=info msg="shim disconnected" id=5be4b7d05cd9200525d8fa7f56a5f32facc8d2de9ec5f57ad9f64bb2e6aa7f9d namespace=k8s.io Mar 7 00:56:08.690648 containerd[2022]: time="2026-03-07T00:56:08.689835186Z" level=warning msg="cleaning up after shim disconnected" id=5be4b7d05cd9200525d8fa7f56a5f32facc8d2de9ec5f57ad9f64bb2e6aa7f9d namespace=k8s.io Mar 7 00:56:08.690648 containerd[2022]: time="2026-03-07T00:56:08.689859366Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 00:56:08.721540 containerd[2022]: time="2026-03-07T00:56:08.718661346Z" level=warning msg="cleanup warnings time=\"2026-03-07T00:56:08Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Mar 7 00:56:08.724210 kubelet[3252]: I0307 00:56:08.722180 3252 kubelet_node_status.go:427] "Fast updating node status as it just became ready" Mar 7 00:56:08.820773 systemd[1]: Created slice kubepods-burstable-podd42faf4e_65f6_4396_aa30_a3b1cd2cc8f9.slice - libcontainer container kubepods-burstable-podd42faf4e_65f6_4396_aa30_a3b1cd2cc8f9.slice. Mar 7 00:56:08.852724 systemd[1]: Created slice kubepods-burstable-pod6a33fd83_0f26_4bf6_bbf7_f61c96127d50.slice - libcontainer container kubepods-burstable-pod6a33fd83_0f26_4bf6_bbf7_f61c96127d50.slice. Mar 7 00:56:08.879573 systemd[1]: Created slice kubepods-besteffort-pod695f6498_15b6_4241_b99a_44f0028694da.slice - libcontainer container kubepods-besteffort-pod695f6498_15b6_4241_b99a_44f0028694da.slice. Mar 7 00:56:08.898849 systemd[1]: Created slice kubepods-besteffort-pod1d414626_8c38_4470_b252_d2b0f7ae78d8.slice - libcontainer container kubepods-besteffort-pod1d414626_8c38_4470_b252_d2b0f7ae78d8.slice. 
Mar 7 00:56:08.900376 kubelet[3252]: I0307 00:56:08.900288 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj5cv\" (UniqueName: \"kubernetes.io/projected/1d414626-8c38-4470-b252-d2b0f7ae78d8-kube-api-access-dj5cv\") pod \"calico-apiserver-7fc9dd7d97-2bt9t\" (UID: \"1d414626-8c38-4470-b252-d2b0f7ae78d8\") " pod="calico-system/calico-apiserver-7fc9dd7d97-2bt9t" Mar 7 00:56:08.902172 kubelet[3252]: I0307 00:56:08.900372 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6d62a8f1-aabe-4464-9392-047b694f64ae-calico-apiserver-certs\") pod \"calico-apiserver-7fc9dd7d97-2zgnf\" (UID: \"6d62a8f1-aabe-4464-9392-047b694f64ae\") " pod="calico-system/calico-apiserver-7fc9dd7d97-2zgnf" Mar 7 00:56:08.902172 kubelet[3252]: I0307 00:56:08.900563 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ae6eb1c-c1a0-42a2-9357-aea0ab8fecf0-goldmane-ca-bundle\") pod \"goldmane-9f7667bb8-plrrb\" (UID: \"6ae6eb1c-c1a0-42a2-9357-aea0ab8fecf0\") " pod="calico-system/goldmane-9f7667bb8-plrrb" Mar 7 00:56:08.902172 kubelet[3252]: I0307 00:56:08.900651 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppl5s\" (UniqueName: \"kubernetes.io/projected/6a33fd83-0f26-4bf6-bbf7-f61c96127d50-kube-api-access-ppl5s\") pod \"coredns-7d764666f9-xg55x\" (UID: \"6a33fd83-0f26-4bf6-bbf7-f61c96127d50\") " pod="kube-system/coredns-7d764666f9-xg55x" Mar 7 00:56:08.902172 kubelet[3252]: I0307 00:56:08.900699 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/695f6498-15b6-4241-b99a-44f0028694da-tigera-ca-bundle\") pod 
\"calico-kube-controllers-7dd69967b5-wbk4v\" (UID: \"695f6498-15b6-4241-b99a-44f0028694da\") " pod="calico-system/calico-kube-controllers-7dd69967b5-wbk4v" Mar 7 00:56:08.902172 kubelet[3252]: I0307 00:56:08.900740 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1d414626-8c38-4470-b252-d2b0f7ae78d8-calico-apiserver-certs\") pod \"calico-apiserver-7fc9dd7d97-2bt9t\" (UID: \"1d414626-8c38-4470-b252-d2b0f7ae78d8\") " pod="calico-system/calico-apiserver-7fc9dd7d97-2bt9t" Mar 7 00:56:08.902617 kubelet[3252]: I0307 00:56:08.900782 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w7fm\" (UniqueName: \"kubernetes.io/projected/56938f35-7cb7-4754-962f-9b675f12bee1-kube-api-access-9w7fm\") pod \"whisker-7cb7dc8756-k6flf\" (UID: \"56938f35-7cb7-4754-962f-9b675f12bee1\") " pod="calico-system/whisker-7cb7dc8756-k6flf" Mar 7 00:56:08.902617 kubelet[3252]: I0307 00:56:08.900830 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8snh7\" (UniqueName: \"kubernetes.io/projected/695f6498-15b6-4241-b99a-44f0028694da-kube-api-access-8snh7\") pod \"calico-kube-controllers-7dd69967b5-wbk4v\" (UID: \"695f6498-15b6-4241-b99a-44f0028694da\") " pod="calico-system/calico-kube-controllers-7dd69967b5-wbk4v" Mar 7 00:56:08.902617 kubelet[3252]: I0307 00:56:08.900878 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a33fd83-0f26-4bf6-bbf7-f61c96127d50-config-volume\") pod \"coredns-7d764666f9-xg55x\" (UID: \"6a33fd83-0f26-4bf6-bbf7-f61c96127d50\") " pod="kube-system/coredns-7d764666f9-xg55x" Mar 7 00:56:08.902617 kubelet[3252]: I0307 00:56:08.900917 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/56938f35-7cb7-4754-962f-9b675f12bee1-whisker-backend-key-pair\") pod \"whisker-7cb7dc8756-k6flf\" (UID: \"56938f35-7cb7-4754-962f-9b675f12bee1\") " pod="calico-system/whisker-7cb7dc8756-k6flf" Mar 7 00:56:08.902617 kubelet[3252]: I0307 00:56:08.900954 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ae6eb1c-c1a0-42a2-9357-aea0ab8fecf0-config\") pod \"goldmane-9f7667bb8-plrrb\" (UID: \"6ae6eb1c-c1a0-42a2-9357-aea0ab8fecf0\") " pod="calico-system/goldmane-9f7667bb8-plrrb" Mar 7 00:56:08.902931 kubelet[3252]: I0307 00:56:08.900998 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56938f35-7cb7-4754-962f-9b675f12bee1-whisker-ca-bundle\") pod \"whisker-7cb7dc8756-k6flf\" (UID: \"56938f35-7cb7-4754-962f-9b675f12bee1\") " pod="calico-system/whisker-7cb7dc8756-k6flf" Mar 7 00:56:08.902931 kubelet[3252]: I0307 00:56:08.901054 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjgbz\" (UniqueName: \"kubernetes.io/projected/6ae6eb1c-c1a0-42a2-9357-aea0ab8fecf0-kube-api-access-fjgbz\") pod \"goldmane-9f7667bb8-plrrb\" (UID: \"6ae6eb1c-c1a0-42a2-9357-aea0ab8fecf0\") " pod="calico-system/goldmane-9f7667bb8-plrrb" Mar 7 00:56:08.902931 kubelet[3252]: I0307 00:56:08.901106 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/56938f35-7cb7-4754-962f-9b675f12bee1-nginx-config\") pod \"whisker-7cb7dc8756-k6flf\" (UID: \"56938f35-7cb7-4754-962f-9b675f12bee1\") " pod="calico-system/whisker-7cb7dc8756-k6flf" Mar 7 00:56:08.902931 kubelet[3252]: I0307 00:56:08.901143 3252 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/6ae6eb1c-c1a0-42a2-9357-aea0ab8fecf0-goldmane-key-pair\") pod \"goldmane-9f7667bb8-plrrb\" (UID: \"6ae6eb1c-c1a0-42a2-9357-aea0ab8fecf0\") " pod="calico-system/goldmane-9f7667bb8-plrrb" Mar 7 00:56:08.902931 kubelet[3252]: I0307 00:56:08.901187 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prxp6\" (UniqueName: \"kubernetes.io/projected/6d62a8f1-aabe-4464-9392-047b694f64ae-kube-api-access-prxp6\") pod \"calico-apiserver-7fc9dd7d97-2zgnf\" (UID: \"6d62a8f1-aabe-4464-9392-047b694f64ae\") " pod="calico-system/calico-apiserver-7fc9dd7d97-2zgnf" Mar 7 00:56:08.903238 kubelet[3252]: I0307 00:56:08.901232 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d42faf4e-65f6-4396-aa30-a3b1cd2cc8f9-config-volume\") pod \"coredns-7d764666f9-xw2lb\" (UID: \"d42faf4e-65f6-4396-aa30-a3b1cd2cc8f9\") " pod="kube-system/coredns-7d764666f9-xw2lb" Mar 7 00:56:08.903238 kubelet[3252]: I0307 00:56:08.901275 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6jnb\" (UniqueName: \"kubernetes.io/projected/d42faf4e-65f6-4396-aa30-a3b1cd2cc8f9-kube-api-access-d6jnb\") pod \"coredns-7d764666f9-xw2lb\" (UID: \"d42faf4e-65f6-4396-aa30-a3b1cd2cc8f9\") " pod="kube-system/coredns-7d764666f9-xw2lb" Mar 7 00:56:08.926771 systemd[1]: Created slice kubepods-besteffort-pod6d62a8f1_aabe_4464_9392_047b694f64ae.slice - libcontainer container kubepods-besteffort-pod6d62a8f1_aabe_4464_9392_047b694f64ae.slice. Mar 7 00:56:08.952268 systemd[1]: Created slice kubepods-besteffort-pod6ae6eb1c_c1a0_42a2_9357_aea0ab8fecf0.slice - libcontainer container kubepods-besteffort-pod6ae6eb1c_c1a0_42a2_9357_aea0ab8fecf0.slice. 
Mar 7 00:56:08.967132 systemd[1]: Created slice kubepods-besteffort-pod56938f35_7cb7_4754_962f_9b675f12bee1.slice - libcontainer container kubepods-besteffort-pod56938f35_7cb7_4754_962f_9b675f12bee1.slice. Mar 7 00:56:09.148425 containerd[2022]: time="2026-03-07T00:56:09.148042672Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-xw2lb,Uid:d42faf4e-65f6-4396-aa30-a3b1cd2cc8f9,Namespace:kube-system,Attempt:0,}" Mar 7 00:56:09.171563 containerd[2022]: time="2026-03-07T00:56:09.169330816Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-xg55x,Uid:6a33fd83-0f26-4bf6-bbf7-f61c96127d50,Namespace:kube-system,Attempt:0,}" Mar 7 00:56:09.194910 containerd[2022]: time="2026-03-07T00:56:09.194851600Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7dd69967b5-wbk4v,Uid:695f6498-15b6-4241-b99a-44f0028694da,Namespace:calico-system,Attempt:0,}" Mar 7 00:56:09.227038 containerd[2022]: time="2026-03-07T00:56:09.226882157Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fc9dd7d97-2bt9t,Uid:1d414626-8c38-4470-b252-d2b0f7ae78d8,Namespace:calico-system,Attempt:0,}" Mar 7 00:56:09.248313 containerd[2022]: time="2026-03-07T00:56:09.247731293Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fc9dd7d97-2zgnf,Uid:6d62a8f1-aabe-4464-9392-047b694f64ae,Namespace:calico-system,Attempt:0,}" Mar 7 00:56:09.268617 containerd[2022]: time="2026-03-07T00:56:09.268547597Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-plrrb,Uid:6ae6eb1c-c1a0-42a2-9357-aea0ab8fecf0,Namespace:calico-system,Attempt:0,}" Mar 7 00:56:09.285063 containerd[2022]: time="2026-03-07T00:56:09.284786561Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7cb7dc8756-k6flf,Uid:56938f35-7cb7-4754-962f-9b675f12bee1,Namespace:calico-system,Attempt:0,}" Mar 7 00:56:09.647662 containerd[2022]: time="2026-03-07T00:56:09.647569279Z" 
level=error msg="Failed to destroy network for sandbox \"138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.658042 containerd[2022]: time="2026-03-07T00:56:09.657917695Z" level=error msg="encountered an error cleaning up failed sandbox \"138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.659470 containerd[2022]: time="2026-03-07T00:56:09.658053643Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-xg55x,Uid:6a33fd83-0f26-4bf6-bbf7-f61c96127d50,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.667058 containerd[2022]: time="2026-03-07T00:56:09.665902423Z" level=info msg="CreateContainer within sandbox \"8a5fb3f59cf248c7f04f4cf75b8542190b7df07b0a50b2ac8991370be569f490\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 7 00:56:09.670603 kubelet[3252]: E0307 00:56:09.670426 3252 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 
00:56:09.672423 kubelet[3252]: E0307 00:56:09.670571 3252 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-xg55x" Mar 7 00:56:09.672423 kubelet[3252]: E0307 00:56:09.671346 3252 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-xg55x" Mar 7 00:56:09.672423 kubelet[3252]: E0307 00:56:09.671528 3252 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-xg55x_kube-system(6a33fd83-0f26-4bf6-bbf7-f61c96127d50)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-xg55x_kube-system(6a33fd83-0f26-4bf6-bbf7-f61c96127d50)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-xg55x" podUID="6a33fd83-0f26-4bf6-bbf7-f61c96127d50" Mar 7 00:56:09.733196 containerd[2022]: time="2026-03-07T00:56:09.733084243Z" level=error msg="Failed to destroy network for sandbox \"c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0\"" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.747887 containerd[2022]: time="2026-03-07T00:56:09.746217079Z" level=error msg="encountered an error cleaning up failed sandbox \"c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.747887 containerd[2022]: time="2026-03-07T00:56:09.746352247Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-xw2lb,Uid:d42faf4e-65f6-4396-aa30-a3b1cd2cc8f9,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.749716 kubelet[3252]: E0307 00:56:09.749646 3252 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.749903 kubelet[3252]: E0307 00:56:09.749733 3252 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-xw2lb" Mar 7 00:56:09.749903 kubelet[3252]: E0307 00:56:09.749772 3252 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-xw2lb" Mar 7 00:56:09.749903 kubelet[3252]: E0307 00:56:09.749866 3252 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-xw2lb_kube-system(d42faf4e-65f6-4396-aa30-a3b1cd2cc8f9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-xw2lb_kube-system(d42faf4e-65f6-4396-aa30-a3b1cd2cc8f9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-xw2lb" podUID="d42faf4e-65f6-4396-aa30-a3b1cd2cc8f9" Mar 7 00:56:09.792649 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0-shm.mount: Deactivated successfully. 
Mar 7 00:56:09.805839 containerd[2022]: time="2026-03-07T00:56:09.804809108Z" level=info msg="CreateContainer within sandbox \"8a5fb3f59cf248c7f04f4cf75b8542190b7df07b0a50b2ac8991370be569f490\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"c61997a4edac78d327dab91fdd33ca127a2bdac0cb9d0fe8dea48bfba9c9302a\"" Mar 7 00:56:09.807715 containerd[2022]: time="2026-03-07T00:56:09.807312680Z" level=info msg="StartContainer for \"c61997a4edac78d327dab91fdd33ca127a2bdac0cb9d0fe8dea48bfba9c9302a\"" Mar 7 00:56:09.838909 containerd[2022]: time="2026-03-07T00:56:09.838677488Z" level=error msg="Failed to destroy network for sandbox \"2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.842599 containerd[2022]: time="2026-03-07T00:56:09.842045156Z" level=error msg="encountered an error cleaning up failed sandbox \"2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.843639 containerd[2022]: time="2026-03-07T00:56:09.842831672Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7dd69967b5-wbk4v,Uid:695f6498-15b6-4241-b99a-44f0028694da,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.844024 containerd[2022]: time="2026-03-07T00:56:09.843155948Z" 
level=error msg="Failed to destroy network for sandbox \"092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.845469 kubelet[3252]: E0307 00:56:09.844907 3252 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.845469 kubelet[3252]: E0307 00:56:09.844986 3252 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7dd69967b5-wbk4v" Mar 7 00:56:09.845469 kubelet[3252]: E0307 00:56:09.845018 3252 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7dd69967b5-wbk4v" Mar 7 00:56:09.845751 kubelet[3252]: E0307 00:56:09.845104 3252 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"calico-kube-controllers-7dd69967b5-wbk4v_calico-system(695f6498-15b6-4241-b99a-44f0028694da)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7dd69967b5-wbk4v_calico-system(695f6498-15b6-4241-b99a-44f0028694da)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7dd69967b5-wbk4v" podUID="695f6498-15b6-4241-b99a-44f0028694da" Mar 7 00:56:09.846448 containerd[2022]: time="2026-03-07T00:56:09.845340452Z" level=error msg="encountered an error cleaning up failed sandbox \"092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.846448 containerd[2022]: time="2026-03-07T00:56:09.845961548Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fc9dd7d97-2zgnf,Uid:6d62a8f1-aabe-4464-9392-047b694f64ae,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.850925 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c-shm.mount: Deactivated successfully. 
Mar 7 00:56:09.852112 kubelet[3252]: E0307 00:56:09.851235 3252 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.852112 kubelet[3252]: E0307 00:56:09.851310 3252 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7fc9dd7d97-2zgnf" Mar 7 00:56:09.852112 kubelet[3252]: E0307 00:56:09.851341 3252 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7fc9dd7d97-2zgnf" Mar 7 00:56:09.854196 kubelet[3252]: E0307 00:56:09.851707 3252 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7fc9dd7d97-2zgnf_calico-system(6d62a8f1-aabe-4464-9392-047b694f64ae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7fc9dd7d97-2zgnf_calico-system(6d62a8f1-aabe-4464-9392-047b694f64ae)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e\\\": plugin 
type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-7fc9dd7d97-2zgnf" podUID="6d62a8f1-aabe-4464-9392-047b694f64ae" Mar 7 00:56:09.867045 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e-shm.mount: Deactivated successfully. Mar 7 00:56:09.873232 containerd[2022]: time="2026-03-07T00:56:09.872793464Z" level=error msg="Failed to destroy network for sandbox \"62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.880877 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074-shm.mount: Deactivated successfully. 
Mar 7 00:56:09.883540 containerd[2022]: time="2026-03-07T00:56:09.883061048Z" level=error msg="encountered an error cleaning up failed sandbox \"62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.883540 containerd[2022]: time="2026-03-07T00:56:09.883160900Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fc9dd7d97-2bt9t,Uid:1d414626-8c38-4470-b252-d2b0f7ae78d8,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.884260 kubelet[3252]: E0307 00:56:09.883964 3252 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.884260 kubelet[3252]: E0307 00:56:09.884074 3252 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7fc9dd7d97-2bt9t" Mar 7 00:56:09.885310 kubelet[3252]: E0307 00:56:09.884532 
3252 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7fc9dd7d97-2bt9t" Mar 7 00:56:09.886663 kubelet[3252]: E0307 00:56:09.885544 3252 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7fc9dd7d97-2bt9t_calico-system(1d414626-8c38-4470-b252-d2b0f7ae78d8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7fc9dd7d97-2bt9t_calico-system(1d414626-8c38-4470-b252-d2b0f7ae78d8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-7fc9dd7d97-2bt9t" podUID="1d414626-8c38-4470-b252-d2b0f7ae78d8" Mar 7 00:56:09.921443 containerd[2022]: time="2026-03-07T00:56:09.920893052Z" level=error msg="Failed to destroy network for sandbox \"7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.924306 containerd[2022]: time="2026-03-07T00:56:09.924227312Z" level=error msg="encountered an error cleaning up failed sandbox \"7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.924969 containerd[2022]: time="2026-03-07T00:56:09.924898376Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-plrrb,Uid:6ae6eb1c-c1a0-42a2-9357-aea0ab8fecf0,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.925582 kubelet[3252]: E0307 00:56:09.925260 3252 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.925582 kubelet[3252]: E0307 00:56:09.925336 3252 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-plrrb" Mar 7 00:56:09.928355 kubelet[3252]: E0307 00:56:09.925369 3252 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-plrrb" Mar 7 00:56:09.928355 kubelet[3252]: E0307 00:56:09.925882 3252 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-9f7667bb8-plrrb_calico-system(6ae6eb1c-c1a0-42a2-9357-aea0ab8fecf0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-9f7667bb8-plrrb_calico-system(6ae6eb1c-c1a0-42a2-9357-aea0ab8fecf0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-9f7667bb8-plrrb" podUID="6ae6eb1c-c1a0-42a2-9357-aea0ab8fecf0" Mar 7 00:56:09.937269 containerd[2022]: time="2026-03-07T00:56:09.937177652Z" level=error msg="Failed to destroy network for sandbox \"522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.938036 containerd[2022]: time="2026-03-07T00:56:09.937959116Z" level=error msg="encountered an error cleaning up failed sandbox \"522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.938153 containerd[2022]: time="2026-03-07T00:56:09.938055008Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7cb7dc8756-k6flf,Uid:56938f35-7cb7-4754-962f-9b675f12bee1,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network 
for sandbox \"522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.939206 kubelet[3252]: E0307 00:56:09.938482 3252 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:09.939206 kubelet[3252]: E0307 00:56:09.938556 3252 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7cb7dc8756-k6flf" Mar 7 00:56:09.939206 kubelet[3252]: E0307 00:56:09.938598 3252 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7cb7dc8756-k6flf" Mar 7 00:56:09.940311 kubelet[3252]: E0307 00:56:09.938684 3252 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7cb7dc8756-k6flf_calico-system(56938f35-7cb7-4754-962f-9b675f12bee1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"whisker-7cb7dc8756-k6flf_calico-system(56938f35-7cb7-4754-962f-9b675f12bee1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7cb7dc8756-k6flf" podUID="56938f35-7cb7-4754-962f-9b675f12bee1" Mar 7 00:56:09.958745 systemd[1]: Started cri-containerd-c61997a4edac78d327dab91fdd33ca127a2bdac0cb9d0fe8dea48bfba9c9302a.scope - libcontainer container c61997a4edac78d327dab91fdd33ca127a2bdac0cb9d0fe8dea48bfba9c9302a. Mar 7 00:56:10.028899 containerd[2022]: time="2026-03-07T00:56:10.028217369Z" level=info msg="StartContainer for \"c61997a4edac78d327dab91fdd33ca127a2bdac0cb9d0fe8dea48bfba9c9302a\" returns successfully" Mar 7 00:56:10.364501 systemd[1]: Created slice kubepods-besteffort-podc78279c1_73d2_4b1a_bfd7_356a67561cbc.slice - libcontainer container kubepods-besteffort-podc78279c1_73d2_4b1a_bfd7_356a67561cbc.slice. 
Mar 7 00:56:10.376251 containerd[2022]: time="2026-03-07T00:56:10.376176390Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tgsjl,Uid:c78279c1-73d2-4b1a-bfd7-356a67561cbc,Namespace:calico-system,Attempt:0,}" Mar 7 00:56:10.606555 kubelet[3252]: I0307 00:56:10.605374 3252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Mar 7 00:56:10.607114 containerd[2022]: time="2026-03-07T00:56:10.607069723Z" level=info msg="StopPodSandbox for \"522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341\"" Mar 7 00:56:10.607668 containerd[2022]: time="2026-03-07T00:56:10.607614115Z" level=info msg="Ensure that sandbox 522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341 in task-service has been cleanup successfully" Mar 7 00:56:10.613677 kubelet[3252]: I0307 00:56:10.613609 3252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Mar 7 00:56:10.617222 containerd[2022]: time="2026-03-07T00:56:10.616192448Z" level=info msg="StopPodSandbox for \"7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb\"" Mar 7 00:56:10.619942 containerd[2022]: time="2026-03-07T00:56:10.619830548Z" level=info msg="Ensure that sandbox 7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb in task-service has been cleanup successfully" Mar 7 00:56:10.625564 kubelet[3252]: I0307 00:56:10.625507 3252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Mar 7 00:56:10.627993 containerd[2022]: time="2026-03-07T00:56:10.627920816Z" level=info msg="StopPodSandbox for \"138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42\"" Mar 7 00:56:10.629442 containerd[2022]: time="2026-03-07T00:56:10.629362604Z" level=info msg="Ensure that sandbox 
138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42 in task-service has been cleanup successfully" Mar 7 00:56:10.691738 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341-shm.mount: Deactivated successfully. Mar 7 00:56:10.691941 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb-shm.mount: Deactivated successfully. Mar 7 00:56:10.717633 kubelet[3252]: I0307 00:56:10.717199 3252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Mar 7 00:56:10.734236 containerd[2022]: time="2026-03-07T00:56:10.732271280Z" level=info msg="StopPodSandbox for \"092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e\"" Mar 7 00:56:10.734236 containerd[2022]: time="2026-03-07T00:56:10.733003892Z" level=info msg="Ensure that sandbox 092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e in task-service has been cleanup successfully" Mar 7 00:56:10.786367 kubelet[3252]: I0307 00:56:10.786284 3252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Mar 7 00:56:10.794528 containerd[2022]: time="2026-03-07T00:56:10.794459996Z" level=info msg="StopPodSandbox for \"62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074\"" Mar 7 00:56:10.796562 containerd[2022]: time="2026-03-07T00:56:10.796364480Z" level=info msg="Ensure that sandbox 62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074 in task-service has been cleanup successfully" Mar 7 00:56:10.813684 kubelet[3252]: I0307 00:56:10.813617 3252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Mar 7 00:56:10.824768 containerd[2022]: 
time="2026-03-07T00:56:10.823311033Z" level=info msg="StopPodSandbox for \"2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c\"" Mar 7 00:56:10.824768 containerd[2022]: time="2026-03-07T00:56:10.823761849Z" level=info msg="Ensure that sandbox 2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c in task-service has been cleanup successfully" Mar 7 00:56:10.892003 kubelet[3252]: I0307 00:56:10.891164 3252 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Mar 7 00:56:10.895952 containerd[2022]: time="2026-03-07T00:56:10.895874481Z" level=info msg="StopPodSandbox for \"c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0\"" Mar 7 00:56:10.896326 containerd[2022]: time="2026-03-07T00:56:10.896272821Z" level=info msg="Ensure that sandbox c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0 in task-service has been cleanup successfully" Mar 7 00:56:10.940300 systemd[1]: run-containerd-runc-k8s.io-c61997a4edac78d327dab91fdd33ca127a2bdac0cb9d0fe8dea48bfba9c9302a-runc.Jh6ykS.mount: Deactivated successfully. Mar 7 00:56:11.271118 kubelet[3252]: I0307 00:56:11.269958 3252 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-node-kv87x" podStartSLOduration=2.744578115 podStartE2EDuration="21.269928007s" podCreationTimestamp="2026-03-07 00:55:50 +0000 UTC" firstStartedPulling="2026-03-07 00:55:51.096942395 +0000 UTC m=+29.090157038" lastFinishedPulling="2026-03-07 00:56:09.622292263 +0000 UTC m=+47.615506930" observedRunningTime="2026-03-07 00:56:10.774725816 +0000 UTC m=+48.767940747" watchObservedRunningTime="2026-03-07 00:56:11.269928007 +0000 UTC m=+49.263142650" Mar 7 00:56:11.289733 systemd-networkd[1942]: cali929677d6584: Link UP Mar 7 00:56:11.297960 (udev-worker)[4801]: Network interface NamePolicy= disabled on kernel command line. 
Mar 7 00:56:11.304782 systemd-networkd[1942]: cali929677d6584: Gained carrier Mar 7 00:56:11.459257 containerd[2022]: 2026-03-07 00:56:10.458 [ERROR][4645] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:56:11.459257 containerd[2022]: 2026-03-07 00:56:10.505 [INFO][4645] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--23--166-k8s-csi--node--driver--tgsjl-eth0 csi-node-driver- calico-system c78279c1-73d2-4b1a-bfd7-356a67561cbc 725 0 2026-03-07 00:55:50 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:589b8b8d94 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ip-172-31-23-166 csi-node-driver-tgsjl eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali929677d6584 [] [] }} ContainerID="8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a" Namespace="calico-system" Pod="csi-node-driver-tgsjl" WorkloadEndpoint="ip--172--31--23--166-k8s-csi--node--driver--tgsjl-" Mar 7 00:56:11.459257 containerd[2022]: 2026-03-07 00:56:10.506 [INFO][4645] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a" Namespace="calico-system" Pod="csi-node-driver-tgsjl" WorkloadEndpoint="ip--172--31--23--166-k8s-csi--node--driver--tgsjl-eth0" Mar 7 00:56:11.459257 containerd[2022]: 2026-03-07 00:56:10.852 [INFO][4662] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a" HandleID="k8s-pod-network.8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a" 
Workload="ip--172--31--23--166-k8s-csi--node--driver--tgsjl-eth0" Mar 7 00:56:11.459257 containerd[2022]: 2026-03-07 00:56:10.971 [INFO][4662] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a" HandleID="k8s-pod-network.8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a" Workload="ip--172--31--23--166-k8s-csi--node--driver--tgsjl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000452aa0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-23-166", "pod":"csi-node-driver-tgsjl", "timestamp":"2026-03-07 00:56:10.852253509 +0000 UTC"}, Hostname:"ip-172-31-23-166", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40003f02c0)} Mar 7 00:56:11.459257 containerd[2022]: 2026-03-07 00:56:10.971 [INFO][4662] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:11.459257 containerd[2022]: 2026-03-07 00:56:10.971 [INFO][4662] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:56:11.459257 containerd[2022]: 2026-03-07 00:56:10.971 [INFO][4662] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-23-166' Mar 7 00:56:11.459257 containerd[2022]: 2026-03-07 00:56:10.989 [INFO][4662] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a" host="ip-172-31-23-166" Mar 7 00:56:11.459257 containerd[2022]: 2026-03-07 00:56:11.021 [INFO][4662] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-23-166" Mar 7 00:56:11.459257 containerd[2022]: 2026-03-07 00:56:11.056 [INFO][4662] ipam/ipam.go 526: Trying affinity for 192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:11.459257 containerd[2022]: 2026-03-07 00:56:11.064 [INFO][4662] ipam/ipam.go 160: Attempting to load block cidr=192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:11.459257 containerd[2022]: 2026-03-07 00:56:11.081 [INFO][4662] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:11.459257 containerd[2022]: 2026-03-07 00:56:11.083 [INFO][4662] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.93.0/26 handle="k8s-pod-network.8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a" host="ip-172-31-23-166" Mar 7 00:56:11.459257 containerd[2022]: 2026-03-07 00:56:11.104 [INFO][4662] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a Mar 7 00:56:11.459257 containerd[2022]: 2026-03-07 00:56:11.142 [INFO][4662] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.93.0/26 handle="k8s-pod-network.8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a" host="ip-172-31-23-166" Mar 7 00:56:11.459257 containerd[2022]: 2026-03-07 00:56:11.180 [INFO][4662] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.93.1/26] block=192.168.93.0/26 
handle="k8s-pod-network.8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a" host="ip-172-31-23-166" Mar 7 00:56:11.459257 containerd[2022]: 2026-03-07 00:56:11.184 [INFO][4662] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.93.1/26] handle="k8s-pod-network.8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a" host="ip-172-31-23-166" Mar 7 00:56:11.459257 containerd[2022]: 2026-03-07 00:56:11.184 [INFO][4662] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:11.459257 containerd[2022]: 2026-03-07 00:56:11.184 [INFO][4662] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.93.1/26] IPv6=[] ContainerID="8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a" HandleID="k8s-pod-network.8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a" Workload="ip--172--31--23--166-k8s-csi--node--driver--tgsjl-eth0" Mar 7 00:56:11.465141 containerd[2022]: 2026-03-07 00:56:11.235 [INFO][4645] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a" Namespace="calico-system" Pod="csi-node-driver-tgsjl" WorkloadEndpoint="ip--172--31--23--166-k8s-csi--node--driver--tgsjl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-csi--node--driver--tgsjl-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"c78279c1-73d2-4b1a-bfd7-356a67561cbc", ResourceVersion:"725", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"", Pod:"csi-node-driver-tgsjl", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.93.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali929677d6584", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:11.465141 containerd[2022]: 2026-03-07 00:56:11.235 [INFO][4645] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.93.1/32] ContainerID="8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a" Namespace="calico-system" Pod="csi-node-driver-tgsjl" WorkloadEndpoint="ip--172--31--23--166-k8s-csi--node--driver--tgsjl-eth0" Mar 7 00:56:11.465141 containerd[2022]: 2026-03-07 00:56:11.235 [INFO][4645] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali929677d6584 ContainerID="8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a" Namespace="calico-system" Pod="csi-node-driver-tgsjl" WorkloadEndpoint="ip--172--31--23--166-k8s-csi--node--driver--tgsjl-eth0" Mar 7 00:56:11.465141 containerd[2022]: 2026-03-07 00:56:11.317 [INFO][4645] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a" Namespace="calico-system" Pod="csi-node-driver-tgsjl" WorkloadEndpoint="ip--172--31--23--166-k8s-csi--node--driver--tgsjl-eth0" Mar 7 00:56:11.465141 containerd[2022]: 2026-03-07 00:56:11.327 [INFO][4645] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a" Namespace="calico-system" Pod="csi-node-driver-tgsjl" WorkloadEndpoint="ip--172--31--23--166-k8s-csi--node--driver--tgsjl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-csi--node--driver--tgsjl-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"c78279c1-73d2-4b1a-bfd7-356a67561cbc", ResourceVersion:"725", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a", Pod:"csi-node-driver-tgsjl", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.93.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali929677d6584", MAC:"fa:1b:d0:36:de:39", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:11.465141 containerd[2022]: 2026-03-07 00:56:11.437 [INFO][4645] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a" 
Namespace="calico-system" Pod="csi-node-driver-tgsjl" WorkloadEndpoint="ip--172--31--23--166-k8s-csi--node--driver--tgsjl-eth0" Mar 7 00:56:11.788315 containerd[2022]: 2026-03-07 00:56:11.277 [INFO][4689] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Mar 7 00:56:11.788315 containerd[2022]: 2026-03-07 00:56:11.286 [INFO][4689] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" iface="eth0" netns="/var/run/netns/cni-d3c3fdac-a41a-df8a-1424-1c148bcf77b0" Mar 7 00:56:11.788315 containerd[2022]: 2026-03-07 00:56:11.291 [INFO][4689] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" iface="eth0" netns="/var/run/netns/cni-d3c3fdac-a41a-df8a-1424-1c148bcf77b0" Mar 7 00:56:11.788315 containerd[2022]: 2026-03-07 00:56:11.298 [INFO][4689] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" iface="eth0" netns="/var/run/netns/cni-d3c3fdac-a41a-df8a-1424-1c148bcf77b0" Mar 7 00:56:11.788315 containerd[2022]: 2026-03-07 00:56:11.298 [INFO][4689] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Mar 7 00:56:11.788315 containerd[2022]: 2026-03-07 00:56:11.298 [INFO][4689] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Mar 7 00:56:11.788315 containerd[2022]: 2026-03-07 00:56:11.656 [INFO][4803] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" HandleID="k8s-pod-network.7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Workload="ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0" Mar 7 00:56:11.788315 containerd[2022]: 2026-03-07 00:56:11.667 [INFO][4803] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:11.788315 containerd[2022]: 2026-03-07 00:56:11.667 [INFO][4803] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:11.788315 containerd[2022]: 2026-03-07 00:56:11.712 [WARNING][4803] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" HandleID="k8s-pod-network.7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Workload="ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0" Mar 7 00:56:11.788315 containerd[2022]: 2026-03-07 00:56:11.713 [INFO][4803] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" HandleID="k8s-pod-network.7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Workload="ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0" Mar 7 00:56:11.788315 containerd[2022]: 2026-03-07 00:56:11.736 [INFO][4803] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:11.788315 containerd[2022]: 2026-03-07 00:56:11.761 [INFO][4689] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Mar 7 00:56:11.798604 systemd[1]: run-netns-cni\x2dd3c3fdac\x2da41a\x2ddf8a\x2d1424\x2d1c148bcf77b0.mount: Deactivated successfully. 
Mar 7 00:56:11.802406 containerd[2022]: time="2026-03-07T00:56:11.802113369Z" level=info msg="TearDown network for sandbox \"7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb\" successfully" Mar 7 00:56:11.802406 containerd[2022]: time="2026-03-07T00:56:11.802165713Z" level=info msg="StopPodSandbox for \"7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb\" returns successfully" Mar 7 00:56:11.828437 containerd[2022]: time="2026-03-07T00:56:11.826875730Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-plrrb,Uid:6ae6eb1c-c1a0-42a2-9357-aea0ab8fecf0,Namespace:calico-system,Attempt:1,}" Mar 7 00:56:11.851571 containerd[2022]: 2026-03-07 00:56:11.559 [INFO][4730] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Mar 7 00:56:11.851571 containerd[2022]: 2026-03-07 00:56:11.559 [INFO][4730] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" iface="eth0" netns="/var/run/netns/cni-7c8cbe50-e040-76c0-7452-ee44fd17c1c9" Mar 7 00:56:11.851571 containerd[2022]: 2026-03-07 00:56:11.562 [INFO][4730] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" iface="eth0" netns="/var/run/netns/cni-7c8cbe50-e040-76c0-7452-ee44fd17c1c9" Mar 7 00:56:11.851571 containerd[2022]: 2026-03-07 00:56:11.569 [INFO][4730] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" iface="eth0" netns="/var/run/netns/cni-7c8cbe50-e040-76c0-7452-ee44fd17c1c9" Mar 7 00:56:11.851571 containerd[2022]: 2026-03-07 00:56:11.569 [INFO][4730] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Mar 7 00:56:11.851571 containerd[2022]: 2026-03-07 00:56:11.569 [INFO][4730] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Mar 7 00:56:11.851571 containerd[2022]: 2026-03-07 00:56:11.743 [INFO][4849] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" HandleID="k8s-pod-network.092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0" Mar 7 00:56:11.851571 containerd[2022]: 2026-03-07 00:56:11.755 [INFO][4849] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:11.851571 containerd[2022]: 2026-03-07 00:56:11.755 [INFO][4849] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:11.851571 containerd[2022]: 2026-03-07 00:56:11.809 [WARNING][4849] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" HandleID="k8s-pod-network.092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0" Mar 7 00:56:11.851571 containerd[2022]: 2026-03-07 00:56:11.809 [INFO][4849] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" HandleID="k8s-pod-network.092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0" Mar 7 00:56:11.851571 containerd[2022]: 2026-03-07 00:56:11.815 [INFO][4849] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:11.851571 containerd[2022]: 2026-03-07 00:56:11.841 [INFO][4730] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Mar 7 00:56:11.860222 containerd[2022]: time="2026-03-07T00:56:11.859495474Z" level=info msg="TearDown network for sandbox \"092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e\" successfully" Mar 7 00:56:11.860222 containerd[2022]: time="2026-03-07T00:56:11.859567306Z" level=info msg="StopPodSandbox for \"092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e\" returns successfully" Mar 7 00:56:11.861843 systemd[1]: run-netns-cni\x2d7c8cbe50\x2de040\x2d76c0\x2d7452\x2dee44fd17c1c9.mount: Deactivated successfully. Mar 7 00:56:11.871573 containerd[2022]: time="2026-03-07T00:56:11.870572446Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fc9dd7d97-2zgnf,Uid:6d62a8f1-aabe-4464-9392-047b694f64ae,Namespace:calico-system,Attempt:1,}" Mar 7 00:56:11.873249 containerd[2022]: time="2026-03-07T00:56:11.873087610Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:56:11.873249 containerd[2022]: time="2026-03-07T00:56:11.873205870Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:56:11.875460 containerd[2022]: time="2026-03-07T00:56:11.875199718Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:11.880829 containerd[2022]: time="2026-03-07T00:56:11.880376482Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:11.896897 containerd[2022]: 2026-03-07 00:56:11.292 [INFO][4690] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Mar 7 00:56:11.896897 containerd[2022]: 2026-03-07 00:56:11.295 [INFO][4690] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" iface="eth0" netns="/var/run/netns/cni-9ccbf9c2-5e62-89de-1393-8d9a4d5abd5c" Mar 7 00:56:11.896897 containerd[2022]: 2026-03-07 00:56:11.303 [INFO][4690] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" iface="eth0" netns="/var/run/netns/cni-9ccbf9c2-5e62-89de-1393-8d9a4d5abd5c" Mar 7 00:56:11.896897 containerd[2022]: 2026-03-07 00:56:11.307 [INFO][4690] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" iface="eth0" netns="/var/run/netns/cni-9ccbf9c2-5e62-89de-1393-8d9a4d5abd5c" Mar 7 00:56:11.896897 containerd[2022]: 2026-03-07 00:56:11.307 [INFO][4690] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Mar 7 00:56:11.896897 containerd[2022]: 2026-03-07 00:56:11.308 [INFO][4690] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Mar 7 00:56:11.896897 containerd[2022]: 2026-03-07 00:56:11.807 [INFO][4806] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" HandleID="k8s-pod-network.522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Workload="ip--172--31--23--166-k8s-whisker--7cb7dc8756--k6flf-eth0" Mar 7 00:56:11.896897 containerd[2022]: 2026-03-07 00:56:11.807 [INFO][4806] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:11.896897 containerd[2022]: 2026-03-07 00:56:11.815 [INFO][4806] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:11.896897 containerd[2022]: 2026-03-07 00:56:11.866 [WARNING][4806] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" HandleID="k8s-pod-network.522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Workload="ip--172--31--23--166-k8s-whisker--7cb7dc8756--k6flf-eth0" Mar 7 00:56:11.896897 containerd[2022]: 2026-03-07 00:56:11.866 [INFO][4806] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" HandleID="k8s-pod-network.522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Workload="ip--172--31--23--166-k8s-whisker--7cb7dc8756--k6flf-eth0" Mar 7 00:56:11.896897 containerd[2022]: 2026-03-07 00:56:11.875 [INFO][4806] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:11.896897 containerd[2022]: 2026-03-07 00:56:11.886 [INFO][4690] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Mar 7 00:56:11.901680 containerd[2022]: time="2026-03-07T00:56:11.901430026Z" level=info msg="TearDown network for sandbox \"522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341\" successfully" Mar 7 00:56:11.901680 containerd[2022]: time="2026-03-07T00:56:11.901525894Z" level=info msg="StopPodSandbox for \"522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341\" returns successfully" Mar 7 00:56:11.907121 systemd[1]: run-netns-cni\x2d9ccbf9c2\x2d5e62\x2d89de\x2d1393\x2d8d9a4d5abd5c.mount: Deactivated successfully. Mar 7 00:56:11.986831 containerd[2022]: 2026-03-07 00:56:11.467 [INFO][4750] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Mar 7 00:56:11.986831 containerd[2022]: 2026-03-07 00:56:11.467 [INFO][4750] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" iface="eth0" netns="/var/run/netns/cni-e8160f16-ca72-b94f-7e4e-435c809baec9" Mar 7 00:56:11.986831 containerd[2022]: 2026-03-07 00:56:11.470 [INFO][4750] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" iface="eth0" netns="/var/run/netns/cni-e8160f16-ca72-b94f-7e4e-435c809baec9" Mar 7 00:56:11.986831 containerd[2022]: 2026-03-07 00:56:11.481 [INFO][4750] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" iface="eth0" netns="/var/run/netns/cni-e8160f16-ca72-b94f-7e4e-435c809baec9" Mar 7 00:56:11.986831 containerd[2022]: 2026-03-07 00:56:11.485 [INFO][4750] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Mar 7 00:56:11.986831 containerd[2022]: 2026-03-07 00:56:11.486 [INFO][4750] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Mar 7 00:56:11.986831 containerd[2022]: 2026-03-07 00:56:11.816 [INFO][4829] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" HandleID="k8s-pod-network.62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0" Mar 7 00:56:11.986831 containerd[2022]: 2026-03-07 00:56:11.819 [INFO][4829] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:11.986831 containerd[2022]: 2026-03-07 00:56:11.876 [INFO][4829] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:11.986831 containerd[2022]: 2026-03-07 00:56:11.933 [WARNING][4829] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" HandleID="k8s-pod-network.62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0" Mar 7 00:56:11.986831 containerd[2022]: 2026-03-07 00:56:11.933 [INFO][4829] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" HandleID="k8s-pod-network.62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0" Mar 7 00:56:11.986831 containerd[2022]: 2026-03-07 00:56:11.951 [INFO][4829] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:11.986831 containerd[2022]: 2026-03-07 00:56:11.975 [INFO][4750] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Mar 7 00:56:11.988655 containerd[2022]: time="2026-03-07T00:56:11.988606258Z" level=info msg="TearDown network for sandbox \"62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074\" successfully" Mar 7 00:56:11.988792 containerd[2022]: time="2026-03-07T00:56:11.988763146Z" level=info msg="StopPodSandbox for \"62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074\" returns successfully" Mar 7 00:56:11.996551 containerd[2022]: time="2026-03-07T00:56:11.996484762Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fc9dd7d97-2bt9t,Uid:1d414626-8c38-4470-b252-d2b0f7ae78d8,Namespace:calico-system,Attempt:1,}" Mar 7 00:56:12.039745 containerd[2022]: 2026-03-07 00:56:11.475 [INFO][4756] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Mar 7 00:56:12.039745 containerd[2022]: 2026-03-07 00:56:11.480 [INFO][4756] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" iface="eth0" netns="/var/run/netns/cni-f4eda54c-d241-2b85-c895-2d84a7fb49db" Mar 7 00:56:12.039745 containerd[2022]: 2026-03-07 00:56:11.481 [INFO][4756] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" iface="eth0" netns="/var/run/netns/cni-f4eda54c-d241-2b85-c895-2d84a7fb49db" Mar 7 00:56:12.039745 containerd[2022]: 2026-03-07 00:56:11.483 [INFO][4756] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" iface="eth0" netns="/var/run/netns/cni-f4eda54c-d241-2b85-c895-2d84a7fb49db" Mar 7 00:56:12.039745 containerd[2022]: 2026-03-07 00:56:11.488 [INFO][4756] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Mar 7 00:56:12.039745 containerd[2022]: 2026-03-07 00:56:11.488 [INFO][4756] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Mar 7 00:56:12.039745 containerd[2022]: 2026-03-07 00:56:11.861 [INFO][4830] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" HandleID="k8s-pod-network.2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Workload="ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0" Mar 7 00:56:12.039745 containerd[2022]: 2026-03-07 00:56:11.862 [INFO][4830] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:12.039745 containerd[2022]: 2026-03-07 00:56:11.952 [INFO][4830] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:56:12.039745 containerd[2022]: 2026-03-07 00:56:11.995 [WARNING][4830] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" HandleID="k8s-pod-network.2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Workload="ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0" Mar 7 00:56:12.039745 containerd[2022]: 2026-03-07 00:56:11.996 [INFO][4830] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" HandleID="k8s-pod-network.2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Workload="ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0" Mar 7 00:56:12.039745 containerd[2022]: 2026-03-07 00:56:12.008 [INFO][4830] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:12.039745 containerd[2022]: 2026-03-07 00:56:12.024 [INFO][4756] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Mar 7 00:56:12.052846 systemd[1]: Started cri-containerd-8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a.scope - libcontainer container 8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a. 
Mar 7 00:56:12.058507 containerd[2022]: time="2026-03-07T00:56:12.050229487Z" level=info msg="TearDown network for sandbox \"2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c\" successfully" Mar 7 00:56:12.058507 containerd[2022]: time="2026-03-07T00:56:12.057817231Z" level=info msg="StopPodSandbox for \"2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c\" returns successfully" Mar 7 00:56:12.066692 containerd[2022]: time="2026-03-07T00:56:12.066363367Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7dd69967b5-wbk4v,Uid:695f6498-15b6-4241-b99a-44f0028694da,Namespace:calico-system,Attempt:1,}" Mar 7 00:56:12.079616 kubelet[3252]: I0307 00:56:12.078631 3252 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/secret/56938f35-7cb7-4754-962f-9b675f12bee1-whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/56938f35-7cb7-4754-962f-9b675f12bee1-whisker-backend-key-pair\") pod \"56938f35-7cb7-4754-962f-9b675f12bee1\" (UID: \"56938f35-7cb7-4754-962f-9b675f12bee1\") " Mar 7 00:56:12.079616 kubelet[3252]: I0307 00:56:12.078909 3252 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/56938f35-7cb7-4754-962f-9b675f12bee1-whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56938f35-7cb7-4754-962f-9b675f12bee1-whisker-ca-bundle\") pod \"56938f35-7cb7-4754-962f-9b675f12bee1\" (UID: \"56938f35-7cb7-4754-962f-9b675f12bee1\") " Mar 7 00:56:12.079616 kubelet[3252]: I0307 00:56:12.079110 3252 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/projected/56938f35-7cb7-4754-962f-9b675f12bee1-kube-api-access-9w7fm\" (UniqueName: \"kubernetes.io/projected/56938f35-7cb7-4754-962f-9b675f12bee1-kube-api-access-9w7fm\") pod \"56938f35-7cb7-4754-962f-9b675f12bee1\" (UID: \"56938f35-7cb7-4754-962f-9b675f12bee1\") " Mar 7 00:56:12.079616 kubelet[3252]: I0307 00:56:12.079167 3252 
reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/56938f35-7cb7-4754-962f-9b675f12bee1-nginx-config\" (UniqueName: \"kubernetes.io/configmap/56938f35-7cb7-4754-962f-9b675f12bee1-nginx-config\") pod \"56938f35-7cb7-4754-962f-9b675f12bee1\" (UID: \"56938f35-7cb7-4754-962f-9b675f12bee1\") " Mar 7 00:56:12.091494 kubelet[3252]: I0307 00:56:12.091040 3252 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56938f35-7cb7-4754-962f-9b675f12bee1-nginx-config" pod "56938f35-7cb7-4754-962f-9b675f12bee1" (UID: "56938f35-7cb7-4754-962f-9b675f12bee1"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 7 00:56:12.093989 kubelet[3252]: I0307 00:56:12.092598 3252 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56938f35-7cb7-4754-962f-9b675f12bee1-whisker-ca-bundle" pod "56938f35-7cb7-4754-962f-9b675f12bee1" (UID: "56938f35-7cb7-4754-962f-9b675f12bee1"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 7 00:56:12.098572 kubelet[3252]: I0307 00:56:12.098195 3252 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56938f35-7cb7-4754-962f-9b675f12bee1-whisker-backend-key-pair" pod "56938f35-7cb7-4754-962f-9b675f12bee1" (UID: "56938f35-7cb7-4754-962f-9b675f12bee1"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 7 00:56:12.102439 kubelet[3252]: I0307 00:56:12.102101 3252 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56938f35-7cb7-4754-962f-9b675f12bee1-kube-api-access-9w7fm" pod "56938f35-7cb7-4754-962f-9b675f12bee1" (UID: "56938f35-7cb7-4754-962f-9b675f12bee1"). InnerVolumeSpecName "kube-api-access-9w7fm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 7 00:56:12.111968 containerd[2022]: 2026-03-07 00:56:11.309 [INFO][4702] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Mar 7 00:56:12.111968 containerd[2022]: 2026-03-07 00:56:11.315 [INFO][4702] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" iface="eth0" netns="/var/run/netns/cni-58c6c8e4-795a-a83e-008b-2c1bf6d8ad1b" Mar 7 00:56:12.111968 containerd[2022]: 2026-03-07 00:56:11.315 [INFO][4702] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" iface="eth0" netns="/var/run/netns/cni-58c6c8e4-795a-a83e-008b-2c1bf6d8ad1b" Mar 7 00:56:12.111968 containerd[2022]: 2026-03-07 00:56:11.321 [INFO][4702] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" iface="eth0" netns="/var/run/netns/cni-58c6c8e4-795a-a83e-008b-2c1bf6d8ad1b" Mar 7 00:56:12.111968 containerd[2022]: 2026-03-07 00:56:11.322 [INFO][4702] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Mar 7 00:56:12.111968 containerd[2022]: 2026-03-07 00:56:11.327 [INFO][4702] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Mar 7 00:56:12.111968 containerd[2022]: 2026-03-07 00:56:11.878 [INFO][4809] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" HandleID="k8s-pod-network.138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0" Mar 7 00:56:12.111968 containerd[2022]: 2026-03-07 00:56:11.879 [INFO][4809] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:12.111968 containerd[2022]: 2026-03-07 00:56:12.017 [INFO][4809] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:12.111968 containerd[2022]: 2026-03-07 00:56:12.060 [WARNING][4809] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" HandleID="k8s-pod-network.138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0" Mar 7 00:56:12.111968 containerd[2022]: 2026-03-07 00:56:12.060 [INFO][4809] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" HandleID="k8s-pod-network.138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0" Mar 7 00:56:12.111968 containerd[2022]: 2026-03-07 00:56:12.068 [INFO][4809] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:12.111968 containerd[2022]: 2026-03-07 00:56:12.098 [INFO][4702] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Mar 7 00:56:12.116675 containerd[2022]: time="2026-03-07T00:56:12.114023467Z" level=info msg="TearDown network for sandbox \"138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42\" successfully" Mar 7 00:56:12.117653 containerd[2022]: time="2026-03-07T00:56:12.117525307Z" level=info msg="StopPodSandbox for \"138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42\" returns successfully" Mar 7 00:56:12.126869 containerd[2022]: time="2026-03-07T00:56:12.126716095Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-xg55x,Uid:6a33fd83-0f26-4bf6-bbf7-f61c96127d50,Namespace:kube-system,Attempt:1,}" Mar 7 00:56:12.179943 kubelet[3252]: I0307 00:56:12.179889 3252 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/56938f35-7cb7-4754-962f-9b675f12bee1-nginx-config\") on node \"ip-172-31-23-166\" DevicePath \"\"" Mar 7 00:56:12.180339 kubelet[3252]: I0307 00:56:12.180284 3252 reconciler_common.go:299] "Volume detached for 
volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/56938f35-7cb7-4754-962f-9b675f12bee1-whisker-backend-key-pair\") on node \"ip-172-31-23-166\" DevicePath \"\"" Mar 7 00:56:12.181034 kubelet[3252]: I0307 00:56:12.180987 3252 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56938f35-7cb7-4754-962f-9b675f12bee1-whisker-ca-bundle\") on node \"ip-172-31-23-166\" DevicePath \"\"" Mar 7 00:56:12.181544 kubelet[3252]: I0307 00:56:12.181420 3252 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9w7fm\" (UniqueName: \"kubernetes.io/projected/56938f35-7cb7-4754-962f-9b675f12bee1-kube-api-access-9w7fm\") on node \"ip-172-31-23-166\" DevicePath \"\"" Mar 7 00:56:12.205053 containerd[2022]: 2026-03-07 00:56:11.534 [INFO][4764] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Mar 7 00:56:12.205053 containerd[2022]: 2026-03-07 00:56:11.534 [INFO][4764] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" iface="eth0" netns="/var/run/netns/cni-cb0c2746-7e50-c720-1f6f-e4d273c5aca2" Mar 7 00:56:12.205053 containerd[2022]: 2026-03-07 00:56:11.535 [INFO][4764] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" iface="eth0" netns="/var/run/netns/cni-cb0c2746-7e50-c720-1f6f-e4d273c5aca2" Mar 7 00:56:12.205053 containerd[2022]: 2026-03-07 00:56:11.538 [INFO][4764] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" iface="eth0" netns="/var/run/netns/cni-cb0c2746-7e50-c720-1f6f-e4d273c5aca2" Mar 7 00:56:12.205053 containerd[2022]: 2026-03-07 00:56:11.538 [INFO][4764] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Mar 7 00:56:12.205053 containerd[2022]: 2026-03-07 00:56:11.538 [INFO][4764] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Mar 7 00:56:12.205053 containerd[2022]: 2026-03-07 00:56:11.971 [INFO][4841] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" HandleID="k8s-pod-network.c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0" Mar 7 00:56:12.205053 containerd[2022]: 2026-03-07 00:56:11.971 [INFO][4841] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:12.205053 containerd[2022]: 2026-03-07 00:56:12.068 [INFO][4841] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:12.205053 containerd[2022]: 2026-03-07 00:56:12.119 [WARNING][4841] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" HandleID="k8s-pod-network.c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0" Mar 7 00:56:12.205053 containerd[2022]: 2026-03-07 00:56:12.119 [INFO][4841] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" HandleID="k8s-pod-network.c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0" Mar 7 00:56:12.205053 containerd[2022]: 2026-03-07 00:56:12.129 [INFO][4841] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:12.205053 containerd[2022]: 2026-03-07 00:56:12.166 [INFO][4764] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Mar 7 00:56:12.206694 containerd[2022]: time="2026-03-07T00:56:12.206428135Z" level=info msg="TearDown network for sandbox \"c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0\" successfully" Mar 7 00:56:12.206694 containerd[2022]: time="2026-03-07T00:56:12.206544787Z" level=info msg="StopPodSandbox for \"c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0\" returns successfully" Mar 7 00:56:12.217883 containerd[2022]: time="2026-03-07T00:56:12.217710643Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-xw2lb,Uid:d42faf4e-65f6-4396-aa30-a3b1cd2cc8f9,Namespace:kube-system,Attempt:1,}" Mar 7 00:56:12.341164 containerd[2022]: time="2026-03-07T00:56:12.340928180Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tgsjl,Uid:c78279c1-73d2-4b1a-bfd7-356a67561cbc,Namespace:calico-system,Attempt:0,} returns sandbox id \"8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a\"" Mar 7 00:56:12.364551 containerd[2022]: 
time="2026-03-07T00:56:12.358531472Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 7 00:56:12.381645 systemd[1]: Removed slice kubepods-besteffort-pod56938f35_7cb7_4754_962f_9b675f12bee1.slice - libcontainer container kubepods-besteffort-pod56938f35_7cb7_4754_962f_9b675f12bee1.slice. Mar 7 00:56:12.826663 systemd[1]: run-netns-cni\x2de8160f16\x2dca72\x2db94f\x2d7e4e\x2d435c809baec9.mount: Deactivated successfully. Mar 7 00:56:12.827179 systemd[1]: run-netns-cni\x2df4eda54c\x2dd241\x2d2b85\x2dc895\x2d2d84a7fb49db.mount: Deactivated successfully. Mar 7 00:56:12.827581 systemd[1]: run-netns-cni\x2d58c6c8e4\x2d795a\x2da83e\x2d008b\x2d2c1bf6d8ad1b.mount: Deactivated successfully. Mar 7 00:56:12.827711 systemd[1]: run-netns-cni\x2dcb0c2746\x2d7e50\x2dc720\x2d1f6f\x2de4d273c5aca2.mount: Deactivated successfully. Mar 7 00:56:12.827848 systemd[1]: var-lib-kubelet-pods-56938f35\x2d7cb7\x2d4754\x2d962f\x2d9b675f12bee1-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d9w7fm.mount: Deactivated successfully. Mar 7 00:56:12.827989 systemd[1]: var-lib-kubelet-pods-56938f35\x2d7cb7\x2d4754\x2d962f\x2d9b675f12bee1-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Mar 7 00:56:13.086910 systemd-networkd[1942]: cali7a0b8478fd5: Link UP Mar 7 00:56:13.105807 systemd-networkd[1942]: cali7a0b8478fd5: Gained carrier Mar 7 00:56:13.290282 systemd[1]: Created slice kubepods-besteffort-pod3e626ef7_397b_4834_85cb_dac8d45cf32b.slice - libcontainer container kubepods-besteffort-pod3e626ef7_397b_4834_85cb_dac8d45cf32b.slice. 
Mar 7 00:56:13.304692 systemd-networkd[1942]: cali929677d6584: Gained IPv6LL Mar 7 00:56:13.315905 kubelet[3252]: I0307 00:56:13.315537 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tllr9\" (UniqueName: \"kubernetes.io/projected/3e626ef7-397b-4834-85cb-dac8d45cf32b-kube-api-access-tllr9\") pod \"whisker-5844d5c4cc-z7mbs\" (UID: \"3e626ef7-397b-4834-85cb-dac8d45cf32b\") " pod="calico-system/whisker-5844d5c4cc-z7mbs" Mar 7 00:56:13.315905 kubelet[3252]: I0307 00:56:13.315630 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3e626ef7-397b-4834-85cb-dac8d45cf32b-whisker-backend-key-pair\") pod \"whisker-5844d5c4cc-z7mbs\" (UID: \"3e626ef7-397b-4834-85cb-dac8d45cf32b\") " pod="calico-system/whisker-5844d5c4cc-z7mbs" Mar 7 00:56:13.315905 kubelet[3252]: I0307 00:56:13.315686 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e626ef7-397b-4834-85cb-dac8d45cf32b-whisker-ca-bundle\") pod \"whisker-5844d5c4cc-z7mbs\" (UID: \"3e626ef7-397b-4834-85cb-dac8d45cf32b\") " pod="calico-system/whisker-5844d5c4cc-z7mbs" Mar 7 00:56:13.315905 kubelet[3252]: I0307 00:56:13.315727 3252 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/3e626ef7-397b-4834-85cb-dac8d45cf32b-nginx-config\") pod \"whisker-5844d5c4cc-z7mbs\" (UID: \"3e626ef7-397b-4834-85cb-dac8d45cf32b\") " pod="calico-system/whisker-5844d5c4cc-z7mbs" Mar 7 00:56:13.449119 containerd[2022]: 2026-03-07 00:56:12.246 [ERROR][4937] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:56:13.449119 
containerd[2022]: 2026-03-07 00:56:12.316 [INFO][4937] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0 calico-apiserver-7fc9dd7d97- calico-system 1d414626-8c38-4470-b252-d2b0f7ae78d8 919 0 2026-03-07 00:55:48 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7fc9dd7d97 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-23-166 calico-apiserver-7fc9dd7d97-2bt9t eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali7a0b8478fd5 [] [] }} ContainerID="3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45" Namespace="calico-system" Pod="calico-apiserver-7fc9dd7d97-2bt9t" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-" Mar 7 00:56:13.449119 containerd[2022]: 2026-03-07 00:56:12.316 [INFO][4937] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45" Namespace="calico-system" Pod="calico-apiserver-7fc9dd7d97-2bt9t" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0" Mar 7 00:56:13.449119 containerd[2022]: 2026-03-07 00:56:12.692 [INFO][4997] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45" HandleID="k8s-pod-network.3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0" Mar 7 00:56:13.449119 containerd[2022]: 2026-03-07 00:56:12.764 [INFO][4997] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45" 
HandleID="k8s-pod-network.3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000325ba0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-23-166", "pod":"calico-apiserver-7fc9dd7d97-2bt9t", "timestamp":"2026-03-07 00:56:12.692467078 +0000 UTC"}, Hostname:"ip-172-31-23-166", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000166000)} Mar 7 00:56:13.449119 containerd[2022]: 2026-03-07 00:56:12.764 [INFO][4997] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:13.449119 containerd[2022]: 2026-03-07 00:56:12.764 [INFO][4997] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:13.449119 containerd[2022]: 2026-03-07 00:56:12.765 [INFO][4997] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-23-166' Mar 7 00:56:13.449119 containerd[2022]: 2026-03-07 00:56:12.774 [INFO][4997] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45" host="ip-172-31-23-166" Mar 7 00:56:13.449119 containerd[2022]: 2026-03-07 00:56:12.795 [INFO][4997] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-23-166" Mar 7 00:56:13.449119 containerd[2022]: 2026-03-07 00:56:12.847 [INFO][4997] ipam/ipam.go 526: Trying affinity for 192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:13.449119 containerd[2022]: 2026-03-07 00:56:12.872 [INFO][4997] ipam/ipam.go 160: Attempting to load block cidr=192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:13.449119 containerd[2022]: 2026-03-07 00:56:12.888 [INFO][4997] ipam/ipam.go 237: Affinity is confirmed and block has been loaded 
cidr=192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:13.449119 containerd[2022]: 2026-03-07 00:56:12.888 [INFO][4997] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.93.0/26 handle="k8s-pod-network.3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45" host="ip-172-31-23-166" Mar 7 00:56:13.449119 containerd[2022]: 2026-03-07 00:56:12.895 [INFO][4997] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45 Mar 7 00:56:13.449119 containerd[2022]: 2026-03-07 00:56:12.910 [INFO][4997] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.93.0/26 handle="k8s-pod-network.3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45" host="ip-172-31-23-166" Mar 7 00:56:13.449119 containerd[2022]: 2026-03-07 00:56:12.934 [INFO][4997] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.93.2/26] block=192.168.93.0/26 handle="k8s-pod-network.3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45" host="ip-172-31-23-166" Mar 7 00:56:13.449119 containerd[2022]: 2026-03-07 00:56:12.936 [INFO][4997] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.93.2/26] handle="k8s-pod-network.3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45" host="ip-172-31-23-166" Mar 7 00:56:13.449119 containerd[2022]: 2026-03-07 00:56:12.936 [INFO][4997] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 7 00:56:13.449119 containerd[2022]: 2026-03-07 00:56:12.938 [INFO][4997] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.93.2/26] IPv6=[] ContainerID="3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45" HandleID="k8s-pod-network.3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0" Mar 7 00:56:13.452519 containerd[2022]: 2026-03-07 00:56:12.988 [INFO][4937] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45" Namespace="calico-system" Pod="calico-apiserver-7fc9dd7d97-2bt9t" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0", GenerateName:"calico-apiserver-7fc9dd7d97-", Namespace:"calico-system", SelfLink:"", UID:"1d414626-8c38-4470-b252-d2b0f7ae78d8", ResourceVersion:"919", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7fc9dd7d97", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"", Pod:"calico-apiserver-7fc9dd7d97-2bt9t", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.93.2/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali7a0b8478fd5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:13.452519 containerd[2022]: 2026-03-07 00:56:12.989 [INFO][4937] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.93.2/32] ContainerID="3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45" Namespace="calico-system" Pod="calico-apiserver-7fc9dd7d97-2bt9t" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0" Mar 7 00:56:13.452519 containerd[2022]: 2026-03-07 00:56:12.989 [INFO][4937] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7a0b8478fd5 ContainerID="3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45" Namespace="calico-system" Pod="calico-apiserver-7fc9dd7d97-2bt9t" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0" Mar 7 00:56:13.452519 containerd[2022]: 2026-03-07 00:56:13.112 [INFO][4937] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45" Namespace="calico-system" Pod="calico-apiserver-7fc9dd7d97-2bt9t" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0" Mar 7 00:56:13.452519 containerd[2022]: 2026-03-07 00:56:13.120 [INFO][4937] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45" Namespace="calico-system" Pod="calico-apiserver-7fc9dd7d97-2bt9t" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0", GenerateName:"calico-apiserver-7fc9dd7d97-", Namespace:"calico-system", SelfLink:"", UID:"1d414626-8c38-4470-b252-d2b0f7ae78d8", ResourceVersion:"919", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7fc9dd7d97", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45", Pod:"calico-apiserver-7fc9dd7d97-2bt9t", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.93.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali7a0b8478fd5", MAC:"72:57:2c:73:a2:c0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:13.452519 containerd[2022]: 2026-03-07 00:56:13.442 [INFO][4937] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45" Namespace="calico-system" Pod="calico-apiserver-7fc9dd7d97-2bt9t" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0" Mar 7 00:56:13.531122 containerd[2022]: time="2026-03-07T00:56:13.530678170Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:56:13.531122 containerd[2022]: time="2026-03-07T00:56:13.530810470Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:56:13.531122 containerd[2022]: time="2026-03-07T00:56:13.530849950Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:13.532516 containerd[2022]: time="2026-03-07T00:56:13.531024478Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:13.614610 containerd[2022]: time="2026-03-07T00:56:13.614528626Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5844d5c4cc-z7mbs,Uid:3e626ef7-397b-4834-85cb-dac8d45cf32b,Namespace:calico-system,Attempt:0,}" Mar 7 00:56:13.630799 systemd[1]: Started cri-containerd-3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45.scope - libcontainer container 3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45. 
Mar 7 00:56:13.757739 systemd-networkd[1942]: calice04295ffc4: Link UP Mar 7 00:56:13.763330 systemd-networkd[1942]: calice04295ffc4: Gained carrier Mar 7 00:56:13.836888 containerd[2022]: 2026-03-07 00:56:12.372 [ERROR][4892] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:56:13.836888 containerd[2022]: 2026-03-07 00:56:12.453 [INFO][4892] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0 goldmane-9f7667bb8- calico-system 6ae6eb1c-c1a0-42a2-9357-aea0ab8fecf0 914 0 2026-03-07 00:55:47 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:9f7667bb8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ip-172-31-23-166 goldmane-9f7667bb8-plrrb eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calice04295ffc4 [] [] }} ContainerID="a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b" Namespace="calico-system" Pod="goldmane-9f7667bb8-plrrb" WorkloadEndpoint="ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-" Mar 7 00:56:13.836888 containerd[2022]: 2026-03-07 00:56:12.453 [INFO][4892] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b" Namespace="calico-system" Pod="goldmane-9f7667bb8-plrrb" WorkloadEndpoint="ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0" Mar 7 00:56:13.836888 containerd[2022]: 2026-03-07 00:56:12.879 [INFO][5004] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b" HandleID="k8s-pod-network.a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b" 
Workload="ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0" Mar 7 00:56:13.836888 containerd[2022]: 2026-03-07 00:56:13.001 [INFO][5004] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b" HandleID="k8s-pod-network.a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b" Workload="ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004dc60), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-23-166", "pod":"goldmane-9f7667bb8-plrrb", "timestamp":"2026-03-07 00:56:12.879059663 +0000 UTC"}, Hostname:"ip-172-31-23-166", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400022e580)} Mar 7 00:56:13.836888 containerd[2022]: 2026-03-07 00:56:13.004 [INFO][5004] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:13.836888 containerd[2022]: 2026-03-07 00:56:13.004 [INFO][5004] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:56:13.836888 containerd[2022]: 2026-03-07 00:56:13.004 [INFO][5004] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-23-166' Mar 7 00:56:13.836888 containerd[2022]: 2026-03-07 00:56:13.033 [INFO][5004] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b" host="ip-172-31-23-166" Mar 7 00:56:13.836888 containerd[2022]: 2026-03-07 00:56:13.343 [INFO][5004] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-23-166" Mar 7 00:56:13.836888 containerd[2022]: 2026-03-07 00:56:13.544 [INFO][5004] ipam/ipam.go 526: Trying affinity for 192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:13.836888 containerd[2022]: 2026-03-07 00:56:13.572 [INFO][5004] ipam/ipam.go 160: Attempting to load block cidr=192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:13.836888 containerd[2022]: 2026-03-07 00:56:13.625 [INFO][5004] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:13.836888 containerd[2022]: 2026-03-07 00:56:13.625 [INFO][5004] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.93.0/26 handle="k8s-pod-network.a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b" host="ip-172-31-23-166" Mar 7 00:56:13.836888 containerd[2022]: 2026-03-07 00:56:13.651 [INFO][5004] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b Mar 7 00:56:13.836888 containerd[2022]: 2026-03-07 00:56:13.689 [INFO][5004] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.93.0/26 handle="k8s-pod-network.a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b" host="ip-172-31-23-166" Mar 7 00:56:13.836888 containerd[2022]: 2026-03-07 00:56:13.734 [INFO][5004] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.93.3/26] block=192.168.93.0/26 
handle="k8s-pod-network.a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b" host="ip-172-31-23-166" Mar 7 00:56:13.836888 containerd[2022]: 2026-03-07 00:56:13.734 [INFO][5004] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.93.3/26] handle="k8s-pod-network.a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b" host="ip-172-31-23-166" Mar 7 00:56:13.836888 containerd[2022]: 2026-03-07 00:56:13.734 [INFO][5004] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:13.836888 containerd[2022]: 2026-03-07 00:56:13.734 [INFO][5004] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.93.3/26] IPv6=[] ContainerID="a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b" HandleID="k8s-pod-network.a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b" Workload="ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0" Mar 7 00:56:13.838264 containerd[2022]: 2026-03-07 00:56:13.743 [INFO][4892] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b" Namespace="calico-system" Pod="goldmane-9f7667bb8-plrrb" WorkloadEndpoint="ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"6ae6eb1c-c1a0-42a2-9357-aea0ab8fecf0", ResourceVersion:"914", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, 
Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"", Pod:"goldmane-9f7667bb8-plrrb", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.93.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calice04295ffc4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:13.838264 containerd[2022]: 2026-03-07 00:56:13.743 [INFO][4892] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.93.3/32] ContainerID="a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b" Namespace="calico-system" Pod="goldmane-9f7667bb8-plrrb" WorkloadEndpoint="ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0" Mar 7 00:56:13.838264 containerd[2022]: 2026-03-07 00:56:13.743 [INFO][4892] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calice04295ffc4 ContainerID="a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b" Namespace="calico-system" Pod="goldmane-9f7667bb8-plrrb" WorkloadEndpoint="ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0" Mar 7 00:56:13.838264 containerd[2022]: 2026-03-07 00:56:13.768 [INFO][4892] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b" Namespace="calico-system" Pod="goldmane-9f7667bb8-plrrb" WorkloadEndpoint="ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0" Mar 7 00:56:13.838264 containerd[2022]: 2026-03-07 00:56:13.769 [INFO][4892] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b" Namespace="calico-system" 
Pod="goldmane-9f7667bb8-plrrb" WorkloadEndpoint="ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"6ae6eb1c-c1a0-42a2-9357-aea0ab8fecf0", ResourceVersion:"914", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b", Pod:"goldmane-9f7667bb8-plrrb", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.93.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calice04295ffc4", MAC:"e6:4d:c1:84:f6:65", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:13.838264 containerd[2022]: 2026-03-07 00:56:13.830 [INFO][4892] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b" Namespace="calico-system" Pod="goldmane-9f7667bb8-plrrb" WorkloadEndpoint="ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0" Mar 7 00:56:13.939132 containerd[2022]: 
time="2026-03-07T00:56:13.935252088Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:56:13.939132 containerd[2022]: time="2026-03-07T00:56:13.935365020Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:56:13.939132 containerd[2022]: time="2026-03-07T00:56:13.936190560Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:13.941065 containerd[2022]: time="2026-03-07T00:56:13.939874884Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:13.995863 systemd-networkd[1942]: calid0374fbff20: Link UP Mar 7 00:56:14.004795 systemd-networkd[1942]: calid0374fbff20: Gained carrier Mar 7 00:56:14.078594 containerd[2022]: 2026-03-07 00:56:12.448 [ERROR][4901] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:56:14.078594 containerd[2022]: 2026-03-07 00:56:12.579 [INFO][4901] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0 calico-apiserver-7fc9dd7d97- calico-system 6d62a8f1-aabe-4464-9392-047b694f64ae 921 0 2026-03-07 00:55:48 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7fc9dd7d97 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-23-166 calico-apiserver-7fc9dd7d97-2zgnf eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calid0374fbff20 [] [] }} 
ContainerID="5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202" Namespace="calico-system" Pod="calico-apiserver-7fc9dd7d97-2zgnf" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-" Mar 7 00:56:14.078594 containerd[2022]: 2026-03-07 00:56:12.579 [INFO][4901] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202" Namespace="calico-system" Pod="calico-apiserver-7fc9dd7d97-2zgnf" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0" Mar 7 00:56:14.078594 containerd[2022]: 2026-03-07 00:56:13.034 [INFO][5011] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202" HandleID="k8s-pod-network.5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0" Mar 7 00:56:14.078594 containerd[2022]: 2026-03-07 00:56:13.415 [INFO][5011] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202" HandleID="k8s-pod-network.5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000272980), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-23-166", "pod":"calico-apiserver-7fc9dd7d97-2zgnf", "timestamp":"2026-03-07 00:56:13.034465832 +0000 UTC"}, Hostname:"ip-172-31-23-166", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40001b0f20)} Mar 7 00:56:14.078594 containerd[2022]: 2026-03-07 00:56:13.415 [INFO][5011] ipam/ipam_plugin.go 438: About to 
acquire host-wide IPAM lock. Mar 7 00:56:14.078594 containerd[2022]: 2026-03-07 00:56:13.735 [INFO][5011] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:14.078594 containerd[2022]: 2026-03-07 00:56:13.735 [INFO][5011] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-23-166' Mar 7 00:56:14.078594 containerd[2022]: 2026-03-07 00:56:13.780 [INFO][5011] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202" host="ip-172-31-23-166" Mar 7 00:56:14.078594 containerd[2022]: 2026-03-07 00:56:13.844 [INFO][5011] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-23-166" Mar 7 00:56:14.078594 containerd[2022]: 2026-03-07 00:56:13.867 [INFO][5011] ipam/ipam.go 526: Trying affinity for 192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:14.078594 containerd[2022]: 2026-03-07 00:56:13.875 [INFO][5011] ipam/ipam.go 160: Attempting to load block cidr=192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:14.078594 containerd[2022]: 2026-03-07 00:56:13.883 [INFO][5011] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:14.078594 containerd[2022]: 2026-03-07 00:56:13.886 [INFO][5011] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.93.0/26 handle="k8s-pod-network.5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202" host="ip-172-31-23-166" Mar 7 00:56:14.078594 containerd[2022]: 2026-03-07 00:56:13.902 [INFO][5011] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202 Mar 7 00:56:14.078594 containerd[2022]: 2026-03-07 00:56:13.914 [INFO][5011] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.93.0/26 handle="k8s-pod-network.5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202" host="ip-172-31-23-166" Mar 7 00:56:14.078594 
containerd[2022]: 2026-03-07 00:56:13.938 [INFO][5011] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.93.4/26] block=192.168.93.0/26 handle="k8s-pod-network.5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202" host="ip-172-31-23-166" Mar 7 00:56:14.078594 containerd[2022]: 2026-03-07 00:56:13.938 [INFO][5011] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.93.4/26] handle="k8s-pod-network.5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202" host="ip-172-31-23-166" Mar 7 00:56:14.078594 containerd[2022]: 2026-03-07 00:56:13.942 [INFO][5011] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:14.078594 containerd[2022]: 2026-03-07 00:56:13.943 [INFO][5011] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.93.4/26] IPv6=[] ContainerID="5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202" HandleID="k8s-pod-network.5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0" Mar 7 00:56:14.079842 containerd[2022]: 2026-03-07 00:56:13.965 [INFO][4901] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202" Namespace="calico-system" Pod="calico-apiserver-7fc9dd7d97-2zgnf" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0", GenerateName:"calico-apiserver-7fc9dd7d97-", Namespace:"calico-system", SelfLink:"", UID:"6d62a8f1-aabe-4464-9392-047b694f64ae", ResourceVersion:"921", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7fc9dd7d97", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"", Pod:"calico-apiserver-7fc9dd7d97-2zgnf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.93.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calid0374fbff20", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:14.079842 containerd[2022]: 2026-03-07 00:56:13.965 [INFO][4901] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.93.4/32] ContainerID="5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202" Namespace="calico-system" Pod="calico-apiserver-7fc9dd7d97-2zgnf" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0" Mar 7 00:56:14.079842 containerd[2022]: 2026-03-07 00:56:13.965 [INFO][4901] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid0374fbff20 ContainerID="5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202" Namespace="calico-system" Pod="calico-apiserver-7fc9dd7d97-2zgnf" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0" Mar 7 00:56:14.079842 containerd[2022]: 2026-03-07 00:56:14.017 [INFO][4901] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202" Namespace="calico-system" Pod="calico-apiserver-7fc9dd7d97-2zgnf" 
WorkloadEndpoint="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0" Mar 7 00:56:14.079842 containerd[2022]: 2026-03-07 00:56:14.021 [INFO][4901] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202" Namespace="calico-system" Pod="calico-apiserver-7fc9dd7d97-2zgnf" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0", GenerateName:"calico-apiserver-7fc9dd7d97-", Namespace:"calico-system", SelfLink:"", UID:"6d62a8f1-aabe-4464-9392-047b694f64ae", ResourceVersion:"921", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7fc9dd7d97", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202", Pod:"calico-apiserver-7fc9dd7d97-2zgnf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.93.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calid0374fbff20", MAC:"32:31:1e:7f:c0:f4", Ports:[]v3.WorkloadEndpointPort(nil), 
AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:14.079842 containerd[2022]: 2026-03-07 00:56:14.055 [INFO][4901] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202" Namespace="calico-system" Pod="calico-apiserver-7fc9dd7d97-2zgnf" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0" Mar 7 00:56:14.083705 systemd[1]: run-containerd-runc-k8s.io-a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b-runc.vnE7Y9.mount: Deactivated successfully. Mar 7 00:56:14.183085 systemd[1]: Started cri-containerd-a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b.scope - libcontainer container a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b. Mar 7 00:56:14.212518 systemd-networkd[1942]: cali512e261ff74: Link UP Mar 7 00:56:14.246364 systemd-networkd[1942]: cali512e261ff74: Gained carrier Mar 7 00:56:14.290522 containerd[2022]: time="2026-03-07T00:56:14.285866890Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:56:14.290522 containerd[2022]: time="2026-03-07T00:56:14.285965866Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:56:14.290522 containerd[2022]: time="2026-03-07T00:56:14.286005694Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:14.290522 containerd[2022]: time="2026-03-07T00:56:14.286167274Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:14.327692 containerd[2022]: 2026-03-07 00:56:12.549 [ERROR][4951] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:56:14.327692 containerd[2022]: 2026-03-07 00:56:12.667 [INFO][4951] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0 calico-kube-controllers-7dd69967b5- calico-system 695f6498-15b6-4241-b99a-44f0028694da 918 0 2026-03-07 00:55:50 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7dd69967b5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ip-172-31-23-166 calico-kube-controllers-7dd69967b5-wbk4v eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali512e261ff74 [] [] }} ContainerID="6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad" Namespace="calico-system" Pod="calico-kube-controllers-7dd69967b5-wbk4v" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-" Mar 7 00:56:14.327692 containerd[2022]: 2026-03-07 00:56:12.667 [INFO][4951] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad" Namespace="calico-system" Pod="calico-kube-controllers-7dd69967b5-wbk4v" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0" Mar 7 00:56:14.327692 containerd[2022]: 2026-03-07 00:56:13.181 [INFO][5031] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad" 
HandleID="k8s-pod-network.6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad" Workload="ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0" Mar 7 00:56:14.327692 containerd[2022]: 2026-03-07 00:56:13.428 [INFO][5031] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad" HandleID="k8s-pod-network.6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad" Workload="ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003979c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-23-166", "pod":"calico-kube-controllers-7dd69967b5-wbk4v", "timestamp":"2026-03-07 00:56:13.181328072 +0000 UTC"}, Hostname:"ip-172-31-23-166", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000462c60)} Mar 7 00:56:14.327692 containerd[2022]: 2026-03-07 00:56:13.428 [INFO][5031] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:14.327692 containerd[2022]: 2026-03-07 00:56:13.939 [INFO][5031] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:56:14.327692 containerd[2022]: 2026-03-07 00:56:13.942 [INFO][5031] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-23-166' Mar 7 00:56:14.327692 containerd[2022]: 2026-03-07 00:56:13.953 [INFO][5031] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad" host="ip-172-31-23-166" Mar 7 00:56:14.327692 containerd[2022]: 2026-03-07 00:56:14.024 [INFO][5031] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-23-166" Mar 7 00:56:14.327692 containerd[2022]: 2026-03-07 00:56:14.067 [INFO][5031] ipam/ipam.go 526: Trying affinity for 192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:14.327692 containerd[2022]: 2026-03-07 00:56:14.086 [INFO][5031] ipam/ipam.go 160: Attempting to load block cidr=192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:14.327692 containerd[2022]: 2026-03-07 00:56:14.097 [INFO][5031] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:14.327692 containerd[2022]: 2026-03-07 00:56:14.101 [INFO][5031] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.93.0/26 handle="k8s-pod-network.6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad" host="ip-172-31-23-166" Mar 7 00:56:14.327692 containerd[2022]: 2026-03-07 00:56:14.107 [INFO][5031] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad Mar 7 00:56:14.327692 containerd[2022]: 2026-03-07 00:56:14.116 [INFO][5031] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.93.0/26 handle="k8s-pod-network.6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad" host="ip-172-31-23-166" Mar 7 00:56:14.327692 containerd[2022]: 2026-03-07 00:56:14.139 [INFO][5031] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.93.5/26] block=192.168.93.0/26 
handle="k8s-pod-network.6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad" host="ip-172-31-23-166" Mar 7 00:56:14.327692 containerd[2022]: 2026-03-07 00:56:14.140 [INFO][5031] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.93.5/26] handle="k8s-pod-network.6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad" host="ip-172-31-23-166" Mar 7 00:56:14.327692 containerd[2022]: 2026-03-07 00:56:14.140 [INFO][5031] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:14.327692 containerd[2022]: 2026-03-07 00:56:14.142 [INFO][5031] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.93.5/26] IPv6=[] ContainerID="6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad" HandleID="k8s-pod-network.6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad" Workload="ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0" Mar 7 00:56:14.334749 containerd[2022]: 2026-03-07 00:56:14.153 [INFO][4951] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad" Namespace="calico-system" Pod="calico-kube-controllers-7dd69967b5-wbk4v" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0", GenerateName:"calico-kube-controllers-7dd69967b5-", Namespace:"calico-system", SelfLink:"", UID:"695f6498-15b6-4241-b99a-44f0028694da", ResourceVersion:"918", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7dd69967b5", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"", Pod:"calico-kube-controllers-7dd69967b5-wbk4v", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.93.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali512e261ff74", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:14.334749 containerd[2022]: 2026-03-07 00:56:14.160 [INFO][4951] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.93.5/32] ContainerID="6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad" Namespace="calico-system" Pod="calico-kube-controllers-7dd69967b5-wbk4v" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0" Mar 7 00:56:14.334749 containerd[2022]: 2026-03-07 00:56:14.160 [INFO][4951] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali512e261ff74 ContainerID="6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad" Namespace="calico-system" Pod="calico-kube-controllers-7dd69967b5-wbk4v" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0" Mar 7 00:56:14.334749 containerd[2022]: 2026-03-07 00:56:14.254 [INFO][4951] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad" Namespace="calico-system" Pod="calico-kube-controllers-7dd69967b5-wbk4v" 
WorkloadEndpoint="ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0" Mar 7 00:56:14.334749 containerd[2022]: 2026-03-07 00:56:14.255 [INFO][4951] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad" Namespace="calico-system" Pod="calico-kube-controllers-7dd69967b5-wbk4v" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0", GenerateName:"calico-kube-controllers-7dd69967b5-", Namespace:"calico-system", SelfLink:"", UID:"695f6498-15b6-4241-b99a-44f0028694da", ResourceVersion:"918", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7dd69967b5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad", Pod:"calico-kube-controllers-7dd69967b5-wbk4v", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.93.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali512e261ff74", MAC:"fa:c9:7f:53:71:87", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:14.334749 containerd[2022]: 2026-03-07 00:56:14.298 [INFO][4951] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad" Namespace="calico-system" Pod="calico-kube-controllers-7dd69967b5-wbk4v" WorkloadEndpoint="ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0" Mar 7 00:56:14.368576 kubelet[3252]: I0307 00:56:14.367786 3252 kubelet_volumes.go:161] "Cleaned up orphaned pod volumes dir" podUID="56938f35-7cb7-4754-962f-9b675f12bee1" path="/var/lib/kubelet/pods/56938f35-7cb7-4754-962f-9b675f12bee1/volumes" Mar 7 00:56:14.458645 systemd-networkd[1942]: califf0debc28c7: Link UP Mar 7 00:56:14.463092 systemd[1]: Started cri-containerd-5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202.scope - libcontainer container 5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202. 
Mar 7 00:56:14.465994 systemd-networkd[1942]: califf0debc28c7: Gained carrier Mar 7 00:56:14.541141 containerd[2022]: 2026-03-07 00:56:12.632 [ERROR][4969] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:56:14.541141 containerd[2022]: 2026-03-07 00:56:12.675 [INFO][4969] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0 coredns-7d764666f9- kube-system 6a33fd83-0f26-4bf6-bbf7-f61c96127d50 913 0 2026-03-07 00:55:27 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-23-166 coredns-7d764666f9-xg55x eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] califf0debc28c7 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63" Namespace="kube-system" Pod="coredns-7d764666f9-xg55x" WorkloadEndpoint="ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-" Mar 7 00:56:14.541141 containerd[2022]: 2026-03-07 00:56:12.676 [INFO][4969] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63" Namespace="kube-system" Pod="coredns-7d764666f9-xg55x" WorkloadEndpoint="ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0" Mar 7 00:56:14.541141 containerd[2022]: 2026-03-07 00:56:13.121 [INFO][5032] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63" HandleID="k8s-pod-network.878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63" 
Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0" Mar 7 00:56:14.541141 containerd[2022]: 2026-03-07 00:56:13.442 [INFO][5032] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63" HandleID="k8s-pod-network.878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000362940), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-23-166", "pod":"coredns-7d764666f9-xg55x", "timestamp":"2026-03-07 00:56:13.12189218 +0000 UTC"}, Hostname:"ip-172-31-23-166", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000186580)} Mar 7 00:56:14.541141 containerd[2022]: 2026-03-07 00:56:13.445 [INFO][5032] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:14.541141 containerd[2022]: 2026-03-07 00:56:14.143 [INFO][5032] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:56:14.541141 containerd[2022]: 2026-03-07 00:56:14.143 [INFO][5032] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-23-166' Mar 7 00:56:14.541141 containerd[2022]: 2026-03-07 00:56:14.153 [INFO][5032] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63" host="ip-172-31-23-166" Mar 7 00:56:14.541141 containerd[2022]: 2026-03-07 00:56:14.174 [INFO][5032] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-23-166" Mar 7 00:56:14.541141 containerd[2022]: 2026-03-07 00:56:14.255 [INFO][5032] ipam/ipam.go 526: Trying affinity for 192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:14.541141 containerd[2022]: 2026-03-07 00:56:14.265 [INFO][5032] ipam/ipam.go 160: Attempting to load block cidr=192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:14.541141 containerd[2022]: 2026-03-07 00:56:14.280 [INFO][5032] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:14.541141 containerd[2022]: 2026-03-07 00:56:14.300 [INFO][5032] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.93.0/26 handle="k8s-pod-network.878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63" host="ip-172-31-23-166" Mar 7 00:56:14.541141 containerd[2022]: 2026-03-07 00:56:14.308 [INFO][5032] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63 Mar 7 00:56:14.541141 containerd[2022]: 2026-03-07 00:56:14.352 [INFO][5032] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.93.0/26 handle="k8s-pod-network.878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63" host="ip-172-31-23-166" Mar 7 00:56:14.541141 containerd[2022]: 2026-03-07 00:56:14.381 [INFO][5032] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.93.6/26] block=192.168.93.0/26 
handle="k8s-pod-network.878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63" host="ip-172-31-23-166" Mar 7 00:56:14.541141 containerd[2022]: 2026-03-07 00:56:14.383 [INFO][5032] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.93.6/26] handle="k8s-pod-network.878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63" host="ip-172-31-23-166" Mar 7 00:56:14.541141 containerd[2022]: 2026-03-07 00:56:14.385 [INFO][5032] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:14.541141 containerd[2022]: 2026-03-07 00:56:14.385 [INFO][5032] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.93.6/26] IPv6=[] ContainerID="878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63" HandleID="k8s-pod-network.878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0" Mar 7 00:56:14.544246 containerd[2022]: 2026-03-07 00:56:14.416 [INFO][4969] cni-plugin/k8s.go 418: Populated endpoint ContainerID="878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63" Namespace="kube-system" Pod="coredns-7d764666f9-xg55x" WorkloadEndpoint="ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"6a33fd83-0f26-4bf6-bbf7-f61c96127d50", ResourceVersion:"913", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"", Pod:"coredns-7d764666f9-xg55x", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.93.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califf0debc28c7", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:14.544246 containerd[2022]: 2026-03-07 00:56:14.416 [INFO][4969] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.93.6/32] ContainerID="878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63" Namespace="kube-system" Pod="coredns-7d764666f9-xg55x" WorkloadEndpoint="ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0" Mar 7 00:56:14.544246 containerd[2022]: 2026-03-07 00:56:14.416 [INFO][4969] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califf0debc28c7 ContainerID="878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63" Namespace="kube-system" Pod="coredns-7d764666f9-xg55x" 
WorkloadEndpoint="ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0" Mar 7 00:56:14.544246 containerd[2022]: 2026-03-07 00:56:14.477 [INFO][4969] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63" Namespace="kube-system" Pod="coredns-7d764666f9-xg55x" WorkloadEndpoint="ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0" Mar 7 00:56:14.544246 containerd[2022]: 2026-03-07 00:56:14.483 [INFO][4969] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63" Namespace="kube-system" Pod="coredns-7d764666f9-xg55x" WorkloadEndpoint="ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"6a33fd83-0f26-4bf6-bbf7-f61c96127d50", ResourceVersion:"913", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63", Pod:"coredns-7d764666f9-xg55x", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.93.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califf0debc28c7", MAC:"b2:ef:22:c3:08:5c", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:14.544246 containerd[2022]: 2026-03-07 00:56:14.531 [INFO][4969] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63" Namespace="kube-system" Pod="coredns-7d764666f9-xg55x" WorkloadEndpoint="ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0" Mar 7 00:56:14.558078 containerd[2022]: time="2026-03-07T00:56:14.555331439Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:56:14.558627 containerd[2022]: time="2026-03-07T00:56:14.558518675Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:56:14.559367 containerd[2022]: time="2026-03-07T00:56:14.559193483Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:14.565542 containerd[2022]: time="2026-03-07T00:56:14.564609959Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:14.567336 containerd[2022]: time="2026-03-07T00:56:14.567254315Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fc9dd7d97-2bt9t,Uid:1d414626-8c38-4470-b252-d2b0f7ae78d8,Namespace:calico-system,Attempt:1,} returns sandbox id \"3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45\"" Mar 7 00:56:14.607675 systemd-networkd[1942]: calida70c04e092: Link UP Mar 7 00:56:14.616516 systemd-networkd[1942]: calida70c04e092: Gained carrier Mar 7 00:56:14.680953 containerd[2022]: 2026-03-07 00:56:12.764 [ERROR][4984] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:56:14.680953 containerd[2022]: 2026-03-07 00:56:12.940 [INFO][4984] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0 coredns-7d764666f9- kube-system d42faf4e-65f6-4396-aa30-a3b1cd2cc8f9 920 0 2026-03-07 00:55:27 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-23-166 coredns-7d764666f9-xw2lb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calida70c04e092 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a" Namespace="kube-system" Pod="coredns-7d764666f9-xw2lb" WorkloadEndpoint="ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-" Mar 7 
00:56:14.680953 containerd[2022]: 2026-03-07 00:56:12.943 [INFO][4984] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a" Namespace="kube-system" Pod="coredns-7d764666f9-xw2lb" WorkloadEndpoint="ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0" Mar 7 00:56:14.680953 containerd[2022]: 2026-03-07 00:56:13.283 [INFO][5069] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a" HandleID="k8s-pod-network.746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0" Mar 7 00:56:14.680953 containerd[2022]: 2026-03-07 00:56:13.498 [INFO][5069] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a" HandleID="k8s-pod-network.746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000391e10), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-23-166", "pod":"coredns-7d764666f9-xw2lb", "timestamp":"2026-03-07 00:56:13.283464009 +0000 UTC"}, Hostname:"ip-172-31-23-166", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40003882c0)} Mar 7 00:56:14.680953 containerd[2022]: 2026-03-07 00:56:13.501 [INFO][5069] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:14.680953 containerd[2022]: 2026-03-07 00:56:14.385 [INFO][5069] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:56:14.680953 containerd[2022]: 2026-03-07 00:56:14.387 [INFO][5069] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-23-166' Mar 7 00:56:14.680953 containerd[2022]: 2026-03-07 00:56:14.393 [INFO][5069] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a" host="ip-172-31-23-166" Mar 7 00:56:14.680953 containerd[2022]: 2026-03-07 00:56:14.412 [INFO][5069] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-23-166" Mar 7 00:56:14.680953 containerd[2022]: 2026-03-07 00:56:14.476 [INFO][5069] ipam/ipam.go 526: Trying affinity for 192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:14.680953 containerd[2022]: 2026-03-07 00:56:14.482 [INFO][5069] ipam/ipam.go 160: Attempting to load block cidr=192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:14.680953 containerd[2022]: 2026-03-07 00:56:14.508 [INFO][5069] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:14.680953 containerd[2022]: 2026-03-07 00:56:14.508 [INFO][5069] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.93.0/26 handle="k8s-pod-network.746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a" host="ip-172-31-23-166" Mar 7 00:56:14.680953 containerd[2022]: 2026-03-07 00:56:14.515 [INFO][5069] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a Mar 7 00:56:14.680953 containerd[2022]: 2026-03-07 00:56:14.545 [INFO][5069] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.93.0/26 handle="k8s-pod-network.746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a" host="ip-172-31-23-166" Mar 7 00:56:14.680953 containerd[2022]: 2026-03-07 00:56:14.568 [INFO][5069] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.93.7/26] block=192.168.93.0/26 
handle="k8s-pod-network.746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a" host="ip-172-31-23-166" Mar 7 00:56:14.680953 containerd[2022]: 2026-03-07 00:56:14.568 [INFO][5069] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.93.7/26] handle="k8s-pod-network.746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a" host="ip-172-31-23-166" Mar 7 00:56:14.680953 containerd[2022]: 2026-03-07 00:56:14.568 [INFO][5069] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:14.680953 containerd[2022]: 2026-03-07 00:56:14.569 [INFO][5069] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.93.7/26] IPv6=[] ContainerID="746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a" HandleID="k8s-pod-network.746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0" Mar 7 00:56:14.682159 containerd[2022]: 2026-03-07 00:56:14.586 [INFO][4984] cni-plugin/k8s.go 418: Populated endpoint ContainerID="746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a" Namespace="kube-system" Pod="coredns-7d764666f9-xw2lb" WorkloadEndpoint="ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"d42faf4e-65f6-4396-aa30-a3b1cd2cc8f9", ResourceVersion:"920", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"", Pod:"coredns-7d764666f9-xw2lb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.93.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calida70c04e092", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:14.682159 containerd[2022]: 2026-03-07 00:56:14.590 [INFO][4984] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.93.7/32] ContainerID="746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a" Namespace="kube-system" Pod="coredns-7d764666f9-xw2lb" WorkloadEndpoint="ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0" Mar 7 00:56:14.682159 containerd[2022]: 2026-03-07 00:56:14.590 [INFO][4984] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calida70c04e092 ContainerID="746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a" Namespace="kube-system" Pod="coredns-7d764666f9-xw2lb" 
WorkloadEndpoint="ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0" Mar 7 00:56:14.682159 containerd[2022]: 2026-03-07 00:56:14.615 [INFO][4984] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a" Namespace="kube-system" Pod="coredns-7d764666f9-xw2lb" WorkloadEndpoint="ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0" Mar 7 00:56:14.682159 containerd[2022]: 2026-03-07 00:56:14.625 [INFO][4984] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a" Namespace="kube-system" Pod="coredns-7d764666f9-xw2lb" WorkloadEndpoint="ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"d42faf4e-65f6-4396-aa30-a3b1cd2cc8f9", ResourceVersion:"920", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a", Pod:"coredns-7d764666f9-xw2lb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.93.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calida70c04e092", MAC:"2e:55:95:e1:dc:0f", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:14.682159 containerd[2022]: 2026-03-07 00:56:14.672 [INFO][4984] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a" Namespace="kube-system" Pod="coredns-7d764666f9-xw2lb" WorkloadEndpoint="ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0" Mar 7 00:56:14.693652 containerd[2022]: time="2026-03-07T00:56:14.692880348Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:56:14.693652 containerd[2022]: time="2026-03-07T00:56:14.692992476Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:56:14.693652 containerd[2022]: time="2026-03-07T00:56:14.693062700Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:14.693652 containerd[2022]: time="2026-03-07T00:56:14.693335712Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:14.744191 systemd[1]: Started cri-containerd-6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad.scope - libcontainer container 6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad. Mar 7 00:56:14.763974 systemd[1]: Started cri-containerd-878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63.scope - libcontainer container 878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63. Mar 7 00:56:14.844516 systemd-networkd[1942]: cali1a97afa8cfe: Link UP Mar 7 00:56:14.846752 systemd-networkd[1942]: cali1a97afa8cfe: Gained carrier Mar 7 00:56:14.887971 containerd[2022]: time="2026-03-07T00:56:14.887872465Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-plrrb,Uid:6ae6eb1c-c1a0-42a2-9357-aea0ab8fecf0,Namespace:calico-system,Attempt:1,} returns sandbox id \"a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b\"" Mar 7 00:56:14.914823 containerd[2022]: time="2026-03-07T00:56:14.914231881Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:56:14.914823 containerd[2022]: time="2026-03-07T00:56:14.914350513Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:56:14.914823 containerd[2022]: time="2026-03-07T00:56:14.914417473Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:14.915503 containerd[2022]: time="2026-03-07T00:56:14.915223477Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:14.917037 containerd[2022]: 2026-03-07 00:56:13.735 [ERROR][5170] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:56:14.917037 containerd[2022]: 2026-03-07 00:56:13.849 [INFO][5170] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--23--166-k8s-whisker--5844d5c4cc--z7mbs-eth0 whisker-5844d5c4cc- calico-system 3e626ef7-397b-4834-85cb-dac8d45cf32b 946 0 2026-03-07 00:56:13 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5844d5c4cc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ip-172-31-23-166 whisker-5844d5c4cc-z7mbs eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali1a97afa8cfe [] [] }} ContainerID="7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1" Namespace="calico-system" Pod="whisker-5844d5c4cc-z7mbs" WorkloadEndpoint="ip--172--31--23--166-k8s-whisker--5844d5c4cc--z7mbs-" Mar 7 00:56:14.917037 containerd[2022]: 2026-03-07 00:56:13.851 [INFO][5170] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1" Namespace="calico-system" Pod="whisker-5844d5c4cc-z7mbs" WorkloadEndpoint="ip--172--31--23--166-k8s-whisker--5844d5c4cc--z7mbs-eth0" Mar 7 00:56:14.917037 containerd[2022]: 2026-03-07 00:56:14.205 [INFO][5201] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1" HandleID="k8s-pod-network.7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1" Workload="ip--172--31--23--166-k8s-whisker--5844d5c4cc--z7mbs-eth0" Mar 7 00:56:14.917037 containerd[2022]: 
2026-03-07 00:56:14.271 [INFO][5201] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1" HandleID="k8s-pod-network.7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1" Workload="ip--172--31--23--166-k8s-whisker--5844d5c4cc--z7mbs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40004e61b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-23-166", "pod":"whisker-5844d5c4cc-z7mbs", "timestamp":"2026-03-07 00:56:14.205947429 +0000 UTC"}, Hostname:"ip-172-31-23-166", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400018e580)} Mar 7 00:56:14.917037 containerd[2022]: 2026-03-07 00:56:14.271 [INFO][5201] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:14.917037 containerd[2022]: 2026-03-07 00:56:14.570 [INFO][5201] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:56:14.917037 containerd[2022]: 2026-03-07 00:56:14.570 [INFO][5201] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-23-166' Mar 7 00:56:14.917037 containerd[2022]: 2026-03-07 00:56:14.577 [INFO][5201] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1" host="ip-172-31-23-166" Mar 7 00:56:14.917037 containerd[2022]: 2026-03-07 00:56:14.603 [INFO][5201] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-23-166" Mar 7 00:56:14.917037 containerd[2022]: 2026-03-07 00:56:14.638 [INFO][5201] ipam/ipam.go 526: Trying affinity for 192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:14.917037 containerd[2022]: 2026-03-07 00:56:14.656 [INFO][5201] ipam/ipam.go 160: Attempting to load block cidr=192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:14.917037 containerd[2022]: 2026-03-07 00:56:14.673 [INFO][5201] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.93.0/26 host="ip-172-31-23-166" Mar 7 00:56:14.917037 containerd[2022]: 2026-03-07 00:56:14.676 [INFO][5201] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.93.0/26 handle="k8s-pod-network.7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1" host="ip-172-31-23-166" Mar 7 00:56:14.917037 containerd[2022]: 2026-03-07 00:56:14.685 [INFO][5201] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1 Mar 7 00:56:14.917037 containerd[2022]: 2026-03-07 00:56:14.704 [INFO][5201] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.93.0/26 handle="k8s-pod-network.7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1" host="ip-172-31-23-166" Mar 7 00:56:14.917037 containerd[2022]: 2026-03-07 00:56:14.753 [INFO][5201] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.93.8/26] block=192.168.93.0/26 
handle="k8s-pod-network.7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1" host="ip-172-31-23-166" Mar 7 00:56:14.917037 containerd[2022]: 2026-03-07 00:56:14.754 [INFO][5201] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.93.8/26] handle="k8s-pod-network.7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1" host="ip-172-31-23-166" Mar 7 00:56:14.917037 containerd[2022]: 2026-03-07 00:56:14.754 [INFO][5201] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:14.917037 containerd[2022]: 2026-03-07 00:56:14.754 [INFO][5201] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.93.8/26] IPv6=[] ContainerID="7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1" HandleID="k8s-pod-network.7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1" Workload="ip--172--31--23--166-k8s-whisker--5844d5c4cc--z7mbs-eth0" Mar 7 00:56:14.918921 containerd[2022]: 2026-03-07 00:56:14.822 [INFO][5170] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1" Namespace="calico-system" Pod="whisker-5844d5c4cc-z7mbs" WorkloadEndpoint="ip--172--31--23--166-k8s-whisker--5844d5c4cc--z7mbs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-whisker--5844d5c4cc--z7mbs-eth0", GenerateName:"whisker-5844d5c4cc-", Namespace:"calico-system", SelfLink:"", UID:"3e626ef7-397b-4834-85cb-dac8d45cf32b", ResourceVersion:"946", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5844d5c4cc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, 
Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"", Pod:"whisker-5844d5c4cc-z7mbs", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.93.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali1a97afa8cfe", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:14.918921 containerd[2022]: 2026-03-07 00:56:14.823 [INFO][5170] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.93.8/32] ContainerID="7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1" Namespace="calico-system" Pod="whisker-5844d5c4cc-z7mbs" WorkloadEndpoint="ip--172--31--23--166-k8s-whisker--5844d5c4cc--z7mbs-eth0" Mar 7 00:56:14.918921 containerd[2022]: 2026-03-07 00:56:14.824 [INFO][5170] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1a97afa8cfe ContainerID="7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1" Namespace="calico-system" Pod="whisker-5844d5c4cc-z7mbs" WorkloadEndpoint="ip--172--31--23--166-k8s-whisker--5844d5c4cc--z7mbs-eth0" Mar 7 00:56:14.918921 containerd[2022]: 2026-03-07 00:56:14.850 [INFO][5170] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1" Namespace="calico-system" Pod="whisker-5844d5c4cc-z7mbs" WorkloadEndpoint="ip--172--31--23--166-k8s-whisker--5844d5c4cc--z7mbs-eth0" Mar 7 00:56:14.918921 containerd[2022]: 2026-03-07 00:56:14.869 [INFO][5170] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1" Namespace="calico-system" 
Pod="whisker-5844d5c4cc-z7mbs" WorkloadEndpoint="ip--172--31--23--166-k8s-whisker--5844d5c4cc--z7mbs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-whisker--5844d5c4cc--z7mbs-eth0", GenerateName:"whisker-5844d5c4cc-", Namespace:"calico-system", SelfLink:"", UID:"3e626ef7-397b-4834-85cb-dac8d45cf32b", ResourceVersion:"946", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5844d5c4cc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1", Pod:"whisker-5844d5c4cc-z7mbs", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.93.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali1a97afa8cfe", MAC:"22:f3:f5:da:6c:9f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:14.918921 containerd[2022]: 2026-03-07 00:56:14.901 [INFO][5170] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1" Namespace="calico-system" Pod="whisker-5844d5c4cc-z7mbs" WorkloadEndpoint="ip--172--31--23--166-k8s-whisker--5844d5c4cc--z7mbs-eth0" Mar 7 00:56:14.970486 systemd-networkd[1942]: cali7a0b8478fd5: 
Gained IPv6LL Mar 7 00:56:15.034712 systemd[1]: Started cri-containerd-746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a.scope - libcontainer container 746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a. Mar 7 00:56:15.074458 containerd[2022]: time="2026-03-07T00:56:15.074120182Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-xg55x,Uid:6a33fd83-0f26-4bf6-bbf7-f61c96127d50,Namespace:kube-system,Attempt:1,} returns sandbox id \"878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63\"" Mar 7 00:56:15.101136 containerd[2022]: time="2026-03-07T00:56:15.100734886Z" level=info msg="CreateContainer within sandbox \"878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 7 00:56:15.188365 containerd[2022]: time="2026-03-07T00:56:15.186977602Z" level=info msg="CreateContainer within sandbox \"878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b2a222cee2f30cd795659cb12101399f7a00000a85d2066a2da0f65be7b7f298\"" Mar 7 00:56:15.191635 containerd[2022]: time="2026-03-07T00:56:15.187757218Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:56:15.191635 containerd[2022]: time="2026-03-07T00:56:15.187877398Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:56:15.191635 containerd[2022]: time="2026-03-07T00:56:15.187914574Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:15.191635 containerd[2022]: time="2026-03-07T00:56:15.188101822Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:15.195434 containerd[2022]: time="2026-03-07T00:56:15.194783626Z" level=info msg="StartContainer for \"b2a222cee2f30cd795659cb12101399f7a00000a85d2066a2da0f65be7b7f298\"" Mar 7 00:56:15.213258 containerd[2022]: time="2026-03-07T00:56:15.212074726Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fc9dd7d97-2zgnf,Uid:6d62a8f1-aabe-4464-9392-047b694f64ae,Namespace:calico-system,Attempt:1,} returns sandbox id \"5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202\"" Mar 7 00:56:15.270792 containerd[2022]: time="2026-03-07T00:56:15.270705167Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-xw2lb,Uid:d42faf4e-65f6-4396-aa30-a3b1cd2cc8f9,Namespace:kube-system,Attempt:1,} returns sandbox id \"746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a\"" Mar 7 00:56:15.291057 containerd[2022]: time="2026-03-07T00:56:15.290828591Z" level=info msg="CreateContainer within sandbox \"746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 7 00:56:15.335074 systemd[1]: Started cri-containerd-7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1.scope - libcontainer container 7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1. 
Mar 7 00:56:15.370909 containerd[2022]: time="2026-03-07T00:56:15.369697379Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7dd69967b5-wbk4v,Uid:695f6498-15b6-4241-b99a-44f0028694da,Namespace:calico-system,Attempt:1,} returns sandbox id \"6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad\"" Mar 7 00:56:15.416902 systemd-networkd[1942]: calice04295ffc4: Gained IPv6LL Mar 7 00:56:15.447776 containerd[2022]: time="2026-03-07T00:56:15.442771524Z" level=info msg="CreateContainer within sandbox \"746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9ed5c0e18dc50158867e98506ae83798cb8860fd7c5f18b49ba4820429450cfd\"" Mar 7 00:56:15.457418 containerd[2022]: time="2026-03-07T00:56:15.455906088Z" level=info msg="StartContainer for \"9ed5c0e18dc50158867e98506ae83798cb8860fd7c5f18b49ba4820429450cfd\"" Mar 7 00:56:15.468178 systemd[1]: Started cri-containerd-b2a222cee2f30cd795659cb12101399f7a00000a85d2066a2da0f65be7b7f298.scope - libcontainer container b2a222cee2f30cd795659cb12101399f7a00000a85d2066a2da0f65be7b7f298. Mar 7 00:56:15.609059 systemd-networkd[1942]: calid0374fbff20: Gained IPv6LL Mar 7 00:56:15.642685 systemd[1]: Started cri-containerd-9ed5c0e18dc50158867e98506ae83798cb8860fd7c5f18b49ba4820429450cfd.scope - libcontainer container 9ed5c0e18dc50158867e98506ae83798cb8860fd7c5f18b49ba4820429450cfd. Mar 7 00:56:15.672727 systemd-networkd[1942]: calida70c04e092: Gained IPv6LL Mar 7 00:56:15.692996 containerd[2022]: time="2026-03-07T00:56:15.692422465Z" level=info msg="StartContainer for \"b2a222cee2f30cd795659cb12101399f7a00000a85d2066a2da0f65be7b7f298\" returns successfully" Mar 7 00:56:15.814104 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2544106245.mount: Deactivated successfully. 
Mar 7 00:56:15.883436 containerd[2022]: time="2026-03-07T00:56:15.882795626Z" level=info msg="StartContainer for \"9ed5c0e18dc50158867e98506ae83798cb8860fd7c5f18b49ba4820429450cfd\" returns successfully" Mar 7 00:56:15.941904 containerd[2022]: time="2026-03-07T00:56:15.941843330Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5844d5c4cc-z7mbs,Uid:3e626ef7-397b-4834-85cb-dac8d45cf32b,Namespace:calico-system,Attempt:0,} returns sandbox id \"7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1\"" Mar 7 00:56:16.057613 containerd[2022]: time="2026-03-07T00:56:16.055457675Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:16.059119 systemd-networkd[1942]: califf0debc28c7: Gained IPv6LL Mar 7 00:56:16.065488 containerd[2022]: time="2026-03-07T00:56:16.064948979Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8261497" Mar 7 00:56:16.075280 containerd[2022]: time="2026-03-07T00:56:16.073623419Z" level=info msg="ImageCreate event name:\"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:16.087671 containerd[2022]: time="2026-03-07T00:56:16.087298883Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:16.092450 containerd[2022]: time="2026-03-07T00:56:16.091862327Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"9659022\" in 3.726001867s" Mar 7 
00:56:16.092703 containerd[2022]: time="2026-03-07T00:56:16.092660675Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\"" Mar 7 00:56:16.096122 containerd[2022]: time="2026-03-07T00:56:16.095881367Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 7 00:56:16.107638 containerd[2022]: time="2026-03-07T00:56:16.107553107Z" level=info msg="CreateContainer within sandbox \"8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 7 00:56:16.141206 systemd[1]: Started sshd@8-172.31.23.166:22-20.161.92.111:58342.service - OpenSSH per-connection server daemon (20.161.92.111:58342). Mar 7 00:56:16.157213 kubelet[3252]: I0307 00:56:16.153095 3252 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-xg55x" podStartSLOduration=49.153056327 podStartE2EDuration="49.153056327s" podCreationTimestamp="2026-03-07 00:55:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:56:16.136228355 +0000 UTC m=+54.129443106" watchObservedRunningTime="2026-03-07 00:56:16.153056327 +0000 UTC m=+54.146270970" Mar 7 00:56:16.173734 containerd[2022]: time="2026-03-07T00:56:16.169992551Z" level=info msg="CreateContainer within sandbox \"8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"331d914dcfff2cec71406f134a9b4f247ea05697dc6a8dcb3b565476be4c6dc2\"" Mar 7 00:56:16.173734 containerd[2022]: time="2026-03-07T00:56:16.172083047Z" level=info msg="StartContainer for \"331d914dcfff2cec71406f134a9b4f247ea05697dc6a8dcb3b565476be4c6dc2\"" Mar 7 00:56:16.184700 systemd-networkd[1942]: cali512e261ff74: Gained IPv6LL Mar 7 00:56:16.309490 systemd[1]: 
run-containerd-runc-k8s.io-331d914dcfff2cec71406f134a9b4f247ea05697dc6a8dcb3b565476be4c6dc2-runc.BNqJND.mount: Deactivated successfully. Mar 7 00:56:16.328803 systemd[1]: Started cri-containerd-331d914dcfff2cec71406f134a9b4f247ea05697dc6a8dcb3b565476be4c6dc2.scope - libcontainer container 331d914dcfff2cec71406f134a9b4f247ea05697dc6a8dcb3b565476be4c6dc2. Mar 7 00:56:16.616088 containerd[2022]: time="2026-03-07T00:56:16.616010221Z" level=info msg="StartContainer for \"331d914dcfff2cec71406f134a9b4f247ea05697dc6a8dcb3b565476be4c6dc2\" returns successfully" Mar 7 00:56:16.738351 sshd[5569]: Accepted publickey for core from 20.161.92.111 port 58342 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:56:16.744821 sshd[5569]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:16.760235 systemd-logind[2004]: New session 8 of user core. Mar 7 00:56:16.773748 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 7 00:56:16.889139 systemd-networkd[1942]: cali1a97afa8cfe: Gained IPv6LL Mar 7 00:56:17.097788 kubelet[3252]: I0307 00:56:17.095634 3252 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-xw2lb" podStartSLOduration=50.0956037 podStartE2EDuration="50.0956037s" podCreationTimestamp="2026-03-07 00:55:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:56:16.360277356 +0000 UTC m=+54.353492035" watchObservedRunningTime="2026-03-07 00:56:17.0956037 +0000 UTC m=+55.088818343" Mar 7 00:56:17.278502 kernel: calico-node[5058]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 7 00:56:17.366735 sshd[5569]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:17.383875 systemd[1]: sshd@8-172.31.23.166:22-20.161.92.111:58342.service: Deactivated successfully. 
Mar 7 00:56:17.392648 systemd[1]: session-8.scope: Deactivated successfully. Mar 7 00:56:17.397706 systemd-logind[2004]: Session 8 logged out. Waiting for processes to exit. Mar 7 00:56:17.402411 systemd-logind[2004]: Removed session 8. Mar 7 00:56:18.170241 (udev-worker)[4799]: Network interface NamePolicy= disabled on kernel command line. Mar 7 00:56:18.174651 systemd-networkd[1942]: vxlan.calico: Link UP Mar 7 00:56:18.174667 systemd-networkd[1942]: vxlan.calico: Gained carrier Mar 7 00:56:19.705005 systemd-networkd[1942]: vxlan.calico: Gained IPv6LL Mar 7 00:56:20.013357 containerd[2022]: time="2026-03-07T00:56:20.012929258Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:20.015961 containerd[2022]: time="2026-03-07T00:56:20.015542210Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=45552315" Mar 7 00:56:20.019500 containerd[2022]: time="2026-03-07T00:56:20.018170762Z" level=info msg="ImageCreate event name:\"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:20.024433 containerd[2022]: time="2026-03-07T00:56:20.024271382Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:20.026329 containerd[2022]: time="2026-03-07T00:56:20.026256290Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 3.928911019s" Mar 7 00:56:20.026548 
containerd[2022]: time="2026-03-07T00:56:20.026516570Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Mar 7 00:56:20.029160 containerd[2022]: time="2026-03-07T00:56:20.029055506Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 7 00:56:20.039079 containerd[2022]: time="2026-03-07T00:56:20.039026114Z" level=info msg="CreateContainer within sandbox \"3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 7 00:56:20.070777 containerd[2022]: time="2026-03-07T00:56:20.070692014Z" level=info msg="CreateContainer within sandbox \"3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"35b8c4742e2ecfcccd3e5abd8791ca067560e76c11da2cdc102226916ad10d63\"" Mar 7 00:56:20.072710 containerd[2022]: time="2026-03-07T00:56:20.072639531Z" level=info msg="StartContainer for \"35b8c4742e2ecfcccd3e5abd8791ca067560e76c11da2cdc102226916ad10d63\"" Mar 7 00:56:20.203736 systemd[1]: Started cri-containerd-35b8c4742e2ecfcccd3e5abd8791ca067560e76c11da2cdc102226916ad10d63.scope - libcontainer container 35b8c4742e2ecfcccd3e5abd8791ca067560e76c11da2cdc102226916ad10d63. 
Mar 7 00:56:20.276167 containerd[2022]: time="2026-03-07T00:56:20.274284220Z" level=info msg="StartContainer for \"35b8c4742e2ecfcccd3e5abd8791ca067560e76c11da2cdc102226916ad10d63\" returns successfully" Mar 7 00:56:22.104747 kubelet[3252]: I0307 00:56:22.104407 3252 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 7 00:56:22.297270 containerd[2022]: time="2026-03-07T00:56:22.297210870Z" level=info msg="StopPodSandbox for \"62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074\"" Mar 7 00:56:22.469234 ntpd[1999]: Listen normally on 7 vxlan.calico 192.168.93.0:123 Mar 7 00:56:22.471071 ntpd[1999]: 7 Mar 00:56:22 ntpd[1999]: Listen normally on 7 vxlan.calico 192.168.93.0:123 Mar 7 00:56:22.471071 ntpd[1999]: 7 Mar 00:56:22 ntpd[1999]: Listen normally on 8 cali929677d6584 [fe80::ecee:eeff:feee:eeee%4]:123 Mar 7 00:56:22.471071 ntpd[1999]: 7 Mar 00:56:22 ntpd[1999]: Listen normally on 9 cali7a0b8478fd5 [fe80::ecee:eeff:feee:eeee%5]:123 Mar 7 00:56:22.471071 ntpd[1999]: 7 Mar 00:56:22 ntpd[1999]: Listen normally on 10 calice04295ffc4 [fe80::ecee:eeff:feee:eeee%6]:123 Mar 7 00:56:22.471071 ntpd[1999]: 7 Mar 00:56:22 ntpd[1999]: Listen normally on 11 calid0374fbff20 [fe80::ecee:eeff:feee:eeee%7]:123 Mar 7 00:56:22.471071 ntpd[1999]: 7 Mar 00:56:22 ntpd[1999]: Listen normally on 12 cali512e261ff74 [fe80::ecee:eeff:feee:eeee%8]:123 Mar 7 00:56:22.471071 ntpd[1999]: 7 Mar 00:56:22 ntpd[1999]: Listen normally on 13 califf0debc28c7 [fe80::ecee:eeff:feee:eeee%9]:123 Mar 7 00:56:22.471071 ntpd[1999]: 7 Mar 00:56:22 ntpd[1999]: Listen normally on 14 calida70c04e092 [fe80::ecee:eeff:feee:eeee%10]:123 Mar 7 00:56:22.471071 ntpd[1999]: 7 Mar 00:56:22 ntpd[1999]: Listen normally on 15 cali1a97afa8cfe [fe80::ecee:eeff:feee:eeee%11]:123 Mar 7 00:56:22.471071 ntpd[1999]: 7 Mar 00:56:22 ntpd[1999]: Listen normally on 16 vxlan.calico [fe80::64bd:fdff:fede:8b47%12]:123 Mar 7 00:56:22.470906 systemd[1]: Started 
sshd@9-172.31.23.166:22-20.161.92.111:41548.service - OpenSSH per-connection server daemon (20.161.92.111:41548). Mar 7 00:56:22.469375 ntpd[1999]: Listen normally on 8 cali929677d6584 [fe80::ecee:eeff:feee:eeee%4]:123 Mar 7 00:56:22.469526 ntpd[1999]: Listen normally on 9 cali7a0b8478fd5 [fe80::ecee:eeff:feee:eeee%5]:123 Mar 7 00:56:22.469627 ntpd[1999]: Listen normally on 10 calice04295ffc4 [fe80::ecee:eeff:feee:eeee%6]:123 Mar 7 00:56:22.469702 ntpd[1999]: Listen normally on 11 calid0374fbff20 [fe80::ecee:eeff:feee:eeee%7]:123 Mar 7 00:56:22.469774 ntpd[1999]: Listen normally on 12 cali512e261ff74 [fe80::ecee:eeff:feee:eeee%8]:123 Mar 7 00:56:22.469853 ntpd[1999]: Listen normally on 13 califf0debc28c7 [fe80::ecee:eeff:feee:eeee%9]:123 Mar 7 00:56:22.469921 ntpd[1999]: Listen normally on 14 calida70c04e092 [fe80::ecee:eeff:feee:eeee%10]:123 Mar 7 00:56:22.469990 ntpd[1999]: Listen normally on 15 cali1a97afa8cfe [fe80::ecee:eeff:feee:eeee%11]:123 Mar 7 00:56:22.470061 ntpd[1999]: Listen normally on 16 vxlan.calico [fe80::64bd:fdff:fede:8b47%12]:123 Mar 7 00:56:22.639010 containerd[2022]: 2026-03-07 00:56:22.448 [WARNING][5829] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0", GenerateName:"calico-apiserver-7fc9dd7d97-", Namespace:"calico-system", SelfLink:"", UID:"1d414626-8c38-4470-b252-d2b0f7ae78d8", ResourceVersion:"1068", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7fc9dd7d97", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45", Pod:"calico-apiserver-7fc9dd7d97-2bt9t", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.93.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali7a0b8478fd5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:22.639010 containerd[2022]: 2026-03-07 00:56:22.451 [INFO][5829] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Mar 7 00:56:22.639010 containerd[2022]: 2026-03-07 00:56:22.451 [INFO][5829] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, 
ignoring. ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" iface="eth0" netns="" Mar 7 00:56:22.639010 containerd[2022]: 2026-03-07 00:56:22.451 [INFO][5829] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Mar 7 00:56:22.639010 containerd[2022]: 2026-03-07 00:56:22.451 [INFO][5829] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Mar 7 00:56:22.639010 containerd[2022]: 2026-03-07 00:56:22.596 [INFO][5839] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" HandleID="k8s-pod-network.62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0" Mar 7 00:56:22.639010 containerd[2022]: 2026-03-07 00:56:22.599 [INFO][5839] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:22.639010 containerd[2022]: 2026-03-07 00:56:22.600 [INFO][5839] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:22.639010 containerd[2022]: 2026-03-07 00:56:22.624 [WARNING][5839] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" HandleID="k8s-pod-network.62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0" Mar 7 00:56:22.639010 containerd[2022]: 2026-03-07 00:56:22.624 [INFO][5839] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" HandleID="k8s-pod-network.62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0" Mar 7 00:56:22.639010 containerd[2022]: 2026-03-07 00:56:22.628 [INFO][5839] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:22.639010 containerd[2022]: 2026-03-07 00:56:22.633 [INFO][5829] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Mar 7 00:56:22.641192 containerd[2022]: time="2026-03-07T00:56:22.639074923Z" level=info msg="TearDown network for sandbox \"62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074\" successfully" Mar 7 00:56:22.641192 containerd[2022]: time="2026-03-07T00:56:22.639117199Z" level=info msg="StopPodSandbox for \"62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074\" returns successfully" Mar 7 00:56:22.642621 containerd[2022]: time="2026-03-07T00:56:22.642546739Z" level=info msg="RemovePodSandbox for \"62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074\"" Mar 7 00:56:22.642772 containerd[2022]: time="2026-03-07T00:56:22.642655267Z" level=info msg="Forcibly stopping sandbox \"62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074\"" Mar 7 00:56:22.878343 containerd[2022]: 2026-03-07 00:56:22.764 [WARNING][5856] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0", GenerateName:"calico-apiserver-7fc9dd7d97-", Namespace:"calico-system", SelfLink:"", UID:"1d414626-8c38-4470-b252-d2b0f7ae78d8", ResourceVersion:"1068", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7fc9dd7d97", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"3af8f3e72f9fe769c689a7261bd54ead13ab0ef7e5bdd0d3f9004ba44cd59a45", Pod:"calico-apiserver-7fc9dd7d97-2bt9t", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.93.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali7a0b8478fd5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:22.878343 containerd[2022]: 2026-03-07 00:56:22.765 [INFO][5856] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Mar 7 00:56:22.878343 containerd[2022]: 2026-03-07 00:56:22.765 [INFO][5856] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, 
ignoring. ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" iface="eth0" netns="" Mar 7 00:56:22.878343 containerd[2022]: 2026-03-07 00:56:22.765 [INFO][5856] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Mar 7 00:56:22.878343 containerd[2022]: 2026-03-07 00:56:22.765 [INFO][5856] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Mar 7 00:56:22.878343 containerd[2022]: 2026-03-07 00:56:22.832 [INFO][5863] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" HandleID="k8s-pod-network.62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0" Mar 7 00:56:22.878343 containerd[2022]: 2026-03-07 00:56:22.834 [INFO][5863] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:22.878343 containerd[2022]: 2026-03-07 00:56:22.835 [INFO][5863] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:22.878343 containerd[2022]: 2026-03-07 00:56:22.856 [WARNING][5863] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" HandleID="k8s-pod-network.62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0" Mar 7 00:56:22.878343 containerd[2022]: 2026-03-07 00:56:22.856 [INFO][5863] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" HandleID="k8s-pod-network.62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2bt9t-eth0" Mar 7 00:56:22.878343 containerd[2022]: 2026-03-07 00:56:22.860 [INFO][5863] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:22.878343 containerd[2022]: 2026-03-07 00:56:22.866 [INFO][5856] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074" Mar 7 00:56:22.880582 containerd[2022]: time="2026-03-07T00:56:22.880371032Z" level=info msg="TearDown network for sandbox \"62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074\" successfully" Mar 7 00:56:22.895028 containerd[2022]: time="2026-03-07T00:56:22.894950925Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 00:56:22.895440 containerd[2022]: time="2026-03-07T00:56:22.895237677Z" level=info msg="RemovePodSandbox \"62a821c582faa001aa335bc5487e8ea13dcb899f5fab941fbcb1b4caa1bb3074\" returns successfully" Mar 7 00:56:22.896309 containerd[2022]: time="2026-03-07T00:56:22.896240289Z" level=info msg="StopPodSandbox for \"2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c\"" Mar 7 00:56:23.038134 sshd[5843]: Accepted publickey for core from 20.161.92.111 port 41548 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:56:23.043189 sshd[5843]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:23.064788 systemd-logind[2004]: New session 9 of user core. Mar 7 00:56:23.075111 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 7 00:56:23.214781 containerd[2022]: 2026-03-07 00:56:23.005 [WARNING][5877] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0", GenerateName:"calico-kube-controllers-7dd69967b5-", Namespace:"calico-system", SelfLink:"", UID:"695f6498-15b6-4241-b99a-44f0028694da", ResourceVersion:"958", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7dd69967b5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad", Pod:"calico-kube-controllers-7dd69967b5-wbk4v", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.93.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali512e261ff74", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:23.214781 containerd[2022]: 2026-03-07 00:56:23.006 [INFO][5877] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Mar 7 00:56:23.214781 containerd[2022]: 2026-03-07 00:56:23.006 [INFO][5877] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" iface="eth0" netns="" Mar 7 00:56:23.214781 containerd[2022]: 2026-03-07 00:56:23.007 [INFO][5877] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Mar 7 00:56:23.214781 containerd[2022]: 2026-03-07 00:56:23.007 [INFO][5877] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Mar 7 00:56:23.214781 containerd[2022]: 2026-03-07 00:56:23.101 [INFO][5884] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" HandleID="k8s-pod-network.2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Workload="ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0" Mar 7 00:56:23.214781 containerd[2022]: 2026-03-07 00:56:23.101 [INFO][5884] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:23.214781 containerd[2022]: 2026-03-07 00:56:23.101 [INFO][5884] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:23.214781 containerd[2022]: 2026-03-07 00:56:23.181 [WARNING][5884] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" HandleID="k8s-pod-network.2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Workload="ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0" Mar 7 00:56:23.214781 containerd[2022]: 2026-03-07 00:56:23.182 [INFO][5884] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" HandleID="k8s-pod-network.2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Workload="ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0" Mar 7 00:56:23.214781 containerd[2022]: 2026-03-07 00:56:23.188 [INFO][5884] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:23.214781 containerd[2022]: 2026-03-07 00:56:23.199 [INFO][5877] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Mar 7 00:56:23.214781 containerd[2022]: time="2026-03-07T00:56:23.213324306Z" level=info msg="TearDown network for sandbox \"2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c\" successfully" Mar 7 00:56:23.214781 containerd[2022]: time="2026-03-07T00:56:23.213361830Z" level=info msg="StopPodSandbox for \"2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c\" returns successfully" Mar 7 00:56:23.215938 containerd[2022]: time="2026-03-07T00:56:23.215363118Z" level=info msg="RemovePodSandbox for \"2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c\"" Mar 7 00:56:23.215938 containerd[2022]: time="2026-03-07T00:56:23.215442246Z" level=info msg="Forcibly stopping sandbox \"2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c\"" Mar 7 00:56:23.631822 containerd[2022]: 2026-03-07 00:56:23.445 [WARNING][5901] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0", GenerateName:"calico-kube-controllers-7dd69967b5-", Namespace:"calico-system", SelfLink:"", UID:"695f6498-15b6-4241-b99a-44f0028694da", ResourceVersion:"958", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7dd69967b5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad", Pod:"calico-kube-controllers-7dd69967b5-wbk4v", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.93.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali512e261ff74", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:23.631822 containerd[2022]: 2026-03-07 00:56:23.449 [INFO][5901] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Mar 7 00:56:23.631822 containerd[2022]: 2026-03-07 00:56:23.449 [INFO][5901] cni-plugin/dataplane_linux.go 555: 
CleanUpNamespace called with no netns name, ignoring. ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" iface="eth0" netns="" Mar 7 00:56:23.631822 containerd[2022]: 2026-03-07 00:56:23.449 [INFO][5901] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Mar 7 00:56:23.631822 containerd[2022]: 2026-03-07 00:56:23.449 [INFO][5901] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Mar 7 00:56:23.631822 containerd[2022]: 2026-03-07 00:56:23.575 [INFO][5916] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" HandleID="k8s-pod-network.2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Workload="ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0" Mar 7 00:56:23.631822 containerd[2022]: 2026-03-07 00:56:23.575 [INFO][5916] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:23.631822 containerd[2022]: 2026-03-07 00:56:23.575 [INFO][5916] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:23.631822 containerd[2022]: 2026-03-07 00:56:23.615 [WARNING][5916] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" HandleID="k8s-pod-network.2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Workload="ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0" Mar 7 00:56:23.631822 containerd[2022]: 2026-03-07 00:56:23.615 [INFO][5916] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" HandleID="k8s-pod-network.2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Workload="ip--172--31--23--166-k8s-calico--kube--controllers--7dd69967b5--wbk4v-eth0" Mar 7 00:56:23.631822 containerd[2022]: 2026-03-07 00:56:23.621 [INFO][5916] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:23.631822 containerd[2022]: 2026-03-07 00:56:23.627 [INFO][5901] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c" Mar 7 00:56:23.634055 containerd[2022]: time="2026-03-07T00:56:23.631908068Z" level=info msg="TearDown network for sandbox \"2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c\" successfully" Mar 7 00:56:23.642464 containerd[2022]: time="2026-03-07T00:56:23.642189512Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 00:56:23.642671 containerd[2022]: time="2026-03-07T00:56:23.642495740Z" level=info msg="RemovePodSandbox \"2f212caca36c206556ee32a3643926dd56f4b1159fce3375385d059d76f8e35c\" returns successfully" Mar 7 00:56:23.643867 containerd[2022]: time="2026-03-07T00:56:23.643432748Z" level=info msg="StopPodSandbox for \"092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e\"" Mar 7 00:56:23.679921 sshd[5843]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:23.697048 systemd[1]: sshd@9-172.31.23.166:22-20.161.92.111:41548.service: Deactivated successfully. Mar 7 00:56:23.711091 systemd[1]: session-9.scope: Deactivated successfully. Mar 7 00:56:23.718284 systemd-logind[2004]: Session 9 logged out. Waiting for processes to exit. Mar 7 00:56:23.723503 systemd-logind[2004]: Removed session 9. Mar 7 00:56:23.866749 containerd[2022]: 2026-03-07 00:56:23.778 [WARNING][5930] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0", GenerateName:"calico-apiserver-7fc9dd7d97-", Namespace:"calico-system", SelfLink:"", UID:"6d62a8f1-aabe-4464-9392-047b694f64ae", ResourceVersion:"954", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7fc9dd7d97", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202", Pod:"calico-apiserver-7fc9dd7d97-2zgnf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.93.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calid0374fbff20", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:23.866749 containerd[2022]: 2026-03-07 00:56:23.778 [INFO][5930] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Mar 7 00:56:23.866749 containerd[2022]: 2026-03-07 00:56:23.778 [INFO][5930] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, 
ignoring. ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" iface="eth0" netns="" Mar 7 00:56:23.866749 containerd[2022]: 2026-03-07 00:56:23.779 [INFO][5930] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Mar 7 00:56:23.866749 containerd[2022]: 2026-03-07 00:56:23.779 [INFO][5930] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Mar 7 00:56:23.866749 containerd[2022]: 2026-03-07 00:56:23.832 [INFO][5940] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" HandleID="k8s-pod-network.092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0" Mar 7 00:56:23.866749 containerd[2022]: 2026-03-07 00:56:23.835 [INFO][5940] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:23.866749 containerd[2022]: 2026-03-07 00:56:23.835 [INFO][5940] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:23.866749 containerd[2022]: 2026-03-07 00:56:23.854 [WARNING][5940] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" HandleID="k8s-pod-network.092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0" Mar 7 00:56:23.866749 containerd[2022]: 2026-03-07 00:56:23.854 [INFO][5940] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" HandleID="k8s-pod-network.092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0" Mar 7 00:56:23.866749 containerd[2022]: 2026-03-07 00:56:23.858 [INFO][5940] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:23.866749 containerd[2022]: 2026-03-07 00:56:23.862 [INFO][5930] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Mar 7 00:56:23.866749 containerd[2022]: time="2026-03-07T00:56:23.866515665Z" level=info msg="TearDown network for sandbox \"092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e\" successfully" Mar 7 00:56:23.866749 containerd[2022]: time="2026-03-07T00:56:23.866553753Z" level=info msg="StopPodSandbox for \"092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e\" returns successfully" Mar 7 00:56:23.870604 containerd[2022]: time="2026-03-07T00:56:23.868451313Z" level=info msg="RemovePodSandbox for \"092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e\"" Mar 7 00:56:23.870604 containerd[2022]: time="2026-03-07T00:56:23.868523841Z" level=info msg="Forcibly stopping sandbox \"092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e\"" Mar 7 00:56:24.040823 containerd[2022]: 2026-03-07 00:56:23.959 [WARNING][5955] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0", GenerateName:"calico-apiserver-7fc9dd7d97-", Namespace:"calico-system", SelfLink:"", UID:"6d62a8f1-aabe-4464-9392-047b694f64ae", ResourceVersion:"954", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7fc9dd7d97", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202", Pod:"calico-apiserver-7fc9dd7d97-2zgnf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.93.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calid0374fbff20", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:24.040823 containerd[2022]: 2026-03-07 00:56:23.960 [INFO][5955] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Mar 7 00:56:24.040823 containerd[2022]: 2026-03-07 00:56:23.960 [INFO][5955] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, 
ignoring. ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" iface="eth0" netns="" Mar 7 00:56:24.040823 containerd[2022]: 2026-03-07 00:56:23.960 [INFO][5955] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Mar 7 00:56:24.040823 containerd[2022]: 2026-03-07 00:56:23.960 [INFO][5955] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Mar 7 00:56:24.040823 containerd[2022]: 2026-03-07 00:56:24.015 [INFO][5962] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" HandleID="k8s-pod-network.092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0" Mar 7 00:56:24.040823 containerd[2022]: 2026-03-07 00:56:24.015 [INFO][5962] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:24.040823 containerd[2022]: 2026-03-07 00:56:24.015 [INFO][5962] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:24.040823 containerd[2022]: 2026-03-07 00:56:24.030 [WARNING][5962] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" HandleID="k8s-pod-network.092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0" Mar 7 00:56:24.040823 containerd[2022]: 2026-03-07 00:56:24.031 [INFO][5962] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" HandleID="k8s-pod-network.092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Workload="ip--172--31--23--166-k8s-calico--apiserver--7fc9dd7d97--2zgnf-eth0" Mar 7 00:56:24.040823 containerd[2022]: 2026-03-07 00:56:24.033 [INFO][5962] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:24.040823 containerd[2022]: 2026-03-07 00:56:24.037 [INFO][5955] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e" Mar 7 00:56:24.042776 containerd[2022]: time="2026-03-07T00:56:24.042569118Z" level=info msg="TearDown network for sandbox \"092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e\" successfully" Mar 7 00:56:24.051816 containerd[2022]: time="2026-03-07T00:56:24.051736590Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 00:56:24.052298 containerd[2022]: time="2026-03-07T00:56:24.051854106Z" level=info msg="RemovePodSandbox \"092946ccf9755f84047bf171b8d66fc53b39b013e703fc4cfe0e8cb319bc312e\" returns successfully" Mar 7 00:56:24.052753 containerd[2022]: time="2026-03-07T00:56:24.052636674Z" level=info msg="StopPodSandbox for \"7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb\"" Mar 7 00:56:24.278318 containerd[2022]: 2026-03-07 00:56:24.197 [WARNING][5978] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"6ae6eb1c-c1a0-42a2-9357-aea0ab8fecf0", ResourceVersion:"951", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b", Pod:"goldmane-9f7667bb8-plrrb", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.93.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calice04295ffc4", 
MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:24.278318 containerd[2022]: 2026-03-07 00:56:24.198 [INFO][5978] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Mar 7 00:56:24.278318 containerd[2022]: 2026-03-07 00:56:24.198 [INFO][5978] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" iface="eth0" netns="" Mar 7 00:56:24.278318 containerd[2022]: 2026-03-07 00:56:24.198 [INFO][5978] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Mar 7 00:56:24.278318 containerd[2022]: 2026-03-07 00:56:24.198 [INFO][5978] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Mar 7 00:56:24.278318 containerd[2022]: 2026-03-07 00:56:24.249 [INFO][5995] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" HandleID="k8s-pod-network.7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Workload="ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0" Mar 7 00:56:24.278318 containerd[2022]: 2026-03-07 00:56:24.249 [INFO][5995] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:24.278318 containerd[2022]: 2026-03-07 00:56:24.250 [INFO][5995] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:24.278318 containerd[2022]: 2026-03-07 00:56:24.266 [WARNING][5995] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" HandleID="k8s-pod-network.7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Workload="ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0" Mar 7 00:56:24.278318 containerd[2022]: 2026-03-07 00:56:24.266 [INFO][5995] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" HandleID="k8s-pod-network.7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Workload="ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0" Mar 7 00:56:24.278318 containerd[2022]: 2026-03-07 00:56:24.270 [INFO][5995] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:24.278318 containerd[2022]: 2026-03-07 00:56:24.274 [INFO][5978] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Mar 7 00:56:24.279944 containerd[2022]: time="2026-03-07T00:56:24.278340427Z" level=info msg="TearDown network for sandbox \"7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb\" successfully" Mar 7 00:56:24.279944 containerd[2022]: time="2026-03-07T00:56:24.278401507Z" level=info msg="StopPodSandbox for \"7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb\" returns successfully" Mar 7 00:56:24.280209 containerd[2022]: time="2026-03-07T00:56:24.280130179Z" level=info msg="RemovePodSandbox for \"7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb\"" Mar 7 00:56:24.280579 containerd[2022]: time="2026-03-07T00:56:24.280191547Z" level=info msg="Forcibly stopping sandbox \"7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb\"" Mar 7 00:56:24.439175 containerd[2022]: 2026-03-07 00:56:24.365 [WARNING][6010] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"6ae6eb1c-c1a0-42a2-9357-aea0ab8fecf0", ResourceVersion:"951", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b", Pod:"goldmane-9f7667bb8-plrrb", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.93.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calice04295ffc4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:24.439175 containerd[2022]: 2026-03-07 00:56:24.365 [INFO][6010] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Mar 7 00:56:24.439175 containerd[2022]: 2026-03-07 00:56:24.366 [INFO][6010] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" iface="eth0" netns="" Mar 7 00:56:24.439175 containerd[2022]: 2026-03-07 00:56:24.366 [INFO][6010] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Mar 7 00:56:24.439175 containerd[2022]: 2026-03-07 00:56:24.366 [INFO][6010] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Mar 7 00:56:24.439175 containerd[2022]: 2026-03-07 00:56:24.413 [INFO][6017] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" HandleID="k8s-pod-network.7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Workload="ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0" Mar 7 00:56:24.439175 containerd[2022]: 2026-03-07 00:56:24.413 [INFO][6017] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:24.439175 containerd[2022]: 2026-03-07 00:56:24.413 [INFO][6017] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:24.439175 containerd[2022]: 2026-03-07 00:56:24.429 [WARNING][6017] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" HandleID="k8s-pod-network.7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Workload="ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0" Mar 7 00:56:24.439175 containerd[2022]: 2026-03-07 00:56:24.429 [INFO][6017] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" HandleID="k8s-pod-network.7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Workload="ip--172--31--23--166-k8s-goldmane--9f7667bb8--plrrb-eth0" Mar 7 00:56:24.439175 containerd[2022]: 2026-03-07 00:56:24.432 [INFO][6017] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:24.439175 containerd[2022]: 2026-03-07 00:56:24.435 [INFO][6010] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb" Mar 7 00:56:24.440023 containerd[2022]: time="2026-03-07T00:56:24.439258880Z" level=info msg="TearDown network for sandbox \"7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb\" successfully" Mar 7 00:56:24.446432 containerd[2022]: time="2026-03-07T00:56:24.446302556Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 00:56:24.446577 containerd[2022]: time="2026-03-07T00:56:24.446475992Z" level=info msg="RemovePodSandbox \"7ca7d41c7246ca07698d6f8262135ea3cbee58a59c34377d8c1382db9633cbeb\" returns successfully" Mar 7 00:56:24.448112 containerd[2022]: time="2026-03-07T00:56:24.448019948Z" level=info msg="StopPodSandbox for \"522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341\"" Mar 7 00:56:24.598920 containerd[2022]: 2026-03-07 00:56:24.528 [WARNING][6032] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" WorkloadEndpoint="ip--172--31--23--166-k8s-whisker--7cb7dc8756--k6flf-eth0" Mar 7 00:56:24.598920 containerd[2022]: 2026-03-07 00:56:24.528 [INFO][6032] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Mar 7 00:56:24.598920 containerd[2022]: 2026-03-07 00:56:24.528 [INFO][6032] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" iface="eth0" netns="" Mar 7 00:56:24.598920 containerd[2022]: 2026-03-07 00:56:24.528 [INFO][6032] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Mar 7 00:56:24.598920 containerd[2022]: 2026-03-07 00:56:24.528 [INFO][6032] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Mar 7 00:56:24.598920 containerd[2022]: 2026-03-07 00:56:24.573 [INFO][6039] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" HandleID="k8s-pod-network.522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Workload="ip--172--31--23--166-k8s-whisker--7cb7dc8756--k6flf-eth0" Mar 7 00:56:24.598920 containerd[2022]: 2026-03-07 00:56:24.573 [INFO][6039] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:24.598920 containerd[2022]: 2026-03-07 00:56:24.573 [INFO][6039] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:24.598920 containerd[2022]: 2026-03-07 00:56:24.588 [WARNING][6039] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" HandleID="k8s-pod-network.522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Workload="ip--172--31--23--166-k8s-whisker--7cb7dc8756--k6flf-eth0" Mar 7 00:56:24.598920 containerd[2022]: 2026-03-07 00:56:24.588 [INFO][6039] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" HandleID="k8s-pod-network.522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Workload="ip--172--31--23--166-k8s-whisker--7cb7dc8756--k6flf-eth0" Mar 7 00:56:24.598920 containerd[2022]: 2026-03-07 00:56:24.591 [INFO][6039] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:24.598920 containerd[2022]: 2026-03-07 00:56:24.595 [INFO][6032] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Mar 7 00:56:24.598920 containerd[2022]: time="2026-03-07T00:56:24.598873953Z" level=info msg="TearDown network for sandbox \"522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341\" successfully" Mar 7 00:56:24.598920 containerd[2022]: time="2026-03-07T00:56:24.598914897Z" level=info msg="StopPodSandbox for \"522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341\" returns successfully" Mar 7 00:56:24.600778 containerd[2022]: time="2026-03-07T00:56:24.600459393Z" level=info msg="RemovePodSandbox for \"522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341\"" Mar 7 00:56:24.601375 containerd[2022]: time="2026-03-07T00:56:24.600860445Z" level=info msg="Forcibly stopping sandbox \"522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341\"" Mar 7 00:56:24.760609 containerd[2022]: 2026-03-07 00:56:24.675 [WARNING][6053] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" WorkloadEndpoint="ip--172--31--23--166-k8s-whisker--7cb7dc8756--k6flf-eth0" Mar 7 00:56:24.760609 containerd[2022]: 2026-03-07 00:56:24.676 [INFO][6053] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Mar 7 00:56:24.760609 containerd[2022]: 2026-03-07 00:56:24.676 [INFO][6053] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" iface="eth0" netns="" Mar 7 00:56:24.760609 containerd[2022]: 2026-03-07 00:56:24.676 [INFO][6053] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Mar 7 00:56:24.760609 containerd[2022]: 2026-03-07 00:56:24.676 [INFO][6053] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Mar 7 00:56:24.760609 containerd[2022]: 2026-03-07 00:56:24.735 [INFO][6060] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" HandleID="k8s-pod-network.522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Workload="ip--172--31--23--166-k8s-whisker--7cb7dc8756--k6flf-eth0" Mar 7 00:56:24.760609 containerd[2022]: 2026-03-07 00:56:24.736 [INFO][6060] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:24.760609 containerd[2022]: 2026-03-07 00:56:24.736 [INFO][6060] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:24.760609 containerd[2022]: 2026-03-07 00:56:24.751 [WARNING][6060] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" HandleID="k8s-pod-network.522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Workload="ip--172--31--23--166-k8s-whisker--7cb7dc8756--k6flf-eth0" Mar 7 00:56:24.760609 containerd[2022]: 2026-03-07 00:56:24.751 [INFO][6060] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" HandleID="k8s-pod-network.522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Workload="ip--172--31--23--166-k8s-whisker--7cb7dc8756--k6flf-eth0" Mar 7 00:56:24.760609 containerd[2022]: 2026-03-07 00:56:24.753 [INFO][6060] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:24.760609 containerd[2022]: 2026-03-07 00:56:24.756 [INFO][6053] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341" Mar 7 00:56:24.763888 containerd[2022]: time="2026-03-07T00:56:24.761675230Z" level=info msg="TearDown network for sandbox \"522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341\" successfully" Mar 7 00:56:24.771232 containerd[2022]: time="2026-03-07T00:56:24.771087478Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 00:56:24.771407 containerd[2022]: time="2026-03-07T00:56:24.771244186Z" level=info msg="RemovePodSandbox \"522df2ee9f3979e57c2c729ca928c52378c38102942752508405182f55ef7341\" returns successfully" Mar 7 00:56:24.772498 containerd[2022]: time="2026-03-07T00:56:24.771952474Z" level=info msg="StopPodSandbox for \"c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0\"" Mar 7 00:56:24.925684 containerd[2022]: 2026-03-07 00:56:24.851 [WARNING][6074] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"d42faf4e-65f6-4396-aa30-a3b1cd2cc8f9", ResourceVersion:"1034", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a", Pod:"coredns-7d764666f9-xw2lb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.93.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calida70c04e092", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:24.925684 containerd[2022]: 2026-03-07 00:56:24.852 [INFO][6074] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Mar 7 00:56:24.925684 containerd[2022]: 2026-03-07 00:56:24.852 [INFO][6074] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" iface="eth0" netns="" Mar 7 00:56:24.925684 containerd[2022]: 2026-03-07 00:56:24.852 [INFO][6074] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Mar 7 00:56:24.925684 containerd[2022]: 2026-03-07 00:56:24.852 [INFO][6074] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Mar 7 00:56:24.925684 containerd[2022]: 2026-03-07 00:56:24.900 [INFO][6082] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" HandleID="k8s-pod-network.c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0" Mar 7 00:56:24.925684 containerd[2022]: 2026-03-07 00:56:24.900 [INFO][6082] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:24.925684 containerd[2022]: 2026-03-07 00:56:24.901 [INFO][6082] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:24.925684 containerd[2022]: 2026-03-07 00:56:24.915 [WARNING][6082] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" HandleID="k8s-pod-network.c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0" Mar 7 00:56:24.925684 containerd[2022]: 2026-03-07 00:56:24.915 [INFO][6082] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" HandleID="k8s-pod-network.c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0" Mar 7 00:56:24.925684 containerd[2022]: 2026-03-07 00:56:24.918 [INFO][6082] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:24.925684 containerd[2022]: 2026-03-07 00:56:24.922 [INFO][6074] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Mar 7 00:56:24.925684 containerd[2022]: time="2026-03-07T00:56:24.925478723Z" level=info msg="TearDown network for sandbox \"c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0\" successfully" Mar 7 00:56:24.925684 containerd[2022]: time="2026-03-07T00:56:24.925655087Z" level=info msg="StopPodSandbox for \"c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0\" returns successfully" Mar 7 00:56:24.926690 containerd[2022]: time="2026-03-07T00:56:24.926544899Z" level=info msg="RemovePodSandbox for \"c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0\"" Mar 7 00:56:24.926690 containerd[2022]: time="2026-03-07T00:56:24.926596043Z" level=info msg="Forcibly stopping sandbox \"c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0\"" Mar 7 00:56:25.117728 containerd[2022]: 2026-03-07 00:56:25.004 [WARNING][6096] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"d42faf4e-65f6-4396-aa30-a3b1cd2cc8f9", ResourceVersion:"1034", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"746b6e10bdc78369d888eb4f089bf86d8e639620c42c51b667347c2999a3956a", Pod:"coredns-7d764666f9-xw2lb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.93.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calida70c04e092", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:25.117728 containerd[2022]: 2026-03-07 00:56:25.005 [INFO][6096] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Mar 7 00:56:25.117728 containerd[2022]: 2026-03-07 00:56:25.005 [INFO][6096] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" iface="eth0" netns="" Mar 7 00:56:25.117728 containerd[2022]: 2026-03-07 00:56:25.005 [INFO][6096] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Mar 7 00:56:25.117728 containerd[2022]: 2026-03-07 00:56:25.005 [INFO][6096] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Mar 7 00:56:25.117728 containerd[2022]: 2026-03-07 00:56:25.061 [INFO][6103] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" HandleID="k8s-pod-network.c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0" Mar 7 00:56:25.117728 containerd[2022]: 2026-03-07 00:56:25.062 [INFO][6103] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:25.117728 containerd[2022]: 2026-03-07 00:56:25.062 [INFO][6103] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:25.117728 containerd[2022]: 2026-03-07 00:56:25.091 [WARNING][6103] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" HandleID="k8s-pod-network.c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0" Mar 7 00:56:25.117728 containerd[2022]: 2026-03-07 00:56:25.091 [INFO][6103] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" HandleID="k8s-pod-network.c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xw2lb-eth0" Mar 7 00:56:25.117728 containerd[2022]: 2026-03-07 00:56:25.094 [INFO][6103] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:25.117728 containerd[2022]: 2026-03-07 00:56:25.109 [INFO][6096] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0" Mar 7 00:56:25.119576 containerd[2022]: time="2026-03-07T00:56:25.119012180Z" level=info msg="TearDown network for sandbox \"c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0\" successfully" Mar 7 00:56:25.135140 containerd[2022]: time="2026-03-07T00:56:25.134537420Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 00:56:25.135140 containerd[2022]: time="2026-03-07T00:56:25.134633912Z" level=info msg="RemovePodSandbox \"c50b6735e77ed3a3af3cd73513ce497a423cd97b19a606798f84200f5e252bb0\" returns successfully" Mar 7 00:56:25.135461 containerd[2022]: time="2026-03-07T00:56:25.135337796Z" level=info msg="StopPodSandbox for \"138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42\"" Mar 7 00:56:25.406581 containerd[2022]: 2026-03-07 00:56:25.268 [WARNING][6121] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"6a33fd83-0f26-4bf6-bbf7-f61c96127d50", ResourceVersion:"1030", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63", Pod:"coredns-7d764666f9-xg55x", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.93.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califf0debc28c7", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:25.406581 containerd[2022]: 2026-03-07 00:56:25.273 [INFO][6121] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Mar 7 00:56:25.406581 containerd[2022]: 2026-03-07 00:56:25.273 [INFO][6121] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" iface="eth0" netns="" Mar 7 00:56:25.406581 containerd[2022]: 2026-03-07 00:56:25.273 [INFO][6121] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Mar 7 00:56:25.406581 containerd[2022]: 2026-03-07 00:56:25.273 [INFO][6121] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Mar 7 00:56:25.406581 containerd[2022]: 2026-03-07 00:56:25.361 [INFO][6129] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" HandleID="k8s-pod-network.138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0" Mar 7 00:56:25.406581 containerd[2022]: 2026-03-07 00:56:25.361 [INFO][6129] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:25.406581 containerd[2022]: 2026-03-07 00:56:25.361 [INFO][6129] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:25.406581 containerd[2022]: 2026-03-07 00:56:25.385 [WARNING][6129] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" HandleID="k8s-pod-network.138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0" Mar 7 00:56:25.406581 containerd[2022]: 2026-03-07 00:56:25.385 [INFO][6129] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" HandleID="k8s-pod-network.138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0" Mar 7 00:56:25.406581 containerd[2022]: 2026-03-07 00:56:25.392 [INFO][6129] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:25.406581 containerd[2022]: 2026-03-07 00:56:25.397 [INFO][6121] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Mar 7 00:56:25.408743 containerd[2022]: time="2026-03-07T00:56:25.406584597Z" level=info msg="TearDown network for sandbox \"138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42\" successfully" Mar 7 00:56:25.408743 containerd[2022]: time="2026-03-07T00:56:25.406626345Z" level=info msg="StopPodSandbox for \"138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42\" returns successfully" Mar 7 00:56:25.410154 containerd[2022]: time="2026-03-07T00:56:25.409263009Z" level=info msg="RemovePodSandbox for \"138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42\"" Mar 7 00:56:25.410154 containerd[2022]: time="2026-03-07T00:56:25.409650705Z" level=info msg="Forcibly stopping sandbox \"138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42\"" Mar 7 00:56:25.618573 containerd[2022]: 2026-03-07 00:56:25.515 [WARNING][6144] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"6a33fd83-0f26-4bf6-bbf7-f61c96127d50", ResourceVersion:"1030", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 55, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-166", ContainerID:"878c79028bc04f3d46d1c876b7677828e0ebf6de901f7ece335d1c3a9efeac63", Pod:"coredns-7d764666f9-xg55x", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.93.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califf0debc28c7", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:25.618573 containerd[2022]: 2026-03-07 00:56:25.516 [INFO][6144] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Mar 7 00:56:25.618573 containerd[2022]: 2026-03-07 00:56:25.516 [INFO][6144] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" iface="eth0" netns="" Mar 7 00:56:25.618573 containerd[2022]: 2026-03-07 00:56:25.516 [INFO][6144] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Mar 7 00:56:25.618573 containerd[2022]: 2026-03-07 00:56:25.516 [INFO][6144] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Mar 7 00:56:25.618573 containerd[2022]: 2026-03-07 00:56:25.583 [INFO][6151] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" HandleID="k8s-pod-network.138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0" Mar 7 00:56:25.618573 containerd[2022]: 2026-03-07 00:56:25.583 [INFO][6151] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:25.618573 containerd[2022]: 2026-03-07 00:56:25.583 [INFO][6151] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:25.618573 containerd[2022]: 2026-03-07 00:56:25.603 [WARNING][6151] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" HandleID="k8s-pod-network.138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0" Mar 7 00:56:25.618573 containerd[2022]: 2026-03-07 00:56:25.603 [INFO][6151] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" HandleID="k8s-pod-network.138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Workload="ip--172--31--23--166-k8s-coredns--7d764666f9--xg55x-eth0" Mar 7 00:56:25.618573 containerd[2022]: 2026-03-07 00:56:25.606 [INFO][6151] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:25.618573 containerd[2022]: 2026-03-07 00:56:25.613 [INFO][6144] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42" Mar 7 00:56:25.620151 containerd[2022]: time="2026-03-07T00:56:25.618725470Z" level=info msg="TearDown network for sandbox \"138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42\" successfully" Mar 7 00:56:25.633045 containerd[2022]: time="2026-03-07T00:56:25.632835082Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 7 00:56:25.633045 containerd[2022]: time="2026-03-07T00:56:25.632933566Z" level=info msg="RemovePodSandbox \"138f948a38d5a224115135325d94ae632970882d6c255b71ab50ba652994ef42\" returns successfully" Mar 7 00:56:25.961221 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1069206393.mount: Deactivated successfully. 
Mar 7 00:56:26.596082 containerd[2022]: time="2026-03-07T00:56:26.595997075Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:26.600268 containerd[2022]: time="2026-03-07T00:56:26.600198815Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=51613980" Mar 7 00:56:26.602051 containerd[2022]: time="2026-03-07T00:56:26.601966559Z" level=info msg="ImageCreate event name:\"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:26.609972 containerd[2022]: time="2026-03-07T00:56:26.609332255Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:26.611283 containerd[2022]: time="2026-03-07T00:56:26.611006999Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"51613826\" in 6.581876097s" Mar 7 00:56:26.611283 containerd[2022]: time="2026-03-07T00:56:26.611071835Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\"" Mar 7 00:56:26.614626 containerd[2022]: time="2026-03-07T00:56:26.614351483Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 7 00:56:26.622960 containerd[2022]: time="2026-03-07T00:56:26.622858607Z" level=info msg="CreateContainer within sandbox \"a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b\" 
for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Mar 7 00:56:26.651680 containerd[2022]: time="2026-03-07T00:56:26.651509735Z" level=info msg="CreateContainer within sandbox \"a1156d78fdc78391507319b52dcda199044d85625a1e8f0d4a7aa7f081059a9b\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"bc359948225cd0ec8fd046a44ac027575887a95feed98e43a3662a1c66d92ff3\"" Mar 7 00:56:26.653803 containerd[2022]: time="2026-03-07T00:56:26.652522115Z" level=info msg="StartContainer for \"bc359948225cd0ec8fd046a44ac027575887a95feed98e43a3662a1c66d92ff3\"" Mar 7 00:56:26.747098 systemd[1]: Started cri-containerd-bc359948225cd0ec8fd046a44ac027575887a95feed98e43a3662a1c66d92ff3.scope - libcontainer container bc359948225cd0ec8fd046a44ac027575887a95feed98e43a3662a1c66d92ff3. Mar 7 00:56:26.839120 containerd[2022]: time="2026-03-07T00:56:26.839062788Z" level=info msg="StartContainer for \"bc359948225cd0ec8fd046a44ac027575887a95feed98e43a3662a1c66d92ff3\" returns successfully" Mar 7 00:56:26.966372 containerd[2022]: time="2026-03-07T00:56:26.966285133Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:26.969218 containerd[2022]: time="2026-03-07T00:56:26.968128117Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Mar 7 00:56:26.973464 containerd[2022]: time="2026-03-07T00:56:26.973399933Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 358.558262ms" Mar 7 00:56:26.973464 containerd[2022]: time="2026-03-07T00:56:26.973466149Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Mar 7 00:56:26.976998 containerd[2022]: time="2026-03-07T00:56:26.976034053Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 7 00:56:26.986771 containerd[2022]: time="2026-03-07T00:56:26.986708533Z" level=info msg="CreateContainer within sandbox \"5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 7 00:56:27.016026 containerd[2022]: time="2026-03-07T00:56:27.015907137Z" level=info msg="CreateContainer within sandbox \"5af5df7476481ca8689bf47b81da0cae607c82af6ed5ab68273431f783672202\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"8f5750faf08ecd59d6812ea453dfe310078f164504216d96ccf911f3272e7a97\"" Mar 7 00:56:27.018653 containerd[2022]: time="2026-03-07T00:56:27.018598929Z" level=info msg="StartContainer for \"8f5750faf08ecd59d6812ea453dfe310078f164504216d96ccf911f3272e7a97\"" Mar 7 00:56:27.070719 systemd[1]: Started cri-containerd-8f5750faf08ecd59d6812ea453dfe310078f164504216d96ccf911f3272e7a97.scope - libcontainer container 8f5750faf08ecd59d6812ea453dfe310078f164504216d96ccf911f3272e7a97. 
Mar 7 00:56:27.151437 containerd[2022]: time="2026-03-07T00:56:27.151327462Z" level=info msg="StartContainer for \"8f5750faf08ecd59d6812ea453dfe310078f164504216d96ccf911f3272e7a97\" returns successfully" Mar 7 00:56:27.233645 kubelet[3252]: I0307 00:56:27.233450 3252 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-7fc9dd7d97-2bt9t" podStartSLOduration=33.784927699 podStartE2EDuration="39.232085566s" podCreationTimestamp="2026-03-07 00:55:48 +0000 UTC" firstStartedPulling="2026-03-07 00:56:14.581517215 +0000 UTC m=+52.574731858" lastFinishedPulling="2026-03-07 00:56:20.028675058 +0000 UTC m=+58.021889725" observedRunningTime="2026-03-07 00:56:21.127161556 +0000 UTC m=+59.120376211" watchObservedRunningTime="2026-03-07 00:56:27.232085566 +0000 UTC m=+65.225300233" Mar 7 00:56:27.273153 kubelet[3252]: I0307 00:56:27.273004 3252 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-7fc9dd7d97-2zgnf" podStartSLOduration=27.527010671 podStartE2EDuration="39.272979118s" podCreationTimestamp="2026-03-07 00:55:48 +0000 UTC" firstStartedPulling="2026-03-07 00:56:15.22958143 +0000 UTC m=+53.222796061" lastFinishedPulling="2026-03-07 00:56:26.975549781 +0000 UTC m=+64.968764508" observedRunningTime="2026-03-07 00:56:27.237584362 +0000 UTC m=+65.230799017" watchObservedRunningTime="2026-03-07 00:56:27.272979118 +0000 UTC m=+65.266193785" Mar 7 00:56:28.211216 kubelet[3252]: I0307 00:56:28.211143 3252 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 7 00:56:28.801150 systemd[1]: Started sshd@10-172.31.23.166:22-20.161.92.111:41562.service - OpenSSH per-connection server daemon (20.161.92.111:41562). 
Mar 7 00:56:29.342087 sshd[6297]: Accepted publickey for core from 20.161.92.111 port 41562 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:56:29.353911 systemd[1]: run-containerd-runc-k8s.io-bc359948225cd0ec8fd046a44ac027575887a95feed98e43a3662a1c66d92ff3-runc.judDmB.mount: Deactivated successfully. Mar 7 00:56:29.355145 sshd[6297]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:29.370953 systemd-logind[2004]: New session 10 of user core. Mar 7 00:56:29.385771 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 7 00:56:29.998295 sshd[6297]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:30.012005 systemd[1]: sshd@10-172.31.23.166:22-20.161.92.111:41562.service: Deactivated successfully. Mar 7 00:56:30.023806 systemd[1]: session-10.scope: Deactivated successfully. Mar 7 00:56:30.034844 systemd-logind[2004]: Session 10 logged out. Waiting for processes to exit. Mar 7 00:56:30.040232 systemd-logind[2004]: Removed session 10. 
Mar 7 00:56:30.811612 containerd[2022]: time="2026-03-07T00:56:30.811517548Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:30.814764 containerd[2022]: time="2026-03-07T00:56:30.814677064Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=49189955" Mar 7 00:56:30.817352 containerd[2022]: time="2026-03-07T00:56:30.817265776Z" level=info msg="ImageCreate event name:\"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:30.831441 containerd[2022]: time="2026-03-07T00:56:30.830261728Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:30.832706 containerd[2022]: time="2026-03-07T00:56:30.832644256Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"50587448\" in 3.856542655s" Mar 7 00:56:30.832921 containerd[2022]: time="2026-03-07T00:56:30.832886464Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\"" Mar 7 00:56:30.835163 containerd[2022]: time="2026-03-07T00:56:30.835090024Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 7 00:56:30.878894 containerd[2022]: time="2026-03-07T00:56:30.878453680Z" level=info msg="CreateContainer within sandbox 
\"6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 7 00:56:30.906264 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount387273306.mount: Deactivated successfully. Mar 7 00:56:30.913854 containerd[2022]: time="2026-03-07T00:56:30.911585908Z" level=info msg="CreateContainer within sandbox \"6aeca00fc365992e8a31a56331ffeb0da3cda5404d596208e8ad79aa9ae79dad\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"2672b27580849af899567a0927144f15d1ac245455e03fcccbbcbf310860b0fb\"" Mar 7 00:56:30.916131 containerd[2022]: time="2026-03-07T00:56:30.915253372Z" level=info msg="StartContainer for \"2672b27580849af899567a0927144f15d1ac245455e03fcccbbcbf310860b0fb\"" Mar 7 00:56:31.001721 systemd[1]: Started cri-containerd-2672b27580849af899567a0927144f15d1ac245455e03fcccbbcbf310860b0fb.scope - libcontainer container 2672b27580849af899567a0927144f15d1ac245455e03fcccbbcbf310860b0fb. 
Mar 7 00:56:31.090043 containerd[2022]: time="2026-03-07T00:56:31.089708557Z" level=info msg="StartContainer for \"2672b27580849af899567a0927144f15d1ac245455e03fcccbbcbf310860b0fb\" returns successfully" Mar 7 00:56:31.274045 kubelet[3252]: I0307 00:56:31.272860 3252 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7dd69967b5-wbk4v" podStartSLOduration=25.817310797 podStartE2EDuration="41.272811146s" podCreationTimestamp="2026-03-07 00:55:50 +0000 UTC" firstStartedPulling="2026-03-07 00:56:15.378859979 +0000 UTC m=+53.372074622" lastFinishedPulling="2026-03-07 00:56:30.834360316 +0000 UTC m=+68.827574971" observedRunningTime="2026-03-07 00:56:31.272273666 +0000 UTC m=+69.265488405" watchObservedRunningTime="2026-03-07 00:56:31.272811146 +0000 UTC m=+69.266025789" Mar 7 00:56:31.274045 kubelet[3252]: I0307 00:56:31.273324 3252 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/goldmane-9f7667bb8-plrrb" podStartSLOduration=32.554950828 podStartE2EDuration="44.273312086s" podCreationTimestamp="2026-03-07 00:55:47 +0000 UTC" firstStartedPulling="2026-03-07 00:56:14.895093249 +0000 UTC m=+52.888307892" lastFinishedPulling="2026-03-07 00:56:26.613454507 +0000 UTC m=+64.606669150" observedRunningTime="2026-03-07 00:56:27.273930634 +0000 UTC m=+65.267145301" watchObservedRunningTime="2026-03-07 00:56:31.273312086 +0000 UTC m=+69.266526753" Mar 7 00:56:32.315364 containerd[2022]: time="2026-03-07T00:56:32.315286335Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:32.320433 containerd[2022]: time="2026-03-07T00:56:32.319247235Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=5882804" Mar 7 00:56:32.325358 containerd[2022]: time="2026-03-07T00:56:32.324548931Z" level=info msg="ImageCreate event 
name:\"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:32.331062 containerd[2022]: time="2026-03-07T00:56:32.330980643Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:32.336169 containerd[2022]: time="2026-03-07T00:56:32.336091875Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7280321\" in 1.498744267s" Mar 7 00:56:32.336169 containerd[2022]: time="2026-03-07T00:56:32.336154767Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\"" Mar 7 00:56:32.342420 containerd[2022]: time="2026-03-07T00:56:32.341234787Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Mar 7 00:56:32.354442 containerd[2022]: time="2026-03-07T00:56:32.354199024Z" level=info msg="CreateContainer within sandbox \"7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 7 00:56:32.402435 containerd[2022]: time="2026-03-07T00:56:32.402331456Z" level=info msg="CreateContainer within sandbox \"7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"1bae71ab772dfc102541fb203d8ee85ff42b3cc16d7bf8f6fc03606bdb5bc8da\"" Mar 7 00:56:32.404672 containerd[2022]: time="2026-03-07T00:56:32.404546860Z" level=info msg="StartContainer for 
\"1bae71ab772dfc102541fb203d8ee85ff42b3cc16d7bf8f6fc03606bdb5bc8da\"" Mar 7 00:56:32.472739 systemd[1]: Started cri-containerd-1bae71ab772dfc102541fb203d8ee85ff42b3cc16d7bf8f6fc03606bdb5bc8da.scope - libcontainer container 1bae71ab772dfc102541fb203d8ee85ff42b3cc16d7bf8f6fc03606bdb5bc8da. Mar 7 00:56:32.561200 containerd[2022]: time="2026-03-07T00:56:32.561010709Z" level=info msg="StartContainer for \"1bae71ab772dfc102541fb203d8ee85ff42b3cc16d7bf8f6fc03606bdb5bc8da\" returns successfully" Mar 7 00:56:33.909908 containerd[2022]: time="2026-03-07T00:56:33.909821299Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:33.913636 containerd[2022]: time="2026-03-07T00:56:33.913573855Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=13766291" Mar 7 00:56:33.915991 containerd[2022]: time="2026-03-07T00:56:33.915908107Z" level=info msg="ImageCreate event name:\"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:33.922422 containerd[2022]: time="2026-03-07T00:56:33.921940195Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:33.924222 containerd[2022]: time="2026-03-07T00:56:33.923600791Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"15163768\" in 1.582283204s" Mar 7 00:56:33.924222 
containerd[2022]: time="2026-03-07T00:56:33.923669311Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\"" Mar 7 00:56:33.928255 containerd[2022]: time="2026-03-07T00:56:33.927200887Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 7 00:56:33.934962 containerd[2022]: time="2026-03-07T00:56:33.934878943Z" level=info msg="CreateContainer within sandbox \"8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 7 00:56:33.975699 containerd[2022]: time="2026-03-07T00:56:33.975087068Z" level=info msg="CreateContainer within sandbox \"8c238a91cb6cf02702a6e6f502c7c1e5a6e6498982a2e838b72b788f4f904f8a\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"eb777b8b47030b2c1e68b9e92c1cb7f338ef5a61adbdd77c5adb913e45ed9b8c\"" Mar 7 00:56:33.978577 containerd[2022]: time="2026-03-07T00:56:33.977688608Z" level=info msg="StartContainer for \"eb777b8b47030b2c1e68b9e92c1cb7f338ef5a61adbdd77c5adb913e45ed9b8c\"" Mar 7 00:56:34.068108 systemd[1]: Started cri-containerd-eb777b8b47030b2c1e68b9e92c1cb7f338ef5a61adbdd77c5adb913e45ed9b8c.scope - libcontainer container eb777b8b47030b2c1e68b9e92c1cb7f338ef5a61adbdd77c5adb913e45ed9b8c. 
Mar 7 00:56:34.136112 containerd[2022]: time="2026-03-07T00:56:34.136016320Z" level=info msg="StartContainer for \"eb777b8b47030b2c1e68b9e92c1cb7f338ef5a61adbdd77c5adb913e45ed9b8c\" returns successfully" Mar 7 00:56:34.276208 kubelet[3252]: I0307 00:56:34.275942 3252 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/csi-node-driver-tgsjl" podStartSLOduration=22.70169893 podStartE2EDuration="44.275919113s" podCreationTimestamp="2026-03-07 00:55:50 +0000 UTC" firstStartedPulling="2026-03-07 00:56:12.351892592 +0000 UTC m=+50.345107235" lastFinishedPulling="2026-03-07 00:56:33.926112787 +0000 UTC m=+71.919327418" observedRunningTime="2026-03-07 00:56:34.273696449 +0000 UTC m=+72.266911104" watchObservedRunningTime="2026-03-07 00:56:34.275919113 +0000 UTC m=+72.269133756" Mar 7 00:56:34.517148 kubelet[3252]: I0307 00:56:34.517087 3252 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 7 00:56:34.517335 kubelet[3252]: I0307 00:56:34.517173 3252 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 7 00:56:35.092913 systemd[1]: Started sshd@11-172.31.23.166:22-20.161.92.111:59544.service - OpenSSH per-connection server daemon (20.161.92.111:59544). Mar 7 00:56:35.629210 sshd[6509]: Accepted publickey for core from 20.161.92.111 port 59544 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:56:35.635994 sshd[6509]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:35.652192 systemd-logind[2004]: New session 11 of user core. Mar 7 00:56:35.660067 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 7 00:56:35.872833 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2252762000.mount: Deactivated successfully. 
Mar 7 00:56:35.922905 containerd[2022]: time="2026-03-07T00:56:35.922702797Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:35.925450 containerd[2022]: time="2026-03-07T00:56:35.925060437Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=16426594" Mar 7 00:56:35.929225 containerd[2022]: time="2026-03-07T00:56:35.929127465Z" level=info msg="ImageCreate event name:\"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:35.936931 containerd[2022]: time="2026-03-07T00:56:35.936807441Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:35.943515 containerd[2022]: time="2026-03-07T00:56:35.942883017Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"16426424\" in 2.015590798s" Mar 7 00:56:35.944779 containerd[2022]: time="2026-03-07T00:56:35.943922877Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\"" Mar 7 00:56:35.961741 containerd[2022]: time="2026-03-07T00:56:35.961485369Z" level=info msg="CreateContainer within sandbox \"7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 7 00:56:35.991177 
containerd[2022]: time="2026-03-07T00:56:35.990829882Z" level=info msg="CreateContainer within sandbox \"7cb1af9f888e82a9a51826f30862a289ea89cab3f165d1edc0454f0b0c9fc3b1\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"0f88a152cf6ee29eea18abe4924a3e6cea4d693c7a9e7d88d400ae95c56ffe3c\"" Mar 7 00:56:35.995008 containerd[2022]: time="2026-03-07T00:56:35.994497514Z" level=info msg="StartContainer for \"0f88a152cf6ee29eea18abe4924a3e6cea4d693c7a9e7d88d400ae95c56ffe3c\"" Mar 7 00:56:36.110793 systemd[1]: Started cri-containerd-0f88a152cf6ee29eea18abe4924a3e6cea4d693c7a9e7d88d400ae95c56ffe3c.scope - libcontainer container 0f88a152cf6ee29eea18abe4924a3e6cea4d693c7a9e7d88d400ae95c56ffe3c. Mar 7 00:56:36.253506 containerd[2022]: time="2026-03-07T00:56:36.252538243Z" level=info msg="StartContainer for \"0f88a152cf6ee29eea18abe4924a3e6cea4d693c7a9e7d88d400ae95c56ffe3c\" returns successfully" Mar 7 00:56:36.260526 sshd[6509]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:36.270863 systemd[1]: sshd@11-172.31.23.166:22-20.161.92.111:59544.service: Deactivated successfully. Mar 7 00:56:36.280870 systemd[1]: session-11.scope: Deactivated successfully. Mar 7 00:56:36.294234 systemd-logind[2004]: Session 11 logged out. Waiting for processes to exit. Mar 7 00:56:36.299227 systemd-logind[2004]: Removed session 11. Mar 7 00:56:36.369598 systemd[1]: Started sshd@12-172.31.23.166:22-20.161.92.111:59546.service - OpenSSH per-connection server daemon (20.161.92.111:59546). Mar 7 00:56:36.884115 sshd[6573]: Accepted publickey for core from 20.161.92.111 port 59546 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:56:36.887869 sshd[6573]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:36.895733 systemd-logind[2004]: New session 12 of user core. Mar 7 00:56:36.904713 systemd[1]: Started session-12.scope - Session 12 of User core. 
Mar 7 00:56:37.470079 sshd[6573]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:37.478677 systemd-logind[2004]: Session 12 logged out. Waiting for processes to exit. Mar 7 00:56:37.479909 systemd[1]: sshd@12-172.31.23.166:22-20.161.92.111:59546.service: Deactivated successfully. Mar 7 00:56:37.487214 systemd[1]: session-12.scope: Deactivated successfully. Mar 7 00:56:37.492345 systemd-logind[2004]: Removed session 12. Mar 7 00:56:37.567009 systemd[1]: Started sshd@13-172.31.23.166:22-20.161.92.111:59560.service - OpenSSH per-connection server daemon (20.161.92.111:59560). Mar 7 00:56:37.655012 kubelet[3252]: I0307 00:56:37.654705 3252 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 7 00:56:37.707367 kubelet[3252]: I0307 00:56:37.707259 3252 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/whisker-5844d5c4cc-z7mbs" podStartSLOduration=4.713605083 podStartE2EDuration="24.707237794s" podCreationTimestamp="2026-03-07 00:56:13 +0000 UTC" firstStartedPulling="2026-03-07 00:56:15.954317234 +0000 UTC m=+53.947531877" lastFinishedPulling="2026-03-07 00:56:35.947949945 +0000 UTC m=+73.941164588" observedRunningTime="2026-03-07 00:56:36.306138487 +0000 UTC m=+74.299353130" watchObservedRunningTime="2026-03-07 00:56:37.707237794 +0000 UTC m=+75.700452437" Mar 7 00:56:38.094185 sshd[6597]: Accepted publickey for core from 20.161.92.111 port 59560 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:56:38.097359 sshd[6597]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:38.111357 systemd-logind[2004]: New session 13 of user core. Mar 7 00:56:38.119695 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 7 00:56:38.611288 sshd[6597]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:38.619196 systemd-logind[2004]: Session 13 logged out. Waiting for processes to exit. 
Mar 7 00:56:38.619532 systemd[1]: sshd@13-172.31.23.166:22-20.161.92.111:59560.service: Deactivated successfully. Mar 7 00:56:38.625424 systemd[1]: session-13.scope: Deactivated successfully. Mar 7 00:56:38.632050 systemd-logind[2004]: Removed session 13. Mar 7 00:56:43.706968 systemd[1]: Started sshd@14-172.31.23.166:22-20.161.92.111:58930.service - OpenSSH per-connection server daemon (20.161.92.111:58930). Mar 7 00:56:44.225299 sshd[6649]: Accepted publickey for core from 20.161.92.111 port 58930 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:56:44.229212 sshd[6649]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:44.239488 systemd-logind[2004]: New session 14 of user core. Mar 7 00:56:44.244699 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 7 00:56:44.719282 sshd[6649]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:44.731072 systemd[1]: sshd@14-172.31.23.166:22-20.161.92.111:58930.service: Deactivated successfully. Mar 7 00:56:44.734843 systemd[1]: session-14.scope: Deactivated successfully. Mar 7 00:56:44.738356 systemd-logind[2004]: Session 14 logged out. Waiting for processes to exit. Mar 7 00:56:44.742344 systemd-logind[2004]: Removed session 14. Mar 7 00:56:44.815966 systemd[1]: Started sshd@15-172.31.23.166:22-20.161.92.111:58940.service - OpenSSH per-connection server daemon (20.161.92.111:58940). Mar 7 00:56:45.326684 sshd[6661]: Accepted publickey for core from 20.161.92.111 port 58940 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:56:45.329577 sshd[6661]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:45.339344 systemd-logind[2004]: New session 15 of user core. Mar 7 00:56:45.346684 systemd[1]: Started session-15.scope - Session 15 of User core. 
Mar 7 00:56:46.134295 sshd[6661]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:46.141872 systemd[1]: sshd@15-172.31.23.166:22-20.161.92.111:58940.service: Deactivated successfully. Mar 7 00:56:46.147581 systemd[1]: session-15.scope: Deactivated successfully. Mar 7 00:56:46.149080 systemd-logind[2004]: Session 15 logged out. Waiting for processes to exit. Mar 7 00:56:46.152921 systemd-logind[2004]: Removed session 15. Mar 7 00:56:46.237093 systemd[1]: Started sshd@16-172.31.23.166:22-20.161.92.111:58946.service - OpenSSH per-connection server daemon (20.161.92.111:58946). Mar 7 00:56:46.752466 sshd[6673]: Accepted publickey for core from 20.161.92.111 port 58946 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:56:46.754700 sshd[6673]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:46.762718 systemd-logind[2004]: New session 16 of user core. Mar 7 00:56:46.769644 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 7 00:56:48.194043 sshd[6673]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:48.204457 systemd[1]: sshd@16-172.31.23.166:22-20.161.92.111:58946.service: Deactivated successfully. Mar 7 00:56:48.212125 systemd[1]: session-16.scope: Deactivated successfully. Mar 7 00:56:48.221107 systemd-logind[2004]: Session 16 logged out. Waiting for processes to exit. Mar 7 00:56:48.225841 systemd-logind[2004]: Removed session 16. Mar 7 00:56:48.287970 systemd[1]: Started sshd@17-172.31.23.166:22-20.161.92.111:58952.service - OpenSSH per-connection server daemon (20.161.92.111:58952). Mar 7 00:56:48.798328 sshd[6701]: Accepted publickey for core from 20.161.92.111 port 58952 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:56:48.802744 sshd[6701]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:48.817456 systemd-logind[2004]: New session 17 of user core. 
Mar 7 00:56:48.822721 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 7 00:56:49.545538 sshd[6701]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:49.553309 systemd[1]: sshd@17-172.31.23.166:22-20.161.92.111:58952.service: Deactivated successfully. Mar 7 00:56:49.559144 systemd[1]: session-17.scope: Deactivated successfully. Mar 7 00:56:49.560801 systemd-logind[2004]: Session 17 logged out. Waiting for processes to exit. Mar 7 00:56:49.563082 systemd-logind[2004]: Removed session 17. Mar 7 00:56:49.646094 systemd[1]: Started sshd@18-172.31.23.166:22-20.161.92.111:58956.service - OpenSSH per-connection server daemon (20.161.92.111:58956). Mar 7 00:56:50.151439 sshd[6713]: Accepted publickey for core from 20.161.92.111 port 58956 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:56:50.154041 sshd[6713]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:50.164453 systemd-logind[2004]: New session 18 of user core. Mar 7 00:56:50.174777 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 7 00:56:50.630961 sshd[6713]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:50.637210 systemd[1]: sshd@18-172.31.23.166:22-20.161.92.111:58956.service: Deactivated successfully. Mar 7 00:56:50.638013 systemd-logind[2004]: Session 18 logged out. Waiting for processes to exit. Mar 7 00:56:50.643283 systemd[1]: session-18.scope: Deactivated successfully. Mar 7 00:56:50.647956 systemd-logind[2004]: Removed session 18. Mar 7 00:56:55.728873 systemd[1]: Started sshd@19-172.31.23.166:22-20.161.92.111:51054.service - OpenSSH per-connection server daemon (20.161.92.111:51054). 
Mar 7 00:56:56.247433 sshd[6772]: Accepted publickey for core from 20.161.92.111 port 51054 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:56:56.252974 sshd[6772]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:56.263043 systemd-logind[2004]: New session 19 of user core. Mar 7 00:56:56.269728 systemd[1]: Started session-19.scope - Session 19 of User core. Mar 7 00:56:56.725983 sshd[6772]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:56.733005 systemd[1]: sshd@19-172.31.23.166:22-20.161.92.111:51054.service: Deactivated successfully. Mar 7 00:56:56.739129 systemd[1]: session-19.scope: Deactivated successfully. Mar 7 00:56:56.741375 systemd-logind[2004]: Session 19 logged out. Waiting for processes to exit. Mar 7 00:56:56.744026 systemd-logind[2004]: Removed session 19. Mar 7 00:57:00.929165 kubelet[3252]: I0307 00:57:00.929111 3252 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 7 00:57:01.823964 systemd[1]: Started sshd@20-172.31.23.166:22-20.161.92.111:40000.service - OpenSSH per-connection server daemon (20.161.92.111:40000). Mar 7 00:57:02.337225 sshd[6827]: Accepted publickey for core from 20.161.92.111 port 40000 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:57:02.340652 sshd[6827]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:57:02.355275 systemd-logind[2004]: New session 20 of user core. Mar 7 00:57:02.360721 systemd[1]: Started session-20.scope - Session 20 of User core. Mar 7 00:57:02.862721 sshd[6827]: pam_unix(sshd:session): session closed for user core Mar 7 00:57:02.872956 systemd[1]: sshd@20-172.31.23.166:22-20.161.92.111:40000.service: Deactivated successfully. Mar 7 00:57:02.878941 systemd[1]: session-20.scope: Deactivated successfully. Mar 7 00:57:02.880722 systemd-logind[2004]: Session 20 logged out. Waiting for processes to exit. 
Mar 7 00:57:02.885232 systemd-logind[2004]: Removed session 20. Mar 7 00:57:07.961906 systemd[1]: Started sshd@21-172.31.23.166:22-20.161.92.111:40016.service - OpenSSH per-connection server daemon (20.161.92.111:40016). Mar 7 00:57:08.489501 sshd[6860]: Accepted publickey for core from 20.161.92.111 port 40016 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:57:08.492770 sshd[6860]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:57:08.501583 systemd-logind[2004]: New session 21 of user core. Mar 7 00:57:08.508826 systemd[1]: Started session-21.scope - Session 21 of User core. Mar 7 00:57:08.966298 sshd[6860]: pam_unix(sshd:session): session closed for user core Mar 7 00:57:08.974234 systemd[1]: sshd@21-172.31.23.166:22-20.161.92.111:40016.service: Deactivated successfully. Mar 7 00:57:08.979157 systemd[1]: session-21.scope: Deactivated successfully. Mar 7 00:57:08.981189 systemd-logind[2004]: Session 21 logged out. Waiting for processes to exit. Mar 7 00:57:08.983246 systemd-logind[2004]: Removed session 21. Mar 7 00:57:14.069597 systemd[1]: Started sshd@22-172.31.23.166:22-20.161.92.111:55718.service - OpenSSH per-connection server daemon (20.161.92.111:55718). Mar 7 00:57:14.604436 sshd[6894]: Accepted publickey for core from 20.161.92.111 port 55718 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:57:14.607756 sshd[6894]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:57:14.618496 systemd-logind[2004]: New session 22 of user core. Mar 7 00:57:14.627184 systemd[1]: Started session-22.scope - Session 22 of User core. Mar 7 00:57:15.140718 sshd[6894]: pam_unix(sshd:session): session closed for user core Mar 7 00:57:15.149283 systemd[1]: sshd@22-172.31.23.166:22-20.161.92.111:55718.service: Deactivated successfully. Mar 7 00:57:15.157138 systemd[1]: session-22.scope: Deactivated successfully. 
Mar 7 00:57:15.159984 systemd-logind[2004]: Session 22 logged out. Waiting for processes to exit. Mar 7 00:57:15.163088 systemd-logind[2004]: Removed session 22. Mar 7 00:57:20.238893 systemd[1]: Started sshd@23-172.31.23.166:22-20.161.92.111:36288.service - OpenSSH per-connection server daemon (20.161.92.111:36288). Mar 7 00:57:20.743440 sshd[6907]: Accepted publickey for core from 20.161.92.111 port 36288 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:57:20.746954 sshd[6907]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:57:20.754735 systemd-logind[2004]: New session 23 of user core. Mar 7 00:57:20.759674 systemd[1]: Started session-23.scope - Session 23 of User core. Mar 7 00:57:21.225093 sshd[6907]: pam_unix(sshd:session): session closed for user core Mar 7 00:57:21.232234 systemd[1]: sshd@23-172.31.23.166:22-20.161.92.111:36288.service: Deactivated successfully. Mar 7 00:57:21.237808 systemd[1]: session-23.scope: Deactivated successfully. Mar 7 00:57:21.239515 systemd-logind[2004]: Session 23 logged out. Waiting for processes to exit. Mar 7 00:57:21.241848 systemd-logind[2004]: Removed session 23. Mar 7 00:57:34.766295 systemd[1]: cri-containerd-802b2347603ade21a8f0ec3200724d906e8e7425ef821d89c12f146781eacaf2.scope: Deactivated successfully. Mar 7 00:57:34.766817 systemd[1]: cri-containerd-802b2347603ade21a8f0ec3200724d906e8e7425ef821d89c12f146781eacaf2.scope: Consumed 23.508s CPU time. 
Mar 7 00:57:34.815427 containerd[2022]: time="2026-03-07T00:57:34.815308242Z" level=info msg="shim disconnected" id=802b2347603ade21a8f0ec3200724d906e8e7425ef821d89c12f146781eacaf2 namespace=k8s.io Mar 7 00:57:34.815427 containerd[2022]: time="2026-03-07T00:57:34.815421102Z" level=warning msg="cleaning up after shim disconnected" id=802b2347603ade21a8f0ec3200724d906e8e7425ef821d89c12f146781eacaf2 namespace=k8s.io Mar 7 00:57:34.816091 containerd[2022]: time="2026-03-07T00:57:34.815444610Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 00:57:34.818378 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-802b2347603ade21a8f0ec3200724d906e8e7425ef821d89c12f146781eacaf2-rootfs.mount: Deactivated successfully. Mar 7 00:57:35.484915 kubelet[3252]: I0307 00:57:35.484869 3252 scope.go:122] "RemoveContainer" containerID="802b2347603ade21a8f0ec3200724d906e8e7425ef821d89c12f146781eacaf2" Mar 7 00:57:35.490592 containerd[2022]: time="2026-03-07T00:57:35.490533317Z" level=info msg="CreateContainer within sandbox \"e8388d99e56bb84c87e8e9ca2d6a0a97ad3396425723b530d421c5a1bdc14310\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Mar 7 00:57:35.519960 containerd[2022]: time="2026-03-07T00:57:35.519769457Z" level=info msg="CreateContainer within sandbox \"e8388d99e56bb84c87e8e9ca2d6a0a97ad3396425723b530d421c5a1bdc14310\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"4d3edd76435e575ddb851ecededf0187cc872cdb2a432acf863c596dc3067e5d\"" Mar 7 00:57:35.520781 containerd[2022]: time="2026-03-07T00:57:35.520639049Z" level=info msg="StartContainer for \"4d3edd76435e575ddb851ecededf0187cc872cdb2a432acf863c596dc3067e5d\"" Mar 7 00:57:35.593716 systemd[1]: Started cri-containerd-4d3edd76435e575ddb851ecededf0187cc872cdb2a432acf863c596dc3067e5d.scope - libcontainer container 4d3edd76435e575ddb851ecededf0187cc872cdb2a432acf863c596dc3067e5d. 
Mar 7 00:57:35.643478 containerd[2022]: time="2026-03-07T00:57:35.643276710Z" level=info msg="StartContainer for \"4d3edd76435e575ddb851ecededf0187cc872cdb2a432acf863c596dc3067e5d\" returns successfully" Mar 7 00:57:35.855209 systemd[1]: cri-containerd-d3201dac517dfed41afccf5253455bfd4bbb94a0f1726216bd50b78f2f6949be.scope: Deactivated successfully. Mar 7 00:57:35.855725 systemd[1]: cri-containerd-d3201dac517dfed41afccf5253455bfd4bbb94a0f1726216bd50b78f2f6949be.scope: Consumed 3.813s CPU time, 16.0M memory peak, 0B memory swap peak. Mar 7 00:57:35.906731 containerd[2022]: time="2026-03-07T00:57:35.906639319Z" level=info msg="shim disconnected" id=d3201dac517dfed41afccf5253455bfd4bbb94a0f1726216bd50b78f2f6949be namespace=k8s.io Mar 7 00:57:35.906731 containerd[2022]: time="2026-03-07T00:57:35.906723487Z" level=warning msg="cleaning up after shim disconnected" id=d3201dac517dfed41afccf5253455bfd4bbb94a0f1726216bd50b78f2f6949be namespace=k8s.io Mar 7 00:57:35.909831 containerd[2022]: time="2026-03-07T00:57:35.906745747Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 00:57:35.911685 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d3201dac517dfed41afccf5253455bfd4bbb94a0f1726216bd50b78f2f6949be-rootfs.mount: Deactivated successfully. 
Mar 7 00:57:36.498563 kubelet[3252]: I0307 00:57:36.498506 3252 scope.go:122] "RemoveContainer" containerID="d3201dac517dfed41afccf5253455bfd4bbb94a0f1726216bd50b78f2f6949be" Mar 7 00:57:36.502727 containerd[2022]: time="2026-03-07T00:57:36.502444518Z" level=info msg="CreateContainer within sandbox \"4caff063bba94c5215764aec5a981e57c1745f35b937e6042e3375765add9f8e\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Mar 7 00:57:36.534239 containerd[2022]: time="2026-03-07T00:57:36.533839158Z" level=info msg="CreateContainer within sandbox \"4caff063bba94c5215764aec5a981e57c1745f35b937e6042e3375765add9f8e\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"582bb4bfc94a2834e6accb20b03d32a24318b98ece2a790bb8139e251da18a4b\"" Mar 7 00:57:36.535504 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1992335772.mount: Deactivated successfully. Mar 7 00:57:36.536594 containerd[2022]: time="2026-03-07T00:57:36.535661334Z" level=info msg="StartContainer for \"582bb4bfc94a2834e6accb20b03d32a24318b98ece2a790bb8139e251da18a4b\"" Mar 7 00:57:36.590709 systemd[1]: Started cri-containerd-582bb4bfc94a2834e6accb20b03d32a24318b98ece2a790bb8139e251da18a4b.scope - libcontainer container 582bb4bfc94a2834e6accb20b03d32a24318b98ece2a790bb8139e251da18a4b. Mar 7 00:57:36.662329 containerd[2022]: time="2026-03-07T00:57:36.661754863Z" level=info msg="StartContainer for \"582bb4bfc94a2834e6accb20b03d32a24318b98ece2a790bb8139e251da18a4b\" returns successfully" Mar 7 00:57:40.720908 systemd[1]: cri-containerd-4746ee6df25e6d06cf8c1e2a3a3dc29613f0649ed5d1a87132643c9474088575.scope: Deactivated successfully. Mar 7 00:57:40.721478 systemd[1]: cri-containerd-4746ee6df25e6d06cf8c1e2a3a3dc29613f0649ed5d1a87132643c9474088575.scope: Consumed 2.796s CPU time, 16.3M memory peak, 0B memory swap peak. 
Mar 7 00:57:40.764282 containerd[2022]: time="2026-03-07T00:57:40.764189735Z" level=info msg="shim disconnected" id=4746ee6df25e6d06cf8c1e2a3a3dc29613f0649ed5d1a87132643c9474088575 namespace=k8s.io Mar 7 00:57:40.764282 containerd[2022]: time="2026-03-07T00:57:40.764269979Z" level=warning msg="cleaning up after shim disconnected" id=4746ee6df25e6d06cf8c1e2a3a3dc29613f0649ed5d1a87132643c9474088575 namespace=k8s.io Mar 7 00:57:40.764965 containerd[2022]: time="2026-03-07T00:57:40.764293547Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 00:57:40.768055 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4746ee6df25e6d06cf8c1e2a3a3dc29613f0649ed5d1a87132643c9474088575-rootfs.mount: Deactivated successfully. Mar 7 00:57:41.521909 kubelet[3252]: I0307 00:57:41.521853 3252 scope.go:122] "RemoveContainer" containerID="4746ee6df25e6d06cf8c1e2a3a3dc29613f0649ed5d1a87132643c9474088575" Mar 7 00:57:41.527639 containerd[2022]: time="2026-03-07T00:57:41.526710395Z" level=info msg="CreateContainer within sandbox \"7deeea9ce8d5dcf36b3add3e15adc9678a6f30fe2f90e9abb907abc384a60926\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Mar 7 00:57:41.563059 containerd[2022]: time="2026-03-07T00:57:41.562973447Z" level=info msg="CreateContainer within sandbox \"7deeea9ce8d5dcf36b3add3e15adc9678a6f30fe2f90e9abb907abc384a60926\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"02a8eb439476861bc025f970e3d28bf1605115cf2447455e7a752927deb9aeb2\"" Mar 7 00:57:41.563981 containerd[2022]: time="2026-03-07T00:57:41.563935919Z" level=info msg="StartContainer for \"02a8eb439476861bc025f970e3d28bf1605115cf2447455e7a752927deb9aeb2\"" Mar 7 00:57:41.625762 systemd[1]: Started cri-containerd-02a8eb439476861bc025f970e3d28bf1605115cf2447455e7a752927deb9aeb2.scope - libcontainer container 02a8eb439476861bc025f970e3d28bf1605115cf2447455e7a752927deb9aeb2. 
Mar 7 00:57:41.695321 containerd[2022]: time="2026-03-07T00:57:41.695246472Z" level=info msg="StartContainer for \"02a8eb439476861bc025f970e3d28bf1605115cf2447455e7a752927deb9aeb2\" returns successfully" Mar 7 00:57:44.696139 kubelet[3252]: E0307 00:57:44.690836 3252 controller.go:251] "Failed to update lease" err="Put \"https://172.31.23.166:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-23-166?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 7 00:57:47.121907 systemd[1]: cri-containerd-4d3edd76435e575ddb851ecededf0187cc872cdb2a432acf863c596dc3067e5d.scope: Deactivated successfully. Mar 7 00:57:47.166849 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4d3edd76435e575ddb851ecededf0187cc872cdb2a432acf863c596dc3067e5d-rootfs.mount: Deactivated successfully. Mar 7 00:57:47.176440 containerd[2022]: time="2026-03-07T00:57:47.176322327Z" level=info msg="shim disconnected" id=4d3edd76435e575ddb851ecededf0187cc872cdb2a432acf863c596dc3067e5d namespace=k8s.io Mar 7 00:57:47.176440 containerd[2022]: time="2026-03-07T00:57:47.176436483Z" level=warning msg="cleaning up after shim disconnected" id=4d3edd76435e575ddb851ecededf0187cc872cdb2a432acf863c596dc3067e5d namespace=k8s.io Mar 7 00:57:47.177554 containerd[2022]: time="2026-03-07T00:57:47.176459751Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 00:57:47.548839 kubelet[3252]: I0307 00:57:47.548476 3252 scope.go:122] "RemoveContainer" containerID="802b2347603ade21a8f0ec3200724d906e8e7425ef821d89c12f146781eacaf2" Mar 7 00:57:47.550748 kubelet[3252]: I0307 00:57:47.550369 3252 scope.go:122] "RemoveContainer" containerID="4d3edd76435e575ddb851ecededf0187cc872cdb2a432acf863c596dc3067e5d" Mar 7 00:57:47.550748 kubelet[3252]: E0307 00:57:47.550668 3252 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=tigera-operator pod=tigera-operator-6cf4cccc57-wjjbs_tigera-operator(dea686a5-e68f-4a2a-8c37-d728beae3dc9)\"" pod="tigera-operator/tigera-operator-6cf4cccc57-wjjbs" podUID="dea686a5-e68f-4a2a-8c37-d728beae3dc9" Mar 7 00:57:47.551789 containerd[2022]: time="2026-03-07T00:57:47.551722517Z" level=info msg="RemoveContainer for \"802b2347603ade21a8f0ec3200724d906e8e7425ef821d89c12f146781eacaf2\"" Mar 7 00:57:47.560863 containerd[2022]: time="2026-03-07T00:57:47.560810153Z" level=info msg="RemoveContainer for \"802b2347603ade21a8f0ec3200724d906e8e7425ef821d89c12f146781eacaf2\" returns successfully"