Mar 7 00:55:11.271062 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd083]
Mar 7 00:55:11.271107 kernel: Linux version 6.6.127-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT Fri Mar 6 22:59:59 -00 2026
Mar 7 00:55:11.271132 kernel: KASLR disabled due to lack of seed
Mar 7 00:55:11.271150 kernel: efi: EFI v2.7 by EDK II
Mar 7 00:55:11.272582 kernel: efi: SMBIOS=0x7bed0000 SMBIOS 3.0=0x7beb0000 ACPI=0x786e0000 ACPI 2.0=0x786e0014 MEMATTR=0x7b001a98 MEMRESERVE=0x7852ee18
Mar 7 00:55:11.272660 kernel: ACPI: Early table checksum verification disabled
Mar 7 00:55:11.273071 kernel: ACPI: RSDP 0x00000000786E0014 000024 (v02 AMAZON)
Mar 7 00:55:11.273094 kernel: ACPI: XSDT 0x00000000786D00E8 000064 (v01 AMAZON AMZNFACP 00000001 01000013)
Mar 7 00:55:11.273112 kernel: ACPI: FACP 0x00000000786B0000 000114 (v06 AMAZON AMZNFACP 00000001 AMZN 00000001)
Mar 7 00:55:11.276288 kernel: ACPI: DSDT 0x0000000078640000 0013D2 (v02 AMAZON AMZNDSDT 00000001 AMZN 00000001)
Mar 7 00:55:11.276319 kernel: ACPI: APIC 0x00000000786C0000 000108 (v04 AMAZON AMZNAPIC 00000001 AMZN 00000001)
Mar 7 00:55:11.276337 kernel: ACPI: SPCR 0x00000000786A0000 000050 (v02 AMAZON AMZNSPCR 00000001 AMZN 00000001)
Mar 7 00:55:11.276353 kernel: ACPI: GTDT 0x0000000078690000 000060 (v02 AMAZON AMZNGTDT 00000001 AMZN 00000001)
Mar 7 00:55:11.276370 kernel: ACPI: MCFG 0x0000000078680000 00003C (v02 AMAZON AMZNMCFG 00000001 AMZN 00000001)
Mar 7 00:55:11.276390 kernel: ACPI: SLIT 0x0000000078670000 00002D (v01 AMAZON AMZNSLIT 00000001 AMZN 00000001)
Mar 7 00:55:11.276412 kernel: ACPI: IORT 0x0000000078660000 000078 (v01 AMAZON AMZNIORT 00000001 AMZN 00000001)
Mar 7 00:55:11.276430 kernel: ACPI: PPTT 0x0000000078650000 0000EC (v01 AMAZON AMZNPPTT 00000001 AMZN 00000001)
Mar 7 00:55:11.276447 kernel: ACPI: SPCR: console: uart,mmio,0x90a0000,115200
Mar 7 00:55:11.276465 kernel: earlycon: uart0 at MMIO 0x00000000090a0000 (options '115200')
Mar 7 00:55:11.276482 kernel: printk: bootconsole [uart0] enabled
Mar 7 00:55:11.276499 kernel: NUMA: Failed to initialise from firmware
Mar 7 00:55:11.276518 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000004b5ffffff]
Mar 7 00:55:11.276536 kernel: NUMA: NODE_DATA [mem 0x4b583f800-0x4b5844fff]
Mar 7 00:55:11.276554 kernel: Zone ranges:
Mar 7 00:55:11.276572 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Mar 7 00:55:11.276589 kernel: DMA32 empty
Mar 7 00:55:11.276612 kernel: Normal [mem 0x0000000100000000-0x00000004b5ffffff]
Mar 7 00:55:11.276630 kernel: Movable zone start for each node
Mar 7 00:55:11.276648 kernel: Early memory node ranges
Mar 7 00:55:11.276665 kernel: node 0: [mem 0x0000000040000000-0x000000007862ffff]
Mar 7 00:55:11.276683 kernel: node 0: [mem 0x0000000078630000-0x000000007863ffff]
Mar 7 00:55:11.276700 kernel: node 0: [mem 0x0000000078640000-0x00000000786effff]
Mar 7 00:55:11.276717 kernel: node 0: [mem 0x00000000786f0000-0x000000007872ffff]
Mar 7 00:55:11.276734 kernel: node 0: [mem 0x0000000078730000-0x000000007bbfffff]
Mar 7 00:55:11.276751 kernel: node 0: [mem 0x000000007bc00000-0x000000007bfdffff]
Mar 7 00:55:11.276768 kernel: node 0: [mem 0x000000007bfe0000-0x000000007fffffff]
Mar 7 00:55:11.276786 kernel: node 0: [mem 0x0000000400000000-0x00000004b5ffffff]
Mar 7 00:55:11.276803 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000004b5ffffff]
Mar 7 00:55:11.276824 kernel: On node 0, zone Normal: 8192 pages in unavailable ranges
Mar 7 00:55:11.276843 kernel: psci: probing for conduit method from ACPI.
Mar 7 00:55:11.276867 kernel: psci: PSCIv1.0 detected in firmware.
Mar 7 00:55:11.276885 kernel: psci: Using standard PSCI v0.2 function IDs
Mar 7 00:55:11.276905 kernel: psci: Trusted OS migration not required
Mar 7 00:55:11.276929 kernel: psci: SMC Calling Convention v1.1
Mar 7 00:55:11.276949 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000001)
Mar 7 00:55:11.276967 kernel: percpu: Embedded 30 pages/cpu s85736 r8192 d28952 u122880
Mar 7 00:55:11.276986 kernel: pcpu-alloc: s85736 r8192 d28952 u122880 alloc=30*4096
Mar 7 00:55:11.277005 kernel: pcpu-alloc: [0] 0 [0] 1
Mar 7 00:55:11.277024 kernel: Detected PIPT I-cache on CPU0
Mar 7 00:55:11.277042 kernel: CPU features: detected: GIC system register CPU interface
Mar 7 00:55:11.277061 kernel: CPU features: detected: Spectre-v2
Mar 7 00:55:11.277080 kernel: CPU features: detected: Spectre-v3a
Mar 7 00:55:11.277099 kernel: CPU features: detected: Spectre-BHB
Mar 7 00:55:11.277117 kernel: CPU features: detected: ARM erratum 1742098
Mar 7 00:55:11.277139 kernel: CPU features: detected: ARM errata 1165522, 1319367, or 1530923
Mar 7 00:55:11.277158 kernel: alternatives: applying boot alternatives
Mar 7 00:55:11.277205 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=9d22c40559a0d209dc0fcc2dfdd5ddf9671e6da0cc59463f610ba522f01325a6
Mar 7 00:55:11.277225 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 7 00:55:11.277243 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 7 00:55:11.277261 kernel: Fallback order for Node 0: 0
Mar 7 00:55:11.277279 kernel: Built 1 zonelists, mobility grouping on. Total pages: 991872
Mar 7 00:55:11.277297 kernel: Policy zone: Normal
Mar 7 00:55:11.277315 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 7 00:55:11.277334 kernel: software IO TLB: area num 2.
Mar 7 00:55:11.277352 kernel: software IO TLB: mapped [mem 0x000000007c000000-0x0000000080000000] (64MB)
Mar 7 00:55:11.277379 kernel: Memory: 3820096K/4030464K available (10304K kernel code, 2180K rwdata, 8116K rodata, 39424K init, 897K bss, 210368K reserved, 0K cma-reserved)
Mar 7 00:55:11.277398 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 7 00:55:11.277417 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 7 00:55:11.277436 kernel: rcu: RCU event tracing is enabled.
Mar 7 00:55:11.277455 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 7 00:55:11.277474 kernel: Trampoline variant of Tasks RCU enabled.
Mar 7 00:55:11.277493 kernel: Tracing variant of Tasks RCU enabled.
Mar 7 00:55:11.277512 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 7 00:55:11.277531 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 7 00:55:11.277550 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Mar 7 00:55:11.277569 kernel: GICv3: 96 SPIs implemented
Mar 7 00:55:11.277593 kernel: GICv3: 0 Extended SPIs implemented
Mar 7 00:55:11.277613 kernel: Root IRQ handler: gic_handle_irq
Mar 7 00:55:11.277631 kernel: GICv3: GICv3 features: 16 PPIs
Mar 7 00:55:11.277650 kernel: GICv3: CPU0: found redistributor 0 region 0:0x0000000010200000
Mar 7 00:55:11.277668 kernel: ITS [mem 0x10080000-0x1009ffff]
Mar 7 00:55:11.277686 kernel: ITS@0x0000000010080000: allocated 8192 Devices @4000b0000 (indirect, esz 8, psz 64K, shr 1)
Mar 7 00:55:11.277705 kernel: ITS@0x0000000010080000: allocated 8192 Interrupt Collections @4000c0000 (flat, esz 8, psz 64K, shr 1)
Mar 7 00:55:11.277723 kernel: GICv3: using LPI property table @0x00000004000d0000
Mar 7 00:55:11.277741 kernel: ITS: Using hypervisor restricted LPI range [128]
Mar 7 00:55:11.277759 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000004000e0000
Mar 7 00:55:11.277777 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 7 00:55:11.277795 kernel: arch_timer: cp15 timer(s) running at 83.33MHz (virt).
Mar 7 00:55:11.277817 kernel: clocksource: arch_sys_counter: mask: 0x1ffffffffffffff max_cycles: 0x13381ebeec, max_idle_ns: 440795203145 ns
Mar 7 00:55:11.277835 kernel: sched_clock: 57 bits at 83MHz, resolution 12ns, wraps every 4398046511100ns
Mar 7 00:55:11.277854 kernel: Console: colour dummy device 80x25
Mar 7 00:55:11.277872 kernel: printk: console [tty1] enabled
Mar 7 00:55:11.277890 kernel: ACPI: Core revision 20230628
Mar 7 00:55:11.277909 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 166.66 BogoMIPS (lpj=83333)
Mar 7 00:55:11.277927 kernel: pid_max: default: 32768 minimum: 301
Mar 7 00:55:11.277946 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 7 00:55:11.277964 kernel: landlock: Up and running.
Mar 7 00:55:11.277986 kernel: SELinux: Initializing.
Mar 7 00:55:11.278005 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 7 00:55:11.278023 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 7 00:55:11.278042 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 7 00:55:11.278060 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 7 00:55:11.278078 kernel: rcu: Hierarchical SRCU implementation.
Mar 7 00:55:11.278097 kernel: rcu: Max phase no-delay instances is 400.
Mar 7 00:55:11.278115 kernel: Platform MSI: ITS@0x10080000 domain created
Mar 7 00:55:11.278133 kernel: PCI/MSI: ITS@0x10080000 domain created
Mar 7 00:55:11.278155 kernel: Remapping and enabling EFI services.
Mar 7 00:55:11.280275 kernel: smp: Bringing up secondary CPUs ...
Mar 7 00:55:11.280299 kernel: Detected PIPT I-cache on CPU1
Mar 7 00:55:11.280319 kernel: GICv3: CPU1: found redistributor 1 region 0:0x0000000010220000
Mar 7 00:55:11.280338 kernel: GICv3: CPU1: using allocated LPI pending table @0x00000004000f0000
Mar 7 00:55:11.280357 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd083]
Mar 7 00:55:11.280375 kernel: smp: Brought up 1 node, 2 CPUs
Mar 7 00:55:11.280394 kernel: SMP: Total of 2 processors activated.
Mar 7 00:55:11.280412 kernel: CPU features: detected: 32-bit EL0 Support
Mar 7 00:55:11.280440 kernel: CPU features: detected: 32-bit EL1 Support
Mar 7 00:55:11.280459 kernel: CPU features: detected: CRC32 instructions
Mar 7 00:55:11.280478 kernel: CPU: All CPU(s) started at EL1
Mar 7 00:55:11.280508 kernel: alternatives: applying system-wide alternatives
Mar 7 00:55:11.280531 kernel: devtmpfs: initialized
Mar 7 00:55:11.280551 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 7 00:55:11.280570 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 7 00:55:11.280589 kernel: pinctrl core: initialized pinctrl subsystem
Mar 7 00:55:11.280608 kernel: SMBIOS 3.0.0 present.
Mar 7 00:55:11.280632 kernel: DMI: Amazon EC2 a1.large/, BIOS 1.0 11/1/2018
Mar 7 00:55:11.280651 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 7 00:55:11.280670 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Mar 7 00:55:11.280690 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Mar 7 00:55:11.280709 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Mar 7 00:55:11.280729 kernel: audit: initializing netlink subsys (disabled)
Mar 7 00:55:11.280748 kernel: audit: type=2000 audit(0.287:1): state=initialized audit_enabled=0 res=1
Mar 7 00:55:11.280767 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 7 00:55:11.280790 kernel: cpuidle: using governor menu
Mar 7 00:55:11.280809 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Mar 7 00:55:11.280828 kernel: ASID allocator initialised with 65536 entries
Mar 7 00:55:11.280847 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 7 00:55:11.280866 kernel: Serial: AMBA PL011 UART driver
Mar 7 00:55:11.280885 kernel: Modules: 17488 pages in range for non-PLT usage
Mar 7 00:55:11.280904 kernel: Modules: 509008 pages in range for PLT usage
Mar 7 00:55:11.280923 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 7 00:55:11.280942 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Mar 7 00:55:11.280965 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Mar 7 00:55:11.280985 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Mar 7 00:55:11.281004 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 7 00:55:11.281024 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Mar 7 00:55:11.281043 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Mar 7 00:55:11.281062 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Mar 7 00:55:11.281081 kernel: ACPI: Added _OSI(Module Device)
Mar 7 00:55:11.281100 kernel: ACPI: Added _OSI(Processor Device)
Mar 7 00:55:11.281119 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 7 00:55:11.281142 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 7 00:55:11.281180 kernel: ACPI: Interpreter enabled
Mar 7 00:55:11.281207 kernel: ACPI: Using GIC for interrupt routing
Mar 7 00:55:11.281227 kernel: ACPI: MCFG table detected, 1 entries
Mar 7 00:55:11.281246 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00])
Mar 7 00:55:11.281557 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 7 00:55:11.281785 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Mar 7 00:55:11.281996 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Mar 7 00:55:11.284279 kernel: acpi PNP0A08:00: ECAM area [mem 0x20000000-0x200fffff] reserved by PNP0C02:00
Mar 7 00:55:11.284514 kernel: acpi PNP0A08:00: ECAM at [mem 0x20000000-0x200fffff] for [bus 00]
Mar 7 00:55:11.284542 kernel: ACPI: Remapped I/O 0x000000001fff0000 to [io 0x0000-0xffff window]
Mar 7 00:55:11.284562 kernel: acpiphp: Slot [1] registered
Mar 7 00:55:11.284583 kernel: acpiphp: Slot [2] registered
Mar 7 00:55:11.284603 kernel: acpiphp: Slot [3] registered
Mar 7 00:55:11.284622 kernel: acpiphp: Slot [4] registered
Mar 7 00:55:11.284641 kernel: acpiphp: Slot [5] registered
Mar 7 00:55:11.284668 kernel: acpiphp: Slot [6] registered
Mar 7 00:55:11.284688 kernel: acpiphp: Slot [7] registered
Mar 7 00:55:11.284707 kernel: acpiphp: Slot [8] registered
Mar 7 00:55:11.284726 kernel: acpiphp: Slot [9] registered
Mar 7 00:55:11.284745 kernel: acpiphp: Slot [10] registered
Mar 7 00:55:11.284764 kernel: acpiphp: Slot [11] registered
Mar 7 00:55:11.284783 kernel: acpiphp: Slot [12] registered
Mar 7 00:55:11.284802 kernel: acpiphp: Slot [13] registered
Mar 7 00:55:11.284820 kernel: acpiphp: Slot [14] registered
Mar 7 00:55:11.284840 kernel: acpiphp: Slot [15] registered
Mar 7 00:55:11.284864 kernel: acpiphp: Slot [16] registered
Mar 7 00:55:11.284882 kernel: acpiphp: Slot [17] registered
Mar 7 00:55:11.284902 kernel: acpiphp: Slot [18] registered
Mar 7 00:55:11.284920 kernel: acpiphp: Slot [19] registered
Mar 7 00:55:11.284939 kernel: acpiphp: Slot [20] registered
Mar 7 00:55:11.284958 kernel: acpiphp: Slot [21] registered
Mar 7 00:55:11.284977 kernel: acpiphp: Slot [22] registered
Mar 7 00:55:11.284996 kernel: acpiphp: Slot [23] registered
Mar 7 00:55:11.285015 kernel: acpiphp: Slot [24] registered
Mar 7 00:55:11.285038 kernel: acpiphp: Slot [25] registered
Mar 7 00:55:11.285057 kernel: acpiphp: Slot [26] registered
Mar 7 00:55:11.285077 kernel: acpiphp: Slot [27] registered
Mar 7 00:55:11.285096 kernel: acpiphp: Slot [28] registered
Mar 7 00:55:11.285115 kernel: acpiphp: Slot [29] registered
Mar 7 00:55:11.285134 kernel: acpiphp: Slot [30] registered
Mar 7 00:55:11.285152 kernel: acpiphp: Slot [31] registered
Mar 7 00:55:11.285201 kernel: PCI host bridge to bus 0000:00
Mar 7 00:55:11.285446 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xffffffff window]
Mar 7 00:55:11.285656 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Mar 7 00:55:11.285858 kernel: pci_bus 0000:00: root bus resource [mem 0x400000000000-0x407fffffffff window]
Mar 7 00:55:11.286063 kernel: pci_bus 0000:00: root bus resource [bus 00]
Mar 7 00:55:11.291449 kernel: pci 0000:00:00.0: [1d0f:0200] type 00 class 0x060000
Mar 7 00:55:11.291711 kernel: pci 0000:00:01.0: [1d0f:8250] type 00 class 0x070003
Mar 7 00:55:11.291930 kernel: pci 0000:00:01.0: reg 0x10: [mem 0x80118000-0x80118fff]
Mar 7 00:55:11.292218 kernel: pci 0000:00:04.0: [1d0f:8061] type 00 class 0x010802
Mar 7 00:55:11.292437 kernel: pci 0000:00:04.0: reg 0x10: [mem 0x80114000-0x80117fff]
Mar 7 00:55:11.292650 kernel: pci 0000:00:04.0: PME# supported from D0 D1 D2 D3hot D3cold
Mar 7 00:55:11.292886 kernel: pci 0000:00:05.0: [1d0f:ec20] type 00 class 0x020000
Mar 7 00:55:11.293097 kernel: pci 0000:00:05.0: reg 0x10: [mem 0x80110000-0x80113fff]
Mar 7 00:55:11.295370 kernel: pci 0000:00:05.0: reg 0x18: [mem 0x80000000-0x800fffff pref]
Mar 7 00:55:11.295598 kernel: pci 0000:00:05.0: reg 0x20: [mem 0x80100000-0x8010ffff]
Mar 7 00:55:11.295821 kernel: pci 0000:00:05.0: PME# supported from D0 D1 D2 D3hot D3cold
Mar 7 00:55:11.296060 kernel: pci_bus 0000:00: resource 4 [mem 0x80000000-0xffffffff window]
Mar 7 00:55:11.296291 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Mar 7 00:55:11.296487 kernel: pci_bus 0000:00: resource 6 [mem 0x400000000000-0x407fffffffff window]
Mar 7 00:55:11.296514 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Mar 7 00:55:11.296534 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Mar 7 00:55:11.296554 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Mar 7 00:55:11.296574 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Mar 7 00:55:11.296600 kernel: iommu: Default domain type: Translated
Mar 7 00:55:11.296620 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Mar 7 00:55:11.296639 kernel: efivars: Registered efivars operations
Mar 7 00:55:11.296658 kernel: vgaarb: loaded
Mar 7 00:55:11.296677 kernel: clocksource: Switched to clocksource arch_sys_counter
Mar 7 00:55:11.296696 kernel: VFS: Disk quotas dquot_6.6.0
Mar 7 00:55:11.296715 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 7 00:55:11.296734 kernel: pnp: PnP ACPI init
Mar 7 00:55:11.296953 kernel: system 00:00: [mem 0x20000000-0x2fffffff] could not be reserved
Mar 7 00:55:11.296986 kernel: pnp: PnP ACPI: found 1 devices
Mar 7 00:55:11.297005 kernel: NET: Registered PF_INET protocol family
Mar 7 00:55:11.297025 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 7 00:55:11.297044 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 7 00:55:11.297063 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 7 00:55:11.297083 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 7 00:55:11.297102 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 7 00:55:11.297121 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 7 00:55:11.297145 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 7 00:55:11.299226 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 7 00:55:11.299267 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 7 00:55:11.299289 kernel: PCI: CLS 0 bytes, default 64
Mar 7 00:55:11.299310 kernel: kvm [1]: HYP mode not available
Mar 7 00:55:11.299331 kernel: Initialise system trusted keyrings
Mar 7 00:55:11.299351 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Mar 7 00:55:11.299370 kernel: Key type asymmetric registered
Mar 7 00:55:11.299389 kernel: Asymmetric key parser 'x509' registered
Mar 7 00:55:11.299422 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Mar 7 00:55:11.299442 kernel: io scheduler mq-deadline registered
Mar 7 00:55:11.299462 kernel: io scheduler kyber registered
Mar 7 00:55:11.299480 kernel: io scheduler bfq registered
Mar 7 00:55:11.299779 kernel: pl061_gpio ARMH0061:00: PL061 GPIO chip registered
Mar 7 00:55:11.299841 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
Mar 7 00:55:11.299880 kernel: ACPI: button: Power Button [PWRB]
Mar 7 00:55:11.299922 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input1
Mar 7 00:55:11.299979 kernel: ACPI: button: Sleep Button [SLPB]
Mar 7 00:55:11.300020 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 7 00:55:11.300088 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37
Mar 7 00:55:11.300371 kernel: serial 0000:00:01.0: enabling device (0010 -> 0012)
Mar 7 00:55:11.300402 kernel: printk: console [ttyS0] disabled
Mar 7 00:55:11.300423 kernel: 0000:00:01.0: ttyS0 at MMIO 0x80118000 (irq = 14, base_baud = 115200) is a 16550A
Mar 7 00:55:11.300443 kernel: printk: console [ttyS0] enabled
Mar 7 00:55:11.300462 kernel: printk: bootconsole [uart0] disabled
Mar 7 00:55:11.300481 kernel: thunder_xcv, ver 1.0
Mar 7 00:55:11.300500 kernel: thunder_bgx, ver 1.0
Mar 7 00:55:11.300528 kernel: nicpf, ver 1.0
Mar 7 00:55:11.300547 kernel: nicvf, ver 1.0
Mar 7 00:55:11.300778 kernel: rtc-efi rtc-efi.0: registered as rtc0
Mar 7 00:55:11.300984 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-03-07T00:55:10 UTC (1772844910)
Mar 7 00:55:11.301011 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 7 00:55:11.301032 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 3 counters available
Mar 7 00:55:11.301051 kernel: watchdog: Delayed init of the lockup detector failed: -19
Mar 7 00:55:11.301070 kernel: watchdog: Hard watchdog permanently disabled
Mar 7 00:55:11.301095 kernel: NET: Registered PF_INET6 protocol family
Mar 7 00:55:11.301114 kernel: Segment Routing with IPv6
Mar 7 00:55:11.301133 kernel: In-situ OAM (IOAM) with IPv6
Mar 7 00:55:11.301152 kernel: NET: Registered PF_PACKET protocol family
Mar 7 00:55:11.301206 kernel: Key type dns_resolver registered
Mar 7 00:55:11.301228 kernel: registered taskstats version 1
Mar 7 00:55:11.301247 kernel: Loading compiled-in X.509 certificates
Mar 7 00:55:11.301267 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.127-flatcar: e62b4e4ebcb406beff1271ecc7444548c4ab67e9'
Mar 7 00:55:11.301286 kernel: Key type .fscrypt registered
Mar 7 00:55:11.301310 kernel: Key type fscrypt-provisioning registered
Mar 7 00:55:11.301329 kernel: ima: No TPM chip found, activating TPM-bypass!
Mar 7 00:55:11.301348 kernel: ima: Allocated hash algorithm: sha1
Mar 7 00:55:11.301366 kernel: ima: No architecture policies found
Mar 7 00:55:11.301385 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Mar 7 00:55:11.301404 kernel: clk: Disabling unused clocks
Mar 7 00:55:11.301423 kernel: Freeing unused kernel memory: 39424K
Mar 7 00:55:11.301442 kernel: Run /init as init process
Mar 7 00:55:11.301460 kernel: with arguments:
Mar 7 00:55:11.301483 kernel: /init
Mar 7 00:55:11.301502 kernel: with environment:
Mar 7 00:55:11.301521 kernel: HOME=/
Mar 7 00:55:11.301540 kernel: TERM=linux
Mar 7 00:55:11.301563 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Mar 7 00:55:11.301587 systemd[1]: Detected virtualization amazon.
Mar 7 00:55:11.301608 systemd[1]: Detected architecture arm64.
Mar 7 00:55:11.301628 systemd[1]: Running in initrd.
Mar 7 00:55:11.301653 systemd[1]: No hostname configured, using default hostname.
Mar 7 00:55:11.301673 systemd[1]: Hostname set to .
Mar 7 00:55:11.301694 systemd[1]: Initializing machine ID from VM UUID.
Mar 7 00:55:11.301714 systemd[1]: Queued start job for default target initrd.target.
Mar 7 00:55:11.301735 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 7 00:55:11.301755 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 7 00:55:11.301777 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 7 00:55:11.301798 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 7 00:55:11.301824 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 7 00:55:11.301845 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 7 00:55:11.301869 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 7 00:55:11.301890 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 7 00:55:11.301910 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 7 00:55:11.301931 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 7 00:55:11.301956 systemd[1]: Reached target paths.target - Path Units.
Mar 7 00:55:11.301976 systemd[1]: Reached target slices.target - Slice Units.
Mar 7 00:55:11.301997 systemd[1]: Reached target swap.target - Swaps.
Mar 7 00:55:11.302018 systemd[1]: Reached target timers.target - Timer Units.
Mar 7 00:55:11.302038 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 7 00:55:11.302059 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 7 00:55:11.302080 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 7 00:55:11.302101 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Mar 7 00:55:11.302122 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 7 00:55:11.302147 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 7 00:55:11.302190 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 7 00:55:11.302214 systemd[1]: Reached target sockets.target - Socket Units.
Mar 7 00:55:11.302235 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 7 00:55:11.302256 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 7 00:55:11.302276 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 7 00:55:11.302297 systemd[1]: Starting systemd-fsck-usr.service...
Mar 7 00:55:11.302317 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 7 00:55:11.302338 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 7 00:55:11.302365 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 00:55:11.302385 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 7 00:55:11.302406 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 7 00:55:11.302466 systemd-journald[252]: Collecting audit messages is disabled.
Mar 7 00:55:11.302517 systemd[1]: Finished systemd-fsck-usr.service.
Mar 7 00:55:11.302540 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 7 00:55:11.302561 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 7 00:55:11.302581 systemd-journald[252]: Journal started
Mar 7 00:55:11.302622 systemd-journald[252]: Runtime Journal (/run/log/journal/ec22dde7d0dc3c597d3acf34c2507ca9) is 8.0M, max 75.3M, 67.3M free.
Mar 7 00:55:11.308352 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 00:55:11.308423 kernel: Bridge firewalling registered
Mar 7 00:55:11.260511 systemd-modules-load[253]: Inserted module 'overlay'
Mar 7 00:55:11.307324 systemd-modules-load[253]: Inserted module 'br_netfilter'
Mar 7 00:55:11.323895 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 7 00:55:11.326368 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 7 00:55:11.330596 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 7 00:55:11.347623 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 7 00:55:11.359934 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 7 00:55:11.363465 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 7 00:55:11.371288 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 7 00:55:11.410732 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 7 00:55:11.431251 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 7 00:55:11.441061 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 7 00:55:11.447828 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 00:55:11.463497 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 7 00:55:11.471471 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 7 00:55:11.508345 dracut-cmdline[289]: dracut-dracut-053
Mar 7 00:55:11.519674 dracut-cmdline[289]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=9d22c40559a0d209dc0fcc2dfdd5ddf9671e6da0cc59463f610ba522f01325a6
Mar 7 00:55:11.562544 systemd-resolved[290]: Positive Trust Anchors:
Mar 7 00:55:11.562581 systemd-resolved[290]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 7 00:55:11.562643 systemd-resolved[290]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 7 00:55:11.673209 kernel: SCSI subsystem initialized
Mar 7 00:55:11.682195 kernel: Loading iSCSI transport class v2.0-870.
Mar 7 00:55:11.694205 kernel: iscsi: registered transport (tcp)
Mar 7 00:55:11.716438 kernel: iscsi: registered transport (qla4xxx)
Mar 7 00:55:11.716530 kernel: QLogic iSCSI HBA Driver
Mar 7 00:55:11.795191 kernel: random: crng init done
Mar 7 00:55:11.795514 systemd-resolved[290]: Defaulting to hostname 'linux'.
Mar 7 00:55:11.800559 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 7 00:55:11.805896 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 7 00:55:11.832242 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 7 00:55:11.842435 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 7 00:55:11.887757 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 7 00:55:11.887856 kernel: device-mapper: uevent: version 1.0.3
Mar 7 00:55:11.889730 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Mar 7 00:55:11.956221 kernel: raid6: neonx8 gen() 6706 MB/s
Mar 7 00:55:11.973209 kernel: raid6: neonx4 gen() 6549 MB/s
Mar 7 00:55:11.990212 kernel: raid6: neonx2 gen() 5400 MB/s
Mar 7 00:55:12.007224 kernel: raid6: neonx1 gen() 3903 MB/s
Mar 7 00:55:12.024217 kernel: raid6: int64x8 gen() 3766 MB/s
Mar 7 00:55:12.041210 kernel: raid6: int64x4 gen() 3700 MB/s
Mar 7 00:55:12.058209 kernel: raid6: int64x2 gen() 3591 MB/s
Mar 7 00:55:12.076246 kernel: raid6: int64x1 gen() 2736 MB/s
Mar 7 00:55:12.076295 kernel: raid6: using algorithm neonx8 gen() 6706 MB/s
Mar 7 00:55:12.095239 kernel: raid6: .... xor() 4713 MB/s, rmw enabled
Mar 7 00:55:12.095296 kernel: raid6: using neon recovery algorithm
Mar 7 00:55:12.104518 kernel: xor: measuring software checksum speed
Mar 7 00:55:12.104588 kernel: 8regs : 10964 MB/sec
Mar 7 00:55:12.105725 kernel: 32regs : 11895 MB/sec
Mar 7 00:55:12.107016 kernel: arm64_neon : 9289 MB/sec
Mar 7 00:55:12.107050 kernel: xor: using function: 32regs (11895 MB/sec)
Mar 7 00:55:12.193235 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 7 00:55:12.212948 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 7 00:55:12.225534 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 7 00:55:12.269566 systemd-udevd[473]: Using default interface naming scheme 'v255'.
Mar 7 00:55:12.278917 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 7 00:55:12.301716 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 7 00:55:12.333934 dracut-pre-trigger[484]: rd.md=0: removing MD RAID activation
Mar 7 00:55:12.396007 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 7 00:55:12.408650 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 7 00:55:12.523551 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 7 00:55:12.545538 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 7 00:55:12.589746 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 7 00:55:12.595909 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 7 00:55:12.606369 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 7 00:55:12.622974 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 7 00:55:12.636457 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 7 00:55:12.684524 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 7 00:55:12.746231 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
Mar 7 00:55:12.746307 kernel: ena 0000:00:05.0: enabling device (0010 -> 0012)
Mar 7 00:55:12.752247 kernel: ena 0000:00:05.0: ENA device version: 0.10
Mar 7 00:55:12.752625 kernel: ena 0000:00:05.0: ENA controller version: 0.0.1 implementation version 1
Mar 7 00:55:12.759205 kernel: ena 0000:00:05.0: Elastic Network Adapter (ENA) found at mem 80110000, mac addr 06:d7:31:f3:09:4f
Mar 7 00:55:12.766935 (udev-worker)[520]: Network interface NamePolicy= disabled on kernel command line.
Mar 7 00:55:12.775930 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 7 00:55:12.776330 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 00:55:12.785957 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 7 00:55:12.789399 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 7 00:55:12.789690 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 00:55:12.793145 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 00:55:12.822235 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35
Mar 7 00:55:12.822578 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 00:55:12.835117 kernel: nvme nvme0: pci function 0000:00:04.0
Mar 7 00:55:12.846215 kernel: nvme nvme0: 2/0/0 default/read/poll queues
Mar 7 00:55:12.852947 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Mar 7 00:55:12.853018 kernel: GPT:9289727 != 33554431
Mar 7 00:55:12.853044 kernel: GPT:Alternate GPT header not at the end of the disk.
Mar 7 00:55:12.855505 kernel: GPT:9289727 != 33554431
Mar 7 00:55:12.859072 kernel: GPT: Use GNU Parted to correct GPT errors.
Mar 7 00:55:12.859128 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Mar 7 00:55:12.862189 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 00:55:12.875515 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 7 00:55:12.914594 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 00:55:12.983233 kernel: BTRFS: device fsid 237c8587-8110-47ef-99f9-37e4ed4d3b31 devid 1 transid 36 /dev/nvme0n1p3 scanned by (udev-worker) (533)
Mar 7 00:55:13.022229 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/nvme0n1p6 scanned by (udev-worker) (531)
Mar 7 00:55:13.049384 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Amazon Elastic Block Store EFI-SYSTEM.
Mar 7 00:55:13.102411 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Amazon Elastic Block Store ROOT.
Mar 7 00:55:13.120681 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Amazon Elastic Block Store USR-A.
Mar 7 00:55:13.125407 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Amazon Elastic Block Store USR-A.
Mar 7 00:55:13.140698 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM.
Mar 7 00:55:13.153459 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 7 00:55:13.175223 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Mar 7 00:55:13.175418 disk-uuid[664]: Primary Header is updated.
Mar 7 00:55:13.175418 disk-uuid[664]: Secondary Entries is updated.
Mar 7 00:55:13.175418 disk-uuid[664]: Secondary Header is updated.
Mar 7 00:55:14.212197 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Mar 7 00:55:14.212807 disk-uuid[665]: The operation has completed successfully.
Mar 7 00:55:14.399507 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 7 00:55:14.399713 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 7 00:55:14.455443 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 7 00:55:14.469679 sh[1010]: Success
Mar 7 00:55:14.491225 kernel: device-mapper: verity: sha256 using implementation "sha256-ce"
Mar 7 00:55:14.598339 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 7 00:55:14.605361 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 7 00:55:14.614681 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 7 00:55:14.656901 kernel: BTRFS info (device dm-0): first mount of filesystem 237c8587-8110-47ef-99f9-37e4ed4d3b31
Mar 7 00:55:14.656969 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Mar 7 00:55:14.657008 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 7 00:55:14.658903 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 7 00:55:14.660355 kernel: BTRFS info (device dm-0): using free space tree
Mar 7 00:55:14.736215 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Mar 7 00:55:14.751564 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 7 00:55:14.762954 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 7 00:55:14.773479 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 7 00:55:14.789559 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 7 00:55:14.824388 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 00:55:14.824461 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm
Mar 7 00:55:14.826428 kernel: BTRFS info (device nvme0n1p6): using free space tree
Mar 7 00:55:14.842326 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations
Mar 7 00:55:14.861393 systemd[1]: mnt-oem.mount: Deactivated successfully.
Mar 7 00:55:14.867458 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 00:55:14.878773 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 7 00:55:14.889638 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 7 00:55:14.984677 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 7 00:55:14.999588 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 7 00:55:15.049426 systemd-networkd[1203]: lo: Link UP
Mar 7 00:55:15.049449 systemd-networkd[1203]: lo: Gained carrier
Mar 7 00:55:15.052418 systemd-networkd[1203]: Enumeration completed
Mar 7 00:55:15.052746 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 7 00:55:15.054099 systemd-networkd[1203]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 00:55:15.054107 systemd-networkd[1203]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 7 00:55:15.057951 systemd[1]: Reached target network.target - Network.
Mar 7 00:55:15.064281 systemd-networkd[1203]: eth0: Link UP
Mar 7 00:55:15.064290 systemd-networkd[1203]: eth0: Gained carrier
Mar 7 00:55:15.064309 systemd-networkd[1203]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 00:55:15.101268 systemd-networkd[1203]: eth0: DHCPv4 address 172.31.19.200/20, gateway 172.31.16.1 acquired from 172.31.16.1
Mar 7 00:55:15.320287 ignition[1137]: Ignition 2.19.0
Mar 7 00:55:15.320323 ignition[1137]: Stage: fetch-offline
Mar 7 00:55:15.324581 ignition[1137]: no configs at "/usr/lib/ignition/base.d"
Mar 7 00:55:15.324621 ignition[1137]: no config dir at "/usr/lib/ignition/base.platform.d/aws"
Mar 7 00:55:15.327856 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 7 00:55:15.325348 ignition[1137]: Ignition finished successfully
Mar 7 00:55:15.347533 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 7 00:55:15.382506 ignition[1213]: Ignition 2.19.0
Mar 7 00:55:15.383365 ignition[1213]: Stage: fetch
Mar 7 00:55:15.384018 ignition[1213]: no configs at "/usr/lib/ignition/base.d"
Mar 7 00:55:15.384043 ignition[1213]: no config dir at "/usr/lib/ignition/base.platform.d/aws"
Mar 7 00:55:15.386633 ignition[1213]: PUT http://169.254.169.254/latest/api/token: attempt #1
Mar 7 00:55:15.402792 ignition[1213]: PUT result: OK
Mar 7 00:55:15.406217 ignition[1213]: parsed url from cmdline: ""
Mar 7 00:55:15.406233 ignition[1213]: no config URL provided
Mar 7 00:55:15.406248 ignition[1213]: reading system config file "/usr/lib/ignition/user.ign"
Mar 7 00:55:15.406274 ignition[1213]: no config at "/usr/lib/ignition/user.ign"
Mar 7 00:55:15.406309 ignition[1213]: PUT http://169.254.169.254/latest/api/token: attempt #1
Mar 7 00:55:15.413670 ignition[1213]: PUT result: OK
Mar 7 00:55:15.413748 ignition[1213]: GET http://169.254.169.254/2019-10-01/user-data: attempt #1
Mar 7 00:55:15.420912 ignition[1213]: GET result: OK
Mar 7 00:55:15.421155 ignition[1213]: parsing config with SHA512: 5ba8167670635ea6622be9c7ae130183dfd6ee9062379d15528ec4c91f5424b3a81bf7794cbca138ed2a530d3102a1157301f5416adda7c5dc0fae0ad63aeb63
Mar 7 00:55:15.430953 unknown[1213]: fetched base config from "system"
Mar 7 00:55:15.431963 unknown[1213]: fetched base config from "system"
Mar 7 00:55:15.432679 ignition[1213]: fetch: fetch complete
Mar 7 00:55:15.431978 unknown[1213]: fetched user config from "aws"
Mar 7 00:55:15.432693 ignition[1213]: fetch: fetch passed
Mar 7 00:55:15.432813 ignition[1213]: Ignition finished successfully
Mar 7 00:55:15.444952 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 7 00:55:15.458707 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 7 00:55:15.491995 ignition[1219]: Ignition 2.19.0
Mar 7 00:55:15.492023 ignition[1219]: Stage: kargs
Mar 7 00:55:15.492759 ignition[1219]: no configs at "/usr/lib/ignition/base.d"
Mar 7 00:55:15.492787 ignition[1219]: no config dir at "/usr/lib/ignition/base.platform.d/aws"
Mar 7 00:55:15.492948 ignition[1219]: PUT http://169.254.169.254/latest/api/token: attempt #1
Mar 7 00:55:15.498153 ignition[1219]: PUT result: OK
Mar 7 00:55:15.510122 ignition[1219]: kargs: kargs passed
Mar 7 00:55:15.510297 ignition[1219]: Ignition finished successfully
Mar 7 00:55:15.517136 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 7 00:55:15.533095 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 7 00:55:15.560036 ignition[1226]: Ignition 2.19.0
Mar 7 00:55:15.560056 ignition[1226]: Stage: disks
Mar 7 00:55:15.560702 ignition[1226]: no configs at "/usr/lib/ignition/base.d"
Mar 7 00:55:15.560728 ignition[1226]: no config dir at "/usr/lib/ignition/base.platform.d/aws"
Mar 7 00:55:15.560884 ignition[1226]: PUT http://169.254.169.254/latest/api/token: attempt #1
Mar 7 00:55:15.569097 ignition[1226]: PUT result: OK
Mar 7 00:55:15.578805 ignition[1226]: disks: disks passed
Mar 7 00:55:15.579845 ignition[1226]: Ignition finished successfully
Mar 7 00:55:15.586325 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 7 00:55:15.592658 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 7 00:55:15.595504 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 7 00:55:15.598669 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 7 00:55:15.601716 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 7 00:55:15.604387 systemd[1]: Reached target basic.target - Basic System.
Mar 7 00:55:15.626764 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 7 00:55:15.675480 systemd-fsck[1234]: ROOT: clean, 14/553520 files, 52654/553472 blocks
Mar 7 00:55:15.681248 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 7 00:55:15.690411 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 7 00:55:15.790649 kernel: EXT4-fs (nvme0n1p9): mounted filesystem 596a8ea8-9d3d-4d06-a56e-9d3ebd3cb76d r/w with ordered data mode. Quota mode: none.
Mar 7 00:55:15.791629 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 7 00:55:15.792935 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 7 00:55:15.810449 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 7 00:55:15.814340 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 7 00:55:15.825767 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Mar 7 00:55:15.825869 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 7 00:55:15.825921 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 7 00:55:15.860074 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/nvme0n1p6 scanned by mount (1253)
Mar 7 00:55:15.860120 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 00:55:15.860148 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm
Mar 7 00:55:15.860197 kernel: BTRFS info (device nvme0n1p6): using free space tree
Mar 7 00:55:15.840872 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 7 00:55:15.863567 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 7 00:55:15.887216 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations
Mar 7 00:55:15.890003 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 7 00:55:16.257301 initrd-setup-root[1277]: cut: /sysroot/etc/passwd: No such file or directory
Mar 7 00:55:16.289677 initrd-setup-root[1284]: cut: /sysroot/etc/group: No such file or directory
Mar 7 00:55:16.299869 initrd-setup-root[1291]: cut: /sysroot/etc/shadow: No such file or directory
Mar 7 00:55:16.309394 initrd-setup-root[1298]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 7 00:55:16.664493 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 7 00:55:16.680399 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 7 00:55:16.686419 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 7 00:55:16.709731 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 7 00:55:16.712792 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 00:55:16.749282 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 7 00:55:16.767737 ignition[1365]: INFO : Ignition 2.19.0
Mar 7 00:55:16.767737 ignition[1365]: INFO : Stage: mount
Mar 7 00:55:16.772788 ignition[1365]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 7 00:55:16.772788 ignition[1365]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws"
Mar 7 00:55:16.772788 ignition[1365]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1
Mar 7 00:55:16.781068 ignition[1365]: INFO : PUT result: OK
Mar 7 00:55:16.786005 ignition[1365]: INFO : mount: mount passed
Mar 7 00:55:16.789470 ignition[1365]: INFO : Ignition finished successfully
Mar 7 00:55:16.793190 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 7 00:55:16.806349 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 7 00:55:16.828666 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 7 00:55:16.867192 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/nvme0n1p6 scanned by mount (1379)
Mar 7 00:55:16.867255 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 00:55:16.869149 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm
Mar 7 00:55:16.870438 kernel: BTRFS info (device nvme0n1p6): using free space tree
Mar 7 00:55:16.876206 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations
Mar 7 00:55:16.879931 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 7 00:55:16.903366 systemd-networkd[1203]: eth0: Gained IPv6LL
Mar 7 00:55:16.926469 ignition[1396]: INFO : Ignition 2.19.0
Mar 7 00:55:16.928767 ignition[1396]: INFO : Stage: files
Mar 7 00:55:16.928767 ignition[1396]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 7 00:55:16.928767 ignition[1396]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws"
Mar 7 00:55:16.928767 ignition[1396]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1
Mar 7 00:55:16.940421 ignition[1396]: INFO : PUT result: OK
Mar 7 00:55:16.945782 ignition[1396]: DEBUG : files: compiled without relabeling support, skipping
Mar 7 00:55:16.958724 ignition[1396]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 7 00:55:16.958724 ignition[1396]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 7 00:55:17.008419 ignition[1396]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 7 00:55:17.012146 ignition[1396]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 7 00:55:17.016255 unknown[1396]: wrote ssh authorized keys file for user: core
Mar 7 00:55:17.019343 ignition[1396]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 7 00:55:17.023105 ignition[1396]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Mar 7 00:55:17.023105 ignition[1396]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Mar 7 00:55:17.107154 ignition[1396]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 7 00:55:17.250571 ignition[1396]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Mar 7 00:55:17.258159 ignition[1396]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 7 00:55:17.258159 ignition[1396]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 7 00:55:17.258159 ignition[1396]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 7 00:55:17.258159 ignition[1396]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 7 00:55:17.258159 ignition[1396]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 7 00:55:17.258159 ignition[1396]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 7 00:55:17.258159 ignition[1396]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 7 00:55:17.258159 ignition[1396]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 7 00:55:17.258159 ignition[1396]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 7 00:55:17.258159 ignition[1396]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 7 00:55:17.258159 ignition[1396]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-arm64.raw"
Mar 7 00:55:17.258159 ignition[1396]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-arm64.raw"
Mar 7 00:55:17.258159 ignition[1396]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-arm64.raw"
Mar 7 00:55:17.258159 ignition[1396]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.8-arm64.raw: attempt #1
Mar 7 00:55:17.597271 ignition[1396]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 7 00:55:18.050413 ignition[1396]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-arm64.raw"
Mar 7 00:55:18.050413 ignition[1396]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 7 00:55:18.063234 ignition[1396]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 7 00:55:18.068603 ignition[1396]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 7 00:55:18.068603 ignition[1396]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 7 00:55:18.068603 ignition[1396]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Mar 7 00:55:18.068603 ignition[1396]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Mar 7 00:55:18.068603 ignition[1396]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 7 00:55:18.068603 ignition[1396]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 7 00:55:18.068603 ignition[1396]: INFO : files: files passed
Mar 7 00:55:18.068603 ignition[1396]: INFO : Ignition finished successfully
Mar 7 00:55:18.074230 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 7 00:55:18.109833 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 7 00:55:18.122709 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 7 00:55:18.132534 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 7 00:55:18.133882 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 7 00:55:18.168212 initrd-setup-root-after-ignition[1425]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 7 00:55:18.168212 initrd-setup-root-after-ignition[1425]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 7 00:55:18.180243 initrd-setup-root-after-ignition[1429]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 7 00:55:18.176866 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 7 00:55:18.182639 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 7 00:55:18.198122 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 7 00:55:18.259136 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 7 00:55:18.259392 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 7 00:55:18.271814 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 7 00:55:18.274336 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 7 00:55:18.277264 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 7 00:55:18.292479 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 7 00:55:18.326372 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 7 00:55:18.340502 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 7 00:55:18.367028 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 7 00:55:18.370061 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 7 00:55:18.373547 systemd[1]: Stopped target timers.target - Timer Units.
Mar 7 00:55:18.375926 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 7 00:55:18.376189 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 7 00:55:18.380664 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 7 00:55:18.384785 systemd[1]: Stopped target basic.target - Basic System.
Mar 7 00:55:18.390255 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 7 00:55:18.394756 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 7 00:55:18.399927 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 7 00:55:18.405938 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 7 00:55:18.411283 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 7 00:55:18.416144 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 7 00:55:18.421984 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 7 00:55:18.427684 systemd[1]: Stopped target swap.target - Swaps.
Mar 7 00:55:18.432947 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 7 00:55:18.433245 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 7 00:55:18.443327 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 7 00:55:18.448534 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 7 00:55:18.451607 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 7 00:55:18.451922 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 7 00:55:18.452587 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 7 00:55:18.452858 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 7 00:55:18.462851 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 7 00:55:18.463104 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 7 00:55:18.465653 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 7 00:55:18.465857 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 7 00:55:18.495612 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 7 00:55:18.501669 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 7 00:55:18.516587 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 7 00:55:18.517869 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 7 00:55:18.530494 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 7 00:55:18.530844 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 7 00:55:18.556383 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 7 00:55:18.556602 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 7 00:55:18.573810 ignition[1449]: INFO : Ignition 2.19.0
Mar 7 00:55:18.573810 ignition[1449]: INFO : Stage: umount
Mar 7 00:55:18.579697 ignition[1449]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 7 00:55:18.582694 ignition[1449]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws"
Mar 7 00:55:18.582694 ignition[1449]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1
Mar 7 00:55:18.592289 ignition[1449]: INFO : PUT result: OK
Mar 7 00:55:18.599386 ignition[1449]: INFO : umount: umount passed
Mar 7 00:55:18.599386 ignition[1449]: INFO : Ignition finished successfully
Mar 7 00:55:18.594952 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 7 00:55:18.606663 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 7 00:55:18.609260 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 7 00:55:18.612352 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 7 00:55:18.612529 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 7 00:55:18.626072 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 7 00:55:18.626280 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 7 00:55:18.629449 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 7 00:55:18.629555 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 7 00:55:18.632460 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 7 00:55:18.632546 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 7 00:55:18.635132 systemd[1]: Stopped target network.target - Network.
Mar 7 00:55:18.638146 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 7 00:55:18.638256 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 7 00:55:18.648481 systemd[1]: Stopped target paths.target - Path Units.
Mar 7 00:55:18.648784 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 7 00:55:18.654706 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 7 00:55:18.675677 systemd[1]: Stopped target slices.target - Slice Units.
Mar 7 00:55:18.679230 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 7 00:55:18.681702 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 7 00:55:18.681782 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 7 00:55:18.684396 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 7 00:55:18.684474 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 7 00:55:18.689312 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 7 00:55:18.689431 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 7 00:55:18.692308 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 7 00:55:18.693413 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 7 00:55:18.696696 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 7 00:55:18.696807 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 7 00:55:18.701994 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 7 00:55:18.711148 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 7 00:55:18.716309 systemd-networkd[1203]: eth0: DHCPv6 lease lost
Mar 7 00:55:18.727622 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 7 00:55:18.727890 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 7 00:55:18.742141 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 7 00:55:18.742372 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 7 00:55:18.748229 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 7 00:55:18.748357 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 7 00:55:18.780975 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 7 00:55:18.783769 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 7 00:55:18.783894 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 7 00:55:18.787275 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 7 00:55:18.787370 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 7 00:55:18.790070 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 7 00:55:18.790191 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 7 00:55:18.792967 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 7 00:55:18.793056 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 7 00:55:18.798602 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 7 00:55:18.851812 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 7 00:55:18.853127 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 7 00:55:18.859085 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 7 00:55:18.859313 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 7 00:55:18.865802 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 7 00:55:18.865949 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 7 00:55:18.868336 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 7 00:55:18.868421 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 7 00:55:18.868627 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 7 00:55:18.868722 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 7 00:55:18.869903 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 7 00:55:18.869992 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 7 00:55:18.873503 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 7 00:55:18.873598 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 00:55:18.913015 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 7 00:55:18.922777 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 7 00:55:18.923387 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 7 00:55:18.925158 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Mar 7 00:55:18.925267 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 7 00:55:18.925927 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 7 00:55:18.926004 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 7 00:55:18.930871 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 7 00:55:18.930980 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 00:55:18.973659 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 7 00:55:18.977275 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 7 00:55:18.983602 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 7 00:55:18.997607 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 7 00:55:19.018795 systemd[1]: Switching root.
Mar 7 00:55:19.073708 systemd-journald[252]: Journal stopped
Mar 7 00:55:21.580992 systemd-journald[252]: Received SIGTERM from PID 1 (systemd).
Mar 7 00:55:21.581149 kernel: SELinux: policy capability network_peer_controls=1
Mar 7 00:55:21.581228 kernel: SELinux: policy capability open_perms=1
Mar 7 00:55:21.581263 kernel: SELinux: policy capability extended_socket_class=1
Mar 7 00:55:21.581294 kernel: SELinux: policy capability always_check_network=0
Mar 7 00:55:21.581324 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 7 00:55:21.581354 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 7 00:55:21.581385 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 7 00:55:21.581417 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 7 00:55:21.581455 kernel: audit: type=1403 audit(1772844919.597:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 7 00:55:21.581490 systemd[1]: Successfully loaded SELinux policy in 83.945ms.
Mar 7 00:55:21.581537 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 24.057ms.
Mar 7 00:55:21.581573 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Mar 7 00:55:21.581606 systemd[1]: Detected virtualization amazon.
Mar 7 00:55:21.581639 systemd[1]: Detected architecture arm64.
Mar 7 00:55:21.581681 systemd[1]: Detected first boot.
Mar 7 00:55:21.581710 systemd[1]: Initializing machine ID from VM UUID.
Mar 7 00:55:21.581742 zram_generator::config[1492]: No configuration found.
Mar 7 00:55:21.581785 systemd[1]: Populated /etc with preset unit settings.
Mar 7 00:55:21.581819 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 7 00:55:21.581851 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 7 00:55:21.581903 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 7 00:55:21.581940 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 7 00:55:21.581971 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 7 00:55:21.582001 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 7 00:55:21.582031 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 7 00:55:21.582066 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 7 00:55:21.582103 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 7 00:55:21.582135 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 7 00:55:21.582964 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 7 00:55:21.583034 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 7 00:55:21.583075 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 7 00:55:21.583107 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 7 00:55:21.583143 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 7 00:55:21.583219 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 7 00:55:21.583254 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 7 00:55:21.583301 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 7 00:55:21.583334 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 7 00:55:21.583367 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 7 00:55:21.583400 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 7 00:55:21.583432 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 7 00:55:21.583465 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 7 00:55:21.583500 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 7 00:55:21.583534 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 7 00:55:21.583567 systemd[1]: Reached target slices.target - Slice Units.
Mar 7 00:55:21.583599 systemd[1]: Reached target swap.target - Swaps.
Mar 7 00:55:21.583631 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 7 00:55:21.583660 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 7 00:55:21.583692 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 7 00:55:21.583723 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 7 00:55:21.583759 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 7 00:55:21.583789 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 7 00:55:21.583824 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 7 00:55:21.583857 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 7 00:55:21.583908 systemd[1]: Mounting media.mount - External Media Directory...
Mar 7 00:55:21.583946 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 7 00:55:21.583981 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 7 00:55:21.584014 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 7 00:55:21.584046 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 7 00:55:21.584076 systemd[1]: Reached target machines.target - Containers.
Mar 7 00:55:21.584111 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 7 00:55:21.584155 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 7 00:55:21.584214 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 7 00:55:21.584246 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 7 00:55:21.584276 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 7 00:55:21.584306 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 7 00:55:21.584336 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 7 00:55:21.584366 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 7 00:55:21.584401 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 7 00:55:21.584433 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 7 00:55:21.584463 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 7 00:55:21.584493 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 7 00:55:21.584523 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 7 00:55:21.584552 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 7 00:55:21.584595 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 7 00:55:21.584624 kernel: loop: module loaded
Mar 7 00:55:21.584657 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 7 00:55:21.584691 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 7 00:55:21.584721 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 7 00:55:21.584752 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 7 00:55:21.584784 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 7 00:55:21.584816 systemd[1]: Stopped verity-setup.service.
Mar 7 00:55:21.584845 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 7 00:55:21.584873 kernel: fuse: init (API version 7.39)
Mar 7 00:55:21.584902 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 7 00:55:21.584933 systemd[1]: Mounted media.mount - External Media Directory.
Mar 7 00:55:21.584976 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 7 00:55:21.585009 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 7 00:55:21.585041 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 7 00:55:21.585071 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 7 00:55:21.585100 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 7 00:55:21.585135 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 7 00:55:21.585236 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 7 00:55:21.585277 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 7 00:55:21.585311 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 7 00:55:21.585341 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 7 00:55:21.585371 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 7 00:55:21.585401 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 7 00:55:21.585434 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 7 00:55:21.585466 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 7 00:55:21.585503 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 7 00:55:21.585540 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 7 00:55:21.585621 systemd-journald[1578]: Collecting audit messages is disabled.
Mar 7 00:55:21.585673 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 7 00:55:21.585711 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 7 00:55:21.585745 kernel: ACPI: bus type drm_connector registered
Mar 7 00:55:21.585774 systemd-journald[1578]: Journal started
Mar 7 00:55:21.585824 systemd-journald[1578]: Runtime Journal (/run/log/journal/ec22dde7d0dc3c597d3acf34c2507ca9) is 8.0M, max 75.3M, 67.3M free.
Mar 7 00:55:20.843968 systemd[1]: Queued start job for default target multi-user.target.
Mar 7 00:55:20.907251 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6.
Mar 7 00:55:20.908106 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 7 00:55:21.599997 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 7 00:55:21.613945 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 7 00:55:21.614036 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 7 00:55:21.622906 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 7 00:55:21.634061 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Mar 7 00:55:21.647318 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 7 00:55:21.667289 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 7 00:55:21.672214 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 7 00:55:21.683635 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 7 00:55:21.683722 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 7 00:55:21.706695 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 7 00:55:21.711212 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 7 00:55:21.728052 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 7 00:55:21.744330 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 7 00:55:21.754292 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 7 00:55:21.768215 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 7 00:55:21.770439 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 7 00:55:21.776243 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 7 00:55:21.776613 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 7 00:55:21.781629 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 7 00:55:21.782689 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 7 00:55:21.785298 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 7 00:55:21.821611 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 7 00:55:21.879778 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 7 00:55:21.883433 kernel: loop0: detected capacity change from 0 to 114432
Mar 7 00:55:21.897521 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 7 00:55:21.913622 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Mar 7 00:55:21.919401 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 7 00:55:21.942326 systemd-tmpfiles[1605]: ACLs are not supported, ignoring.
Mar 7 00:55:21.942368 systemd-tmpfiles[1605]: ACLs are not supported, ignoring.
Mar 7 00:55:21.967275 systemd-journald[1578]: Time spent on flushing to /var/log/journal/ec22dde7d0dc3c597d3acf34c2507ca9 is 67.102ms for 906 entries.
Mar 7 00:55:21.967275 systemd-journald[1578]: System Journal (/var/log/journal/ec22dde7d0dc3c597d3acf34c2507ca9) is 8.0M, max 195.6M, 187.6M free.
Mar 7 00:55:22.062596 systemd-journald[1578]: Received client request to flush runtime journal.
Mar 7 00:55:22.062682 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 7 00:55:22.062719 kernel: loop1: detected capacity change from 0 to 209336
Mar 7 00:55:21.973369 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 7 00:55:21.979398 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 7 00:55:21.988472 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Mar 7 00:55:21.994418 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 7 00:55:22.017517 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 7 00:55:22.028408 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Mar 7 00:55:22.073765 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 7 00:55:22.098815 udevadm[1639]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in.
Mar 7 00:55:22.133222 kernel: loop2: detected capacity change from 0 to 114328
Mar 7 00:55:22.143402 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 7 00:55:22.158687 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 7 00:55:22.197074 systemd-tmpfiles[1645]: ACLs are not supported, ignoring.
Mar 7 00:55:22.197612 systemd-tmpfiles[1645]: ACLs are not supported, ignoring.
Mar 7 00:55:22.207830 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 7 00:55:22.266885 kernel: loop3: detected capacity change from 0 to 52536
Mar 7 00:55:22.316214 kernel: loop4: detected capacity change from 0 to 114432
Mar 7 00:55:22.331206 kernel: loop5: detected capacity change from 0 to 209336
Mar 7 00:55:22.362611 kernel: loop6: detected capacity change from 0 to 114328
Mar 7 00:55:22.376202 kernel: loop7: detected capacity change from 0 to 52536
Mar 7 00:55:22.393749 (sd-merge)[1650]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-ami'.
Mar 7 00:55:22.394769 (sd-merge)[1650]: Merged extensions into '/usr'.
Mar 7 00:55:22.407356 systemd[1]: Reloading requested from client PID 1604 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 7 00:55:22.407394 systemd[1]: Reloading...
Mar 7 00:55:22.585987 zram_generator::config[1676]: No configuration found.
Mar 7 00:55:22.881591 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 7 00:55:23.001259 systemd[1]: Reloading finished in 592 ms.
Mar 7 00:55:23.059264 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 7 00:55:23.078645 systemd[1]: Starting ensure-sysext.service...
Mar 7 00:55:23.100603 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 7 00:55:23.127966 systemd[1]: Reloading requested from client PID 1727 ('systemctl') (unit ensure-sysext.service)...
Mar 7 00:55:23.128007 systemd[1]: Reloading...
Mar 7 00:55:23.178573 systemd-tmpfiles[1728]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 7 00:55:23.179303 systemd-tmpfiles[1728]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 7 00:55:23.183124 systemd-tmpfiles[1728]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 7 00:55:23.183750 systemd-tmpfiles[1728]: ACLs are not supported, ignoring.
Mar 7 00:55:23.183935 systemd-tmpfiles[1728]: ACLs are not supported, ignoring.
Mar 7 00:55:23.191767 systemd-tmpfiles[1728]: Detected autofs mount point /boot during canonicalization of boot.
Mar 7 00:55:23.191794 systemd-tmpfiles[1728]: Skipping /boot
Mar 7 00:55:23.215423 systemd-tmpfiles[1728]: Detected autofs mount point /boot during canonicalization of boot.
Mar 7 00:55:23.215445 systemd-tmpfiles[1728]: Skipping /boot
Mar 7 00:55:23.325224 zram_generator::config[1756]: No configuration found.
Mar 7 00:55:23.456555 ldconfig[1600]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 7 00:55:23.570443 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 7 00:55:23.684705 systemd[1]: Reloading finished in 556 ms.
Mar 7 00:55:23.713223 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 7 00:55:23.716481 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 7 00:55:23.731061 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 7 00:55:23.753521 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Mar 7 00:55:23.769465 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 7 00:55:23.787489 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 7 00:55:23.805635 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 7 00:55:23.821597 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 7 00:55:23.831528 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 7 00:55:23.845202 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 7 00:55:23.856315 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 7 00:55:23.864658 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 7 00:55:23.874574 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 7 00:55:23.878080 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 7 00:55:23.902503 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 7 00:55:23.914794 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 7 00:55:23.915846 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 7 00:55:23.921995 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 7 00:55:23.939002 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 7 00:55:23.941296 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 7 00:55:23.952074 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 7 00:55:23.965699 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 7 00:55:23.969937 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 7 00:55:23.970474 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 7 00:55:23.970794 systemd[1]: Reached target time-set.target - System Time Set.
Mar 7 00:55:23.978380 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 7 00:55:23.980286 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 7 00:55:23.990701 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 7 00:55:23.993343 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 7 00:55:24.007348 systemd[1]: Finished ensure-sysext.service.
Mar 7 00:55:24.021041 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 7 00:55:24.030953 systemd-udevd[1822]: Using default interface naming scheme 'v255'.
Mar 7 00:55:24.042526 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 7 00:55:24.050654 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 7 00:55:24.051851 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 7 00:55:24.070493 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 7 00:55:24.095035 augenrules[1846]: No rules
Mar 7 00:55:24.101934 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Mar 7 00:55:24.121119 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 7 00:55:24.134080 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 7 00:55:24.145075 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 7 00:55:24.153596 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 7 00:55:24.158282 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 7 00:55:24.180858 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 7 00:55:24.319132 systemd-resolved[1819]: Positive Trust Anchors:
Mar 7 00:55:24.320694 systemd-resolved[1819]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 7 00:55:24.320898 systemd-resolved[1819]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 7 00:55:24.338199 systemd-networkd[1864]: lo: Link UP
Mar 7 00:55:24.338212 systemd-networkd[1864]: lo: Gained carrier
Mar 7 00:55:24.342211 systemd-networkd[1864]: Enumeration completed
Mar 7 00:55:24.342390 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 7 00:55:24.353327 systemd-resolved[1819]: Defaulting to hostname 'linux'.
Mar 7 00:55:24.381495 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 7 00:55:24.389195 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 7 00:55:24.398832 systemd[1]: Reached target network.target - Network.
Mar 7 00:55:24.404328 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 7 00:55:24.419222 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 7 00:55:24.426056 (udev-worker)[1870]: Network interface NamePolicy= disabled on kernel command line.
Mar 7 00:55:24.487380 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 36 scanned by (udev-worker) (1877)
Mar 7 00:55:24.559148 systemd-networkd[1864]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 00:55:24.559194 systemd-networkd[1864]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 7 00:55:24.562142 systemd-networkd[1864]: eth0: Link UP
Mar 7 00:55:24.564648 systemd-networkd[1864]: eth0: Gained carrier
Mar 7 00:55:24.564698 systemd-networkd[1864]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 00:55:24.609423 systemd-networkd[1864]: eth0: DHCPv4 address 172.31.19.200/20, gateway 172.31.16.1 acquired from 172.31.16.1
Mar 7 00:55:24.791901 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 00:55:24.855524 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM.
Mar 7 00:55:24.866558 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 7 00:55:24.870194 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Mar 7 00:55:24.875488 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Mar 7 00:55:24.907130 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 7 00:55:24.928210 lvm[1980]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 7 00:55:24.970872 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 00:55:24.974658 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Mar 7 00:55:24.980708 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 7 00:55:24.983939 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 7 00:55:24.987082 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 7 00:55:24.990673 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 7 00:55:24.994780 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 7 00:55:24.998108 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 7 00:55:25.001698 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 7 00:55:25.005287 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 7 00:55:25.005351 systemd[1]: Reached target paths.target - Path Units.
Mar 7 00:55:25.007777 systemd[1]: Reached target timers.target - Timer Units.
Mar 7 00:55:25.011366 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 7 00:55:25.017439 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 7 00:55:25.029829 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 7 00:55:25.034813 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Mar 7 00:55:25.041304 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 7 00:55:25.044617 systemd[1]: Reached target sockets.target - Socket Units.
Mar 7 00:55:25.048548 systemd[1]: Reached target basic.target - Basic System.
Mar 7 00:55:25.051319 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 7 00:55:25.051957 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 7 00:55:25.059420 systemd[1]: Starting containerd.service - containerd container runtime... Mar 7 00:55:25.072507 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 7 00:55:25.075191 lvm[1989]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 7 00:55:25.087510 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 7 00:55:25.100420 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 7 00:55:25.114324 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 7 00:55:25.118896 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 7 00:55:25.128960 jq[1993]: false Mar 7 00:55:25.135650 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 7 00:55:25.148532 systemd[1]: Started ntpd.service - Network Time Service. Mar 7 00:55:25.157519 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 7 00:55:25.170315 systemd[1]: Starting setup-oem.service - Setup OEM... Mar 7 00:55:25.178543 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 7 00:55:25.189550 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 7 00:55:25.208880 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 7 00:55:25.213858 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 7 00:55:25.214714 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 7 00:55:25.220504 systemd[1]: Starting update-engine.service - Update Engine... 
Mar 7 00:55:25.246595 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 7 00:55:25.255299 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Mar 7 00:55:25.266924 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 7 00:55:25.268151 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 7 00:55:25.317392 extend-filesystems[1994]: Found loop4 Mar 7 00:55:25.317392 extend-filesystems[1994]: Found loop5 Mar 7 00:55:25.317392 extend-filesystems[1994]: Found loop6 Mar 7 00:55:25.317392 extend-filesystems[1994]: Found loop7 Mar 7 00:55:25.317392 extend-filesystems[1994]: Found nvme0n1 Mar 7 00:55:25.317392 extend-filesystems[1994]: Found nvme0n1p1 Mar 7 00:55:25.302508 (ntainerd)[2014]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 7 00:55:25.378233 extend-filesystems[1994]: Found nvme0n1p2 Mar 7 00:55:25.378233 extend-filesystems[1994]: Found nvme0n1p3 Mar 7 00:55:25.378233 extend-filesystems[1994]: Found usr Mar 7 00:55:25.378233 extend-filesystems[1994]: Found nvme0n1p4 Mar 7 00:55:25.378233 extend-filesystems[1994]: Found nvme0n1p6 Mar 7 00:55:25.378233 extend-filesystems[1994]: Found nvme0n1p7 Mar 7 00:55:25.378233 extend-filesystems[1994]: Found nvme0n1p9 Mar 7 00:55:25.378233 extend-filesystems[1994]: Checking size of /dev/nvme0n1p9 Mar 7 00:55:25.463537 jq[2007]: true Mar 7 00:55:25.324751 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. 
Mar 7 00:55:25.472252 ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: ntpd 4.2.8p17@1.4004-o Fri Mar 6 22:14:43 UTC 2026 (1): Starting Mar 7 00:55:25.472252 ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Mar 7 00:55:25.472252 ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: ---------------------------------------------------- Mar 7 00:55:25.472252 ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: ntp-4 is maintained by Network Time Foundation, Mar 7 00:55:25.472252 ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Mar 7 00:55:25.472252 ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: corporation. Support and training for ntp-4 are Mar 7 00:55:25.472252 ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: available at https://www.nwtime.org/support Mar 7 00:55:25.472252 ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: ---------------------------------------------------- Mar 7 00:55:25.472252 ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: proto: precision = 0.096 usec (-23) Mar 7 00:55:25.472252 ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: basedate set to 2026-02-22 Mar 7 00:55:25.472252 ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: gps base set to 2026-02-22 (week 2407) Mar 7 00:55:25.472252 ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: Listen and drop on 0 v6wildcard [::]:123 Mar 7 00:55:25.472252 ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Mar 7 00:55:25.472252 ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: Listen normally on 2 lo 127.0.0.1:123 Mar 7 00:55:25.472252 ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: Listen normally on 3 eth0 172.31.19.200:123 Mar 7 00:55:25.472252 ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: Listen normally on 4 lo [::1]:123 Mar 7 00:55:25.472252 ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: bind(21) AF_INET6 fe80::4d7:31ff:fef3:94f%2#123 flags 0x11 failed: Cannot assign requested address Mar 7 00:55:25.472252 ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: unable to create socket on eth0 (5) for fe80::4d7:31ff:fef3:94f%2#123 Mar 7 00:55:25.472252 
ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: failed to init interface for address fe80::4d7:31ff:fef3:94f%2 Mar 7 00:55:25.472252 ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: Listening on routing socket on fd #21 for interface updates Mar 7 00:55:25.391157 dbus-daemon[1992]: [system] SELinux support is enabled Mar 7 00:55:25.326297 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 7 00:55:25.528345 ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 7 00:55:25.528345 ntpd[1996]: 7 Mar 00:55:25 ntpd[1996]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 7 00:55:25.410989 ntpd[1996]: ntpd 4.2.8p17@1.4004-o Fri Mar 6 22:14:43 UTC 2026 (1): Starting Mar 7 00:55:25.391496 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 7 00:55:25.529563 extend-filesystems[1994]: Resized partition /dev/nvme0n1p9 Mar 7 00:55:25.411043 ntpd[1996]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Mar 7 00:55:25.413120 systemd[1]: motdgen.service: Deactivated successfully. Mar 7 00:55:25.548288 extend-filesystems[2041]: resize2fs 1.47.1 (20-May-2024) Mar 7 00:55:25.411065 ntpd[1996]: ---------------------------------------------------- Mar 7 00:55:25.415820 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 7 00:55:25.561928 tar[2018]: linux-arm64/LICENSE Mar 7 00:55:25.561928 tar[2018]: linux-arm64/helm Mar 7 00:55:25.411085 ntpd[1996]: ntp-4 is maintained by Network Time Foundation, Mar 7 00:55:25.439597 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 7 00:55:25.567029 jq[2025]: true Mar 7 00:55:25.575032 kernel: EXT4-fs (nvme0n1p9): resizing filesystem from 553472 to 3587067 blocks Mar 7 00:55:25.411105 ntpd[1996]: Inc. 
(NTF), a non-profit 501(c)(3) public-benefit Mar 7 00:55:25.439660 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 7 00:55:25.578295 update_engine[2005]: I20260307 00:55:25.568120 2005 main.cc:92] Flatcar Update Engine starting Mar 7 00:55:25.411124 ntpd[1996]: corporation. Support and training for ntp-4 are Mar 7 00:55:25.443432 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 7 00:55:25.411144 ntpd[1996]: available at https://www.nwtime.org/support Mar 7 00:55:25.443475 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 7 00:55:25.411197 ntpd[1996]: ---------------------------------------------------- Mar 7 00:55:25.483728 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Mar 7 00:55:25.419535 dbus-daemon[1992]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.1' (uid=244 pid=1864 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Mar 7 00:55:25.426894 ntpd[1996]: proto: precision = 0.096 usec (-23) Mar 7 00:55:25.428435 ntpd[1996]: basedate set to 2026-02-22 Mar 7 00:55:25.428474 ntpd[1996]: gps base set to 2026-02-22 (week 2407) Mar 7 00:55:25.434977 ntpd[1996]: Listen and drop on 0 v6wildcard [::]:123 Mar 7 00:55:25.435079 ntpd[1996]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Mar 7 00:55:25.435621 ntpd[1996]: Listen normally on 2 lo 127.0.0.1:123 Mar 7 00:55:25.594509 systemd[1]: Started update-engine.service - Update Engine. 
Mar 7 00:55:25.437316 ntpd[1996]: Listen normally on 3 eth0 172.31.19.200:123 Mar 7 00:55:25.437399 ntpd[1996]: Listen normally on 4 lo [::1]:123 Mar 7 00:55:25.437484 ntpd[1996]: bind(21) AF_INET6 fe80::4d7:31ff:fef3:94f%2#123 flags 0x11 failed: Cannot assign requested address Mar 7 00:55:25.437525 ntpd[1996]: unable to create socket on eth0 (5) for fe80::4d7:31ff:fef3:94f%2#123 Mar 7 00:55:25.437554 ntpd[1996]: failed to init interface for address fe80::4d7:31ff:fef3:94f%2 Mar 7 00:55:25.437618 ntpd[1996]: Listening on routing socket on fd #21 for interface updates Mar 7 00:55:25.447100 dbus-daemon[1992]: [system] Successfully activated service 'org.freedesktop.systemd1' Mar 7 00:55:25.488747 ntpd[1996]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 7 00:55:25.488806 ntpd[1996]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 7 00:55:25.609253 update_engine[2005]: I20260307 00:55:25.608315 2005 update_check_scheduler.cc:74] Next update check in 6m28s Mar 7 00:55:25.618739 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 7 00:55:25.649903 systemd[1]: Finished setup-oem.service - Setup OEM. Mar 7 00:55:25.745133 kernel: EXT4-fs (nvme0n1p9): resized filesystem to 3587067 Mar 7 00:55:25.770159 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 36 scanned by (udev-worker) (1870) Mar 7 00:55:25.741460 systemd-logind[2003]: Watching system buttons on /dev/input/event0 (Power Button) Mar 7 00:55:25.741514 systemd-logind[2003]: Watching system buttons on /dev/input/event1 (Sleep Button) Mar 7 00:55:25.743897 systemd-logind[2003]: New seat seat0. Mar 7 00:55:25.746311 systemd[1]: Started systemd-logind.service - User Login Management. 
Mar 7 00:55:25.784206 coreos-metadata[1991]: Mar 07 00:55:25.776 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Mar 7 00:55:25.784206 coreos-metadata[1991]: Mar 07 00:55:25.776 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-id: Attempt #1 Mar 7 00:55:25.784206 coreos-metadata[1991]: Mar 07 00:55:25.776 INFO Fetch successful Mar 7 00:55:25.784206 coreos-metadata[1991]: Mar 07 00:55:25.776 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-type: Attempt #1 Mar 7 00:55:25.784206 coreos-metadata[1991]: Mar 07 00:55:25.776 INFO Fetch successful Mar 7 00:55:25.784206 coreos-metadata[1991]: Mar 07 00:55:25.776 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/local-ipv4: Attempt #1 Mar 7 00:55:25.784206 coreos-metadata[1991]: Mar 07 00:55:25.780 INFO Fetch successful Mar 7 00:55:25.784206 coreos-metadata[1991]: Mar 07 00:55:25.780 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-ipv4: Attempt #1 Mar 7 00:55:25.784206 coreos-metadata[1991]: Mar 07 00:55:25.780 INFO Fetch successful Mar 7 00:55:25.784206 coreos-metadata[1991]: Mar 07 00:55:25.780 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/ipv6: Attempt #1 Mar 7 00:55:25.784206 coreos-metadata[1991]: Mar 07 00:55:25.780 INFO Fetch failed with 404: resource not found Mar 7 00:55:25.784206 coreos-metadata[1991]: Mar 07 00:55:25.780 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone: Attempt #1 Mar 7 00:55:25.784206 coreos-metadata[1991]: Mar 07 00:55:25.780 INFO Fetch successful Mar 7 00:55:25.784206 coreos-metadata[1991]: Mar 07 00:55:25.780 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone-id: Attempt #1 Mar 7 00:55:25.784206 coreos-metadata[1991]: Mar 07 00:55:25.783 INFO Fetch successful Mar 7 00:55:25.784206 coreos-metadata[1991]: Mar 07 00:55:25.783 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/hostname: Attempt #1 Mar 7 00:55:25.784206 
coreos-metadata[1991]: Mar 07 00:55:25.783 INFO Fetch successful Mar 7 00:55:25.784206 coreos-metadata[1991]: Mar 07 00:55:25.783 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-hostname: Attempt #1 Mar 7 00:55:25.784206 coreos-metadata[1991]: Mar 07 00:55:25.783 INFO Fetch successful Mar 7 00:55:25.784206 coreos-metadata[1991]: Mar 07 00:55:25.783 INFO Fetching http://169.254.169.254/2021-01-03/dynamic/instance-identity/document: Attempt #1 Mar 7 00:55:25.784206 coreos-metadata[1991]: Mar 07 00:55:25.783 INFO Fetch successful Mar 7 00:55:25.785693 extend-filesystems[2041]: Filesystem at /dev/nvme0n1p9 is mounted on /; on-line resizing required Mar 7 00:55:25.785693 extend-filesystems[2041]: old_desc_blocks = 1, new_desc_blocks = 2 Mar 7 00:55:25.785693 extend-filesystems[2041]: The filesystem on /dev/nvme0n1p9 is now 3587067 (4k) blocks long. Mar 7 00:55:25.802732 extend-filesystems[1994]: Resized filesystem in /dev/nvme0n1p9 Mar 7 00:55:25.786925 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 7 00:55:25.787328 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 7 00:55:25.879545 bash[2076]: Updated "/home/core/.ssh/authorized_keys" Mar 7 00:55:25.888492 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 7 00:55:25.940146 systemd[1]: Starting sshkeys.service... Mar 7 00:55:25.982684 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 7 00:55:25.995817 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 7 00:55:25.999156 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 7 00:55:26.007487 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Mar 7 00:55:26.020472 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Mar 7 00:55:26.239124 dbus-daemon[1992]: [system] Successfully activated service 'org.freedesktop.hostname1' Mar 7 00:55:26.239702 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Mar 7 00:55:26.247996 dbus-daemon[1992]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.5' (uid=0 pid=2039 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Mar 7 00:55:26.277081 containerd[2014]: time="2026-03-07T00:55:26.276932110Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Mar 7 00:55:26.325491 systemd[1]: Starting polkit.service - Authorization Manager... Mar 7 00:55:26.393557 polkitd[2143]: Started polkitd version 121 Mar 7 00:55:26.396399 containerd[2014]: time="2026-03-07T00:55:26.396327622Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Mar 7 00:55:26.405954 coreos-metadata[2110]: Mar 07 00:55:26.405 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Mar 7 00:55:26.407856 coreos-metadata[2110]: Mar 07 00:55:26.407 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys: Attempt #1 Mar 7 00:55:26.410825 coreos-metadata[2110]: Mar 07 00:55:26.408 INFO Fetch successful Mar 7 00:55:26.410825 coreos-metadata[2110]: Mar 07 00:55:26.410 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys/0/openssh-key: Attempt #1 Mar 7 00:55:26.414141 coreos-metadata[2110]: Mar 07 00:55:26.414 INFO Fetch successful Mar 7 00:55:26.414469 ntpd[1996]: bind(24) AF_INET6 fe80::4d7:31ff:fef3:94f%2#123 flags 0x11 failed: Cannot assign requested address Mar 7 00:55:26.416389 ntpd[1996]: 7 Mar 00:55:26 ntpd[1996]: bind(24) AF_INET6 fe80::4d7:31ff:fef3:94f%2#123 flags 0x11 failed: Cannot assign requested address Mar 7 00:55:26.416389 ntpd[1996]: 7 Mar 00:55:26 ntpd[1996]: unable to create socket on eth0 (6) for 
fe80::4d7:31ff:fef3:94f%2#123 Mar 7 00:55:26.416389 ntpd[1996]: 7 Mar 00:55:26 ntpd[1996]: failed to init interface for address fe80::4d7:31ff:fef3:94f%2 Mar 7 00:55:26.414540 ntpd[1996]: unable to create socket on eth0 (6) for fe80::4d7:31ff:fef3:94f%2#123 Mar 7 00:55:26.414570 ntpd[1996]: failed to init interface for address fe80::4d7:31ff:fef3:94f%2 Mar 7 00:55:26.420829 unknown[2110]: wrote ssh authorized keys file for user: core Mar 7 00:55:26.423239 containerd[2014]: time="2026-03-07T00:55:26.422677079Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.127-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Mar 7 00:55:26.423239 containerd[2014]: time="2026-03-07T00:55:26.422748851Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Mar 7 00:55:26.423239 containerd[2014]: time="2026-03-07T00:55:26.422786255Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Mar 7 00:55:26.423239 containerd[2014]: time="2026-03-07T00:55:26.423147371Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Mar 7 00:55:26.423239 containerd[2014]: time="2026-03-07T00:55:26.423223643Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Mar 7 00:55:26.423610 containerd[2014]: time="2026-03-07T00:55:26.423379463Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Mar 7 00:55:26.423610 containerd[2014]: time="2026-03-07T00:55:26.423409751Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." 
type=io.containerd.snapshotter.v1 Mar 7 00:55:26.424265 containerd[2014]: time="2026-03-07T00:55:26.423719219Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 7 00:55:26.424265 containerd[2014]: time="2026-03-07T00:55:26.423768563Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Mar 7 00:55:26.424265 containerd[2014]: time="2026-03-07T00:55:26.423803015Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Mar 7 00:55:26.424265 containerd[2014]: time="2026-03-07T00:55:26.423828347Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Mar 7 00:55:26.424265 containerd[2014]: time="2026-03-07T00:55:26.424016819Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Mar 7 00:55:26.436213 containerd[2014]: time="2026-03-07T00:55:26.435348347Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Mar 7 00:55:26.438310 polkitd[2143]: Loading rules from directory /etc/polkit-1/rules.d Mar 7 00:55:26.438427 polkitd[2143]: Loading rules from directory /usr/share/polkit-1/rules.d Mar 7 00:55:26.438993 containerd[2014]: time="2026-03-07T00:55:26.437890331Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." 
error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 7 00:55:26.440083 containerd[2014]: time="2026-03-07T00:55:26.439590971Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Mar 7 00:55:26.446201 containerd[2014]: time="2026-03-07T00:55:26.445520975Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Mar 7 00:55:26.446201 containerd[2014]: time="2026-03-07T00:55:26.445708571Z" level=info msg="metadata content store policy set" policy=shared Mar 7 00:55:26.448717 polkitd[2143]: Finished loading, compiling and executing 2 rules Mar 7 00:55:26.453525 dbus-daemon[1992]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Mar 7 00:55:26.454427 systemd[1]: Started polkit.service - Authorization Manager. Mar 7 00:55:26.458866 polkitd[2143]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Mar 7 00:55:26.465310 containerd[2014]: time="2026-03-07T00:55:26.465240731Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Mar 7 00:55:26.465458 containerd[2014]: time="2026-03-07T00:55:26.465359819Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Mar 7 00:55:26.465511 containerd[2014]: time="2026-03-07T00:55:26.465478439Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Mar 7 00:55:26.465559 containerd[2014]: time="2026-03-07T00:55:26.465518459Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Mar 7 00:55:26.465607 containerd[2014]: time="2026-03-07T00:55:26.465559091Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." 
type=io.containerd.runtime.v1 Mar 7 00:55:26.466111 containerd[2014]: time="2026-03-07T00:55:26.465824699Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Mar 7 00:55:26.478522 containerd[2014]: time="2026-03-07T00:55:26.478341083Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Mar 7 00:55:26.478683 containerd[2014]: time="2026-03-07T00:55:26.478636163Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Mar 7 00:55:26.478742 containerd[2014]: time="2026-03-07T00:55:26.478687475Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Mar 7 00:55:26.478742 containerd[2014]: time="2026-03-07T00:55:26.478721027Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Mar 7 00:55:26.478828 containerd[2014]: time="2026-03-07T00:55:26.478753187Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Mar 7 00:55:26.478828 containerd[2014]: time="2026-03-07T00:55:26.478791527Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Mar 7 00:55:26.478913 containerd[2014]: time="2026-03-07T00:55:26.478822019Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Mar 7 00:55:26.478913 containerd[2014]: time="2026-03-07T00:55:26.478856411Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Mar 7 00:55:26.478913 containerd[2014]: time="2026-03-07T00:55:26.478890071Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." 
type=io.containerd.service.v1 Mar 7 00:55:26.479046 containerd[2014]: time="2026-03-07T00:55:26.478919975Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Mar 7 00:55:26.479046 containerd[2014]: time="2026-03-07T00:55:26.478949927Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Mar 7 00:55:26.479046 containerd[2014]: time="2026-03-07T00:55:26.478977419Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Mar 7 00:55:26.479046 containerd[2014]: time="2026-03-07T00:55:26.479017919Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Mar 7 00:55:26.480307 containerd[2014]: time="2026-03-07T00:55:26.479049587Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Mar 7 00:55:26.480307 containerd[2014]: time="2026-03-07T00:55:26.479079755Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Mar 7 00:55:26.480307 containerd[2014]: time="2026-03-07T00:55:26.479111159Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Mar 7 00:55:26.483797 containerd[2014]: time="2026-03-07T00:55:26.479153291Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Mar 7 00:55:26.483797 containerd[2014]: time="2026-03-07T00:55:26.483377543Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Mar 7 00:55:26.483797 containerd[2014]: time="2026-03-07T00:55:26.483417779Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Mar 7 00:55:26.483797 containerd[2014]: time="2026-03-07T00:55:26.483455687Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." 
type=io.containerd.grpc.v1 Mar 7 00:55:26.483797 containerd[2014]: time="2026-03-07T00:55:26.483488003Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Mar 7 00:55:26.483797 containerd[2014]: time="2026-03-07T00:55:26.483526259Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Mar 7 00:55:26.483797 containerd[2014]: time="2026-03-07T00:55:26.483559595Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Mar 7 00:55:26.483797 containerd[2014]: time="2026-03-07T00:55:26.483592643Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Mar 7 00:55:26.483797 containerd[2014]: time="2026-03-07T00:55:26.483628499Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Mar 7 00:55:26.483797 containerd[2014]: time="2026-03-07T00:55:26.483666719Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Mar 7 00:55:26.483797 containerd[2014]: time="2026-03-07T00:55:26.483714119Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Mar 7 00:55:26.483797 containerd[2014]: time="2026-03-07T00:55:26.483744035Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Mar 7 00:55:26.483797 containerd[2014]: time="2026-03-07T00:55:26.483770975Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Mar 7 00:55:26.502148 containerd[2014]: time="2026-03-07T00:55:26.501318647Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Mar 7 00:55:26.502148 containerd[2014]: time="2026-03-07T00:55:26.501401003Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." 
error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Mar 7 00:55:26.502148 containerd[2014]: time="2026-03-07T00:55:26.501435395Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Mar 7 00:55:26.502148 containerd[2014]: time="2026-03-07T00:55:26.501466199Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Mar 7 00:55:26.502148 containerd[2014]: time="2026-03-07T00:55:26.501491819Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Mar 7 00:55:26.502148 containerd[2014]: time="2026-03-07T00:55:26.501524267Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Mar 7 00:55:26.502148 containerd[2014]: time="2026-03-07T00:55:26.501550151Z" level=info msg="NRI interface is disabled by configuration." Mar 7 00:55:26.502148 containerd[2014]: time="2026-03-07T00:55:26.501575243Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Mar 7 00:55:26.506314 systemd-networkd[1864]: eth0: Gained IPv6LL Mar 7 00:55:26.512654 containerd[2014]: time="2026-03-07T00:55:26.512488643Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false 
RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Mar 7 00:55:26.512654 containerd[2014]: time="2026-03-07T00:55:26.512647871Z" level=info msg="Connect containerd service" Mar 7 00:55:26.512998 containerd[2014]: time="2026-03-07T00:55:26.512719535Z" level=info msg="using legacy CRI server" Mar 7 00:55:26.512998 containerd[2014]: time="2026-03-07T00:55:26.512738891Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 7 00:55:26.512998 containerd[2014]: time="2026-03-07T00:55:26.512945807Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Mar 7 00:55:26.519802 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 7 00:55:26.525681 systemd[1]: Reached target network-online.target - Network is Online. Mar 7 00:55:26.545416 systemd[1]: Started amazon-ssm-agent.service - amazon-ssm-agent. 
Mar 7 00:55:26.553284 containerd[2014]: time="2026-03-07T00:55:26.552415559Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 7 00:55:26.554801 containerd[2014]: time="2026-03-07T00:55:26.553351595Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Mar 7 00:55:26.554801 containerd[2014]: time="2026-03-07T00:55:26.553480523Z" level=info msg=serving... address=/run/containerd/containerd.sock
Mar 7 00:55:26.554801 containerd[2014]: time="2026-03-07T00:55:26.553689227Z" level=info msg="Start subscribing containerd event"
Mar 7 00:55:26.554801 containerd[2014]: time="2026-03-07T00:55:26.553755971Z" level=info msg="Start recovering state"
Mar 7 00:55:26.554801 containerd[2014]: time="2026-03-07T00:55:26.553884263Z" level=info msg="Start event monitor"
Mar 7 00:55:26.554801 containerd[2014]: time="2026-03-07T00:55:26.553908095Z" level=info msg="Start snapshots syncer"
Mar 7 00:55:26.554801 containerd[2014]: time="2026-03-07T00:55:26.553929971Z" level=info msg="Start cni network conf syncer for default"
Mar 7 00:55:26.554801 containerd[2014]: time="2026-03-07T00:55:26.553949627Z" level=info msg="Start streaming server"
Mar 7 00:55:26.558962 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 00:55:26.567779 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 7 00:55:26.584551 systemd[1]: Started containerd.service - containerd container runtime.
Mar 7 00:55:26.589364 containerd[2014]: time="2026-03-07T00:55:26.585512555Z" level=info msg="containerd successfully booted in 0.310155s"
Mar 7 00:55:26.591508 update-ssh-keys[2172]: Updated "/home/core/.ssh/authorized_keys"
Mar 7 00:55:26.604795 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Mar 7 00:55:26.620349 systemd[1]: Finished sshkeys.service.
Mar 7 00:55:26.627802 systemd-hostnamed[2039]: Hostname set to (transient)
Mar 7 00:55:26.631474 systemd-resolved[1819]: System hostname changed to 'ip-172-31-19-200'.
Mar 7 00:55:26.700582 locksmithd[2047]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 7 00:55:26.779035 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 7 00:55:26.790194 amazon-ssm-agent[2183]: Initializing new seelog logger
Mar 7 00:55:26.790194 amazon-ssm-agent[2183]: New Seelog Logger Creation Complete
Mar 7 00:55:26.790194 amazon-ssm-agent[2183]: 2026/03/07 00:55:26 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json.
Mar 7 00:55:26.790194 amazon-ssm-agent[2183]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json.
Mar 7 00:55:26.790194 amazon-ssm-agent[2183]: 2026/03/07 00:55:26 processing appconfig overrides
Mar 7 00:55:26.791822 amazon-ssm-agent[2183]: 2026/03/07 00:55:26 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json.
Mar 7 00:55:26.792233 amazon-ssm-agent[2183]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json.
Mar 7 00:55:26.794193 amazon-ssm-agent[2183]: 2026/03/07 00:55:26 processing appconfig overrides
Mar 7 00:55:26.794193 amazon-ssm-agent[2183]: 2026/03/07 00:55:26 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json.
Mar 7 00:55:26.794193 amazon-ssm-agent[2183]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json.
Mar 7 00:55:26.794193 amazon-ssm-agent[2183]: 2026/03/07 00:55:26 processing appconfig overrides
Mar 7 00:55:26.795494 amazon-ssm-agent[2183]: 2026-03-07 00:55:26 INFO Proxy environment variables:
Mar 7 00:55:26.801289 amazon-ssm-agent[2183]: 2026/03/07 00:55:26 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json.
Mar 7 00:55:26.801289 amazon-ssm-agent[2183]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json.
Mar 7 00:55:26.801289 amazon-ssm-agent[2183]: 2026/03/07 00:55:26 processing appconfig overrides
Mar 7 00:55:26.898045 amazon-ssm-agent[2183]: 2026-03-07 00:55:26 INFO https_proxy:
Mar 7 00:55:27.002191 amazon-ssm-agent[2183]: 2026-03-07 00:55:26 INFO http_proxy:
Mar 7 00:55:27.111259 amazon-ssm-agent[2183]: 2026-03-07 00:55:26 INFO no_proxy:
Mar 7 00:55:27.208382 amazon-ssm-agent[2183]: 2026-03-07 00:55:26 INFO Checking if agent identity type OnPrem can be assumed
Mar 7 00:55:27.306592 amazon-ssm-agent[2183]: 2026-03-07 00:55:26 INFO Checking if agent identity type EC2 can be assumed
Mar 7 00:55:27.405983 amazon-ssm-agent[2183]: 2026-03-07 00:55:26 INFO Agent will take identity from EC2
Mar 7 00:55:27.505951 amazon-ssm-agent[2183]: 2026-03-07 00:55:26 INFO [amazon-ssm-agent] using named pipe channel for IPC
Mar 7 00:55:27.604718 amazon-ssm-agent[2183]: 2026-03-07 00:55:26 INFO [amazon-ssm-agent] using named pipe channel for IPC
Mar 7 00:55:27.704486 amazon-ssm-agent[2183]: 2026-03-07 00:55:26 INFO [amazon-ssm-agent] using named pipe channel for IPC
Mar 7 00:55:27.770354 tar[2018]: linux-arm64/README.md
Mar 7 00:55:27.804188 amazon-ssm-agent[2183]: 2026-03-07 00:55:26 INFO [amazon-ssm-agent] amazon-ssm-agent - v3.2.0.0
Mar 7 00:55:27.807304 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Mar 7 00:55:27.904341 amazon-ssm-agent[2183]: 2026-03-07 00:55:26 INFO [amazon-ssm-agent] OS: linux, Arch: arm64
Mar 7 00:55:27.915074 amazon-ssm-agent[2183]: 2026-03-07 00:55:26 INFO [amazon-ssm-agent] Starting Core Agent
Mar 7 00:55:27.915074 amazon-ssm-agent[2183]: 2026-03-07 00:55:26 INFO [amazon-ssm-agent] registrar detected. Attempting registration
Mar 7 00:55:27.915074 amazon-ssm-agent[2183]: 2026-03-07 00:55:26 INFO [Registrar] Starting registrar module
Mar 7 00:55:27.915074 amazon-ssm-agent[2183]: 2026-03-07 00:55:26 INFO [EC2Identity] no registration info found for ec2 instance, attempting registration
Mar 7 00:55:27.915074 amazon-ssm-agent[2183]: 2026-03-07 00:55:27 INFO [EC2Identity] EC2 registration was successful.
Mar 7 00:55:27.915074 amazon-ssm-agent[2183]: 2026-03-07 00:55:27 INFO [CredentialRefresher] credentialRefresher has started
Mar 7 00:55:27.915074 amazon-ssm-agent[2183]: 2026-03-07 00:55:27 INFO [CredentialRefresher] Starting credentials refresher loop
Mar 7 00:55:27.915074 amazon-ssm-agent[2183]: 2026-03-07 00:55:27 INFO EC2RoleProvider Successfully connected with instance profile role credentials
Mar 7 00:55:27.966799 sshd_keygen[2037]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 7 00:55:28.003342 amazon-ssm-agent[2183]: 2026-03-07 00:55:27 INFO [CredentialRefresher] Next credential rotation will be in 32.283325103033334 minutes
Mar 7 00:55:28.013949 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 7 00:55:28.028759 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 7 00:55:28.045818 systemd[1]: Started sshd@0-172.31.19.200:22-20.161.92.111:53262.service - OpenSSH per-connection server daemon (20.161.92.111:53262).
Mar 7 00:55:28.062233 systemd[1]: issuegen.service: Deactivated successfully.
Mar 7 00:55:28.062638 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 7 00:55:28.078952 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 7 00:55:28.118222 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 7 00:55:28.131822 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 7 00:55:28.143752 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Mar 7 00:55:28.148588 systemd[1]: Reached target getty.target - Login Prompts.
Mar 7 00:55:28.534276 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 00:55:28.538671 systemd[1]: Reached target multi-user.target - Multi-User System.
Mar 7 00:55:28.544063 systemd[1]: Startup finished in 1.192s (kernel) + 8.746s (initrd) + 9.028s (userspace) = 18.967s.
Mar 7 00:55:28.551725 (kubelet)[2243]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 7 00:55:28.593482 sshd[2229]: Accepted publickey for core from 20.161.92.111 port 53262 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw
Mar 7 00:55:28.596484 sshd[2229]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 00:55:28.615073 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 7 00:55:28.624778 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Mar 7 00:55:28.634052 systemd-logind[2003]: New session 1 of user core.
Mar 7 00:55:28.662871 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Mar 7 00:55:28.675826 systemd[1]: Starting user@500.service - User Manager for UID 500...
Mar 7 00:55:28.696293 (systemd)[2250]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Mar 7 00:55:28.935422 systemd[2250]: Queued start job for default target default.target.
Mar 7 00:55:28.945957 systemd[2250]: Created slice app.slice - User Application Slice.
Mar 7 00:55:28.946023 systemd[2250]: Reached target paths.target - Paths.
Mar 7 00:55:28.946056 systemd[2250]: Reached target timers.target - Timers.
Mar 7 00:55:28.954579 systemd[2250]: Starting dbus.socket - D-Bus User Message Bus Socket...
Mar 7 00:55:28.975945 systemd[2250]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Mar 7 00:55:28.978233 amazon-ssm-agent[2183]: 2026-03-07 00:55:28 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker is not running, starting worker process
Mar 7 00:55:28.977403 systemd[2250]: Reached target sockets.target - Sockets.
Mar 7 00:55:28.977441 systemd[2250]: Reached target basic.target - Basic System.
Mar 7 00:55:28.977543 systemd[2250]: Reached target default.target - Main User Target.
Mar 7 00:55:28.977612 systemd[2250]: Startup finished in 269ms.
Mar 7 00:55:28.977731 systemd[1]: Started user@500.service - User Manager for UID 500.
Mar 7 00:55:28.984515 systemd[1]: Started session-1.scope - Session 1 of User core.
Mar 7 00:55:29.078260 amazon-ssm-agent[2183]: 2026-03-07 00:55:28 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker (pid:2262) started
Mar 7 00:55:29.181331 amazon-ssm-agent[2183]: 2026-03-07 00:55:28 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] Monitor long running worker health every 60 seconds
Mar 7 00:55:29.375383 systemd[1]: Started sshd@1-172.31.19.200:22-20.161.92.111:53274.service - OpenSSH per-connection server daemon (20.161.92.111:53274).
Mar 7 00:55:29.412472 ntpd[1996]: Listen normally on 7 eth0 [fe80::4d7:31ff:fef3:94f%2]:123
Mar 7 00:55:29.412996 ntpd[1996]: 7 Mar 00:55:29 ntpd[1996]: Listen normally on 7 eth0 [fe80::4d7:31ff:fef3:94f%2]:123
Mar 7 00:55:29.601084 kubelet[2243]: E0307 00:55:29.595779 2243 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 7 00:55:29.605128 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 7 00:55:29.606572 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 7 00:55:29.607127 systemd[1]: kubelet.service: Consumed 1.382s CPU time.
Mar 7 00:55:29.881212 sshd[2276]: Accepted publickey for core from 20.161.92.111 port 53274 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw
Mar 7 00:55:29.882950 sshd[2276]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 00:55:29.892704 systemd-logind[2003]: New session 2 of user core.
Mar 7 00:55:29.903440 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 7 00:55:30.238790 sshd[2276]: pam_unix(sshd:session): session closed for user core
Mar 7 00:55:30.245731 systemd[1]: sshd@1-172.31.19.200:22-20.161.92.111:53274.service: Deactivated successfully.
Mar 7 00:55:30.249137 systemd[1]: session-2.scope: Deactivated successfully.
Mar 7 00:55:30.250739 systemd-logind[2003]: Session 2 logged out. Waiting for processes to exit.
Mar 7 00:55:30.252751 systemd-logind[2003]: Removed session 2.
Mar 7 00:55:30.335686 systemd[1]: Started sshd@2-172.31.19.200:22-20.161.92.111:53290.service - OpenSSH per-connection server daemon (20.161.92.111:53290).
Mar 7 00:55:30.836888 sshd[2285]: Accepted publickey for core from 20.161.92.111 port 53290 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw
Mar 7 00:55:30.838609 sshd[2285]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 00:55:30.848352 systemd-logind[2003]: New session 3 of user core.
Mar 7 00:55:30.856476 systemd[1]: Started session-3.scope - Session 3 of User core.
Mar 7 00:55:31.187255 sshd[2285]: pam_unix(sshd:session): session closed for user core
Mar 7 00:55:31.194903 systemd[1]: sshd@2-172.31.19.200:22-20.161.92.111:53290.service: Deactivated successfully.
Mar 7 00:55:31.198786 systemd[1]: session-3.scope: Deactivated successfully.
Mar 7 00:55:31.200341 systemd-logind[2003]: Session 3 logged out. Waiting for processes to exit.
Mar 7 00:55:31.202032 systemd-logind[2003]: Removed session 3.
Mar 7 00:55:31.283686 systemd[1]: Started sshd@3-172.31.19.200:22-20.161.92.111:39912.service - OpenSSH per-connection server daemon (20.161.92.111:39912).
Mar 7 00:55:31.785214 sshd[2292]: Accepted publickey for core from 20.161.92.111 port 39912 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw
Mar 7 00:55:31.786896 sshd[2292]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 00:55:31.794377 systemd-logind[2003]: New session 4 of user core.
Mar 7 00:55:31.807405 systemd[1]: Started session-4.scope - Session 4 of User core.
Mar 7 00:55:32.146394 sshd[2292]: pam_unix(sshd:session): session closed for user core
Mar 7 00:55:32.154962 systemd[1]: sshd@3-172.31.19.200:22-20.161.92.111:39912.service: Deactivated successfully.
Mar 7 00:55:32.160862 systemd[1]: session-4.scope: Deactivated successfully.
Mar 7 00:55:32.166585 systemd-logind[2003]: Session 4 logged out. Waiting for processes to exit.
Mar 7 00:55:32.170394 systemd-logind[2003]: Removed session 4.
Mar 7 00:55:32.235664 systemd[1]: Started sshd@4-172.31.19.200:22-20.161.92.111:39918.service - OpenSSH per-connection server daemon (20.161.92.111:39918).
Mar 7 00:55:31.999450 systemd-resolved[1819]: Clock change detected. Flushing caches.
Mar 7 00:55:32.011967 systemd-journald[1578]: Time jumped backwards, rotating.
Mar 7 00:55:32.336356 sshd[2299]: Accepted publickey for core from 20.161.92.111 port 39918 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw
Mar 7 00:55:32.338777 sshd[2299]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 00:55:32.346724 systemd-logind[2003]: New session 5 of user core.
Mar 7 00:55:32.353472 systemd[1]: Started session-5.scope - Session 5 of User core.
Mar 7 00:55:32.657320 sudo[2303]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 7 00:55:32.657971 sudo[2303]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 7 00:55:32.675490 sudo[2303]: pam_unix(sudo:session): session closed for user root
Mar 7 00:55:32.755670 sshd[2299]: pam_unix(sshd:session): session closed for user core
Mar 7 00:55:32.762155 systemd[1]: sshd@4-172.31.19.200:22-20.161.92.111:39918.service: Deactivated successfully.
Mar 7 00:55:32.765136 systemd[1]: session-5.scope: Deactivated successfully.
Mar 7 00:55:32.769919 systemd-logind[2003]: Session 5 logged out. Waiting for processes to exit.
Mar 7 00:55:32.772739 systemd-logind[2003]: Removed session 5.
Mar 7 00:55:32.849718 systemd[1]: Started sshd@5-172.31.19.200:22-20.161.92.111:39932.service - OpenSSH per-connection server daemon (20.161.92.111:39932).
Mar 7 00:55:33.352788 sshd[2308]: Accepted publickey for core from 20.161.92.111 port 39932 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw
Mar 7 00:55:33.354534 sshd[2308]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 00:55:33.363311 systemd-logind[2003]: New session 6 of user core.
Mar 7 00:55:33.370460 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 7 00:55:33.632521 sudo[2312]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 7 00:55:33.633153 sudo[2312]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 7 00:55:33.640414 sudo[2312]: pam_unix(sudo:session): session closed for user root
Mar 7 00:55:33.650538 sudo[2311]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules
Mar 7 00:55:33.651741 sudo[2311]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 7 00:55:33.678947 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules...
Mar 7 00:55:33.681683 auditctl[2315]: No rules
Mar 7 00:55:33.682702 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 7 00:55:33.683063 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules.
Mar 7 00:55:33.694904 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Mar 7 00:55:33.738394 augenrules[2333]: No rules
Mar 7 00:55:33.741329 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Mar 7 00:55:33.744047 sudo[2311]: pam_unix(sudo:session): session closed for user root
Mar 7 00:55:33.823331 sshd[2308]: pam_unix(sshd:session): session closed for user core
Mar 7 00:55:33.831063 systemd[1]: sshd@5-172.31.19.200:22-20.161.92.111:39932.service: Deactivated successfully.
Mar 7 00:55:33.836772 systemd[1]: session-6.scope: Deactivated successfully.
Mar 7 00:55:33.838552 systemd-logind[2003]: Session 6 logged out. Waiting for processes to exit.
Mar 7 00:55:33.840550 systemd-logind[2003]: Removed session 6.
Mar 7 00:55:33.919720 systemd[1]: Started sshd@6-172.31.19.200:22-20.161.92.111:39934.service - OpenSSH per-connection server daemon (20.161.92.111:39934).
Mar 7 00:55:34.421099 sshd[2341]: Accepted publickey for core from 20.161.92.111 port 39934 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw
Mar 7 00:55:34.423719 sshd[2341]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 00:55:34.431220 systemd-logind[2003]: New session 7 of user core.
Mar 7 00:55:34.444459 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 7 00:55:34.703552 sudo[2344]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 7 00:55:34.705221 sudo[2344]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 7 00:55:35.275724 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 7 00:55:35.278335 (dockerd)[2359]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 7 00:55:35.694338 dockerd[2359]: time="2026-03-07T00:55:35.693995366Z" level=info msg="Starting up"
Mar 7 00:55:35.831041 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport3764580016-merged.mount: Deactivated successfully.
Mar 7 00:55:35.871487 systemd[1]: var-lib-docker-metacopy\x2dcheck3013516628-merged.mount: Deactivated successfully.
Mar 7 00:55:35.889493 dockerd[2359]: time="2026-03-07T00:55:35.889428879Z" level=info msg="Loading containers: start."
Mar 7 00:55:36.052453 kernel: Initializing XFRM netlink socket
Mar 7 00:55:36.087539 (udev-worker)[2381]: Network interface NamePolicy= disabled on kernel command line.
Mar 7 00:55:36.176810 systemd-networkd[1864]: docker0: Link UP
Mar 7 00:55:36.203736 dockerd[2359]: time="2026-03-07T00:55:36.203653176Z" level=info msg="Loading containers: done."
Mar 7 00:55:36.235228 dockerd[2359]: time="2026-03-07T00:55:36.235081525Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 7 00:55:36.236060 dockerd[2359]: time="2026-03-07T00:55:36.235468273Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0
Mar 7 00:55:36.236060 dockerd[2359]: time="2026-03-07T00:55:36.235693273Z" level=info msg="Daemon has completed initialization"
Mar 7 00:55:36.334583 dockerd[2359]: time="2026-03-07T00:55:36.333210361Z" level=info msg="API listen on /run/docker.sock"
Mar 7 00:55:36.335602 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 7 00:55:37.180147 containerd[2014]: time="2026-03-07T00:55:37.180081145Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.9\""
Mar 7 00:55:37.883411 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4287861109.mount: Deactivated successfully.
Mar 7 00:55:39.424988 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 7 00:55:39.432598 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 00:55:39.442216 containerd[2014]: time="2026-03-07T00:55:39.440499964Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:39.446471 containerd[2014]: time="2026-03-07T00:55:39.444403804Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.9: active requests=0, bytes read=27390174"
Mar 7 00:55:39.451367 containerd[2014]: time="2026-03-07T00:55:39.450283889Z" level=info msg="ImageCreate event name:\"sha256:6dbc3c6e88c8bca1294fa5fafe73dbe01fb58d40e562dbfc8b8b4195940270c8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:39.465784 containerd[2014]: time="2026-03-07T00:55:39.463660817Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:a1fe354f8b36dbce37fef26c3731e2376fb8eb7375e7df3068df7ad11656f022\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:39.468500 containerd[2014]: time="2026-03-07T00:55:39.468416381Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.9\" with image id \"sha256:6dbc3c6e88c8bca1294fa5fafe73dbe01fb58d40e562dbfc8b8b4195940270c8\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:a1fe354f8b36dbce37fef26c3731e2376fb8eb7375e7df3068df7ad11656f022\", size \"27386773\" in 2.288254128s"
Mar 7 00:55:39.468683 containerd[2014]: time="2026-03-07T00:55:39.468541925Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.9\" returns image reference \"sha256:6dbc3c6e88c8bca1294fa5fafe73dbe01fb58d40e562dbfc8b8b4195940270c8\""
Mar 7 00:55:39.470540 containerd[2014]: time="2026-03-07T00:55:39.470473637Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.9\""
Mar 7 00:55:39.822225 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 00:55:39.841043 (kubelet)[2564]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 7 00:55:39.928646 kubelet[2564]: E0307 00:55:39.928555 2564 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 7 00:55:39.936784 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 7 00:55:39.937146 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 7 00:55:41.134473 containerd[2014]: time="2026-03-07T00:55:41.134393561Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:41.136955 containerd[2014]: time="2026-03-07T00:55:41.136659869Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.9: active requests=0, bytes read=23552106"
Mar 7 00:55:41.139242 containerd[2014]: time="2026-03-07T00:55:41.139155353Z" level=info msg="ImageCreate event name:\"sha256:c58be92c40cc41b6c83c361b92110b587104386f93c5b7a9fc66dffdd1523d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:41.150234 containerd[2014]: time="2026-03-07T00:55:41.149786141Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:a495c9f30cfd4d57ae6c27cb21e477b9b1ddebdace61762e80a06fe264a0d61a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:41.152441 containerd[2014]: time="2026-03-07T00:55:41.152387621Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.9\" with image id \"sha256:c58be92c40cc41b6c83c361b92110b587104386f93c5b7a9fc66dffdd1523d17\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:a495c9f30cfd4d57ae6c27cb21e477b9b1ddebdace61762e80a06fe264a0d61a\", size \"25136510\" in 1.681845956s"
Mar 7 00:55:41.152616 containerd[2014]: time="2026-03-07T00:55:41.152583809Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.9\" returns image reference \"sha256:c58be92c40cc41b6c83c361b92110b587104386f93c5b7a9fc66dffdd1523d17\""
Mar 7 00:55:41.153577 containerd[2014]: time="2026-03-07T00:55:41.153413081Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.9\""
Mar 7 00:55:42.581133 containerd[2014]: time="2026-03-07T00:55:42.581048216Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:42.583694 containerd[2014]: time="2026-03-07T00:55:42.583397612Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.9: active requests=0, bytes read=18301305"
Mar 7 00:55:42.586501 containerd[2014]: time="2026-03-07T00:55:42.586044836Z" level=info msg="ImageCreate event name:\"sha256:5dcd4a0c93d95bd92241ba240a130ffbde67814e2b417a13c25738a7b0204e95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:42.592969 containerd[2014]: time="2026-03-07T00:55:42.592886816Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:d1533368d3acd772e3d11225337a61be319b5ecf7523adeff7ebfe4107ab05b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:42.595311 containerd[2014]: time="2026-03-07T00:55:42.595262588Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.9\" with image id \"sha256:5dcd4a0c93d95bd92241ba240a130ffbde67814e2b417a13c25738a7b0204e95\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:d1533368d3acd772e3d11225337a61be319b5ecf7523adeff7ebfe4107ab05b5\", size \"19885727\" in 1.441789171s"
Mar 7 00:55:42.595846 containerd[2014]: time="2026-03-07T00:55:42.595450268Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.9\" returns image reference \"sha256:5dcd4a0c93d95bd92241ba240a130ffbde67814e2b417a13c25738a7b0204e95\""
Mar 7 00:55:42.596801 containerd[2014]: time="2026-03-07T00:55:42.596481356Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.9\""
Mar 7 00:55:43.918113 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3603589491.mount: Deactivated successfully.
Mar 7 00:55:44.554813 containerd[2014]: time="2026-03-07T00:55:44.553467238Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:44.555610 containerd[2014]: time="2026-03-07T00:55:44.555555658Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.9: active requests=0, bytes read=28148870"
Mar 7 00:55:44.558331 containerd[2014]: time="2026-03-07T00:55:44.558255826Z" level=info msg="ImageCreate event name:\"sha256:fb4f3cb8cccaec5975890c2ee802236a557e3f108da9c3c66ebec335ac73dcc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:44.563386 containerd[2014]: time="2026-03-07T00:55:44.563312938Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:079ba0e77e457dbf755e78bf3a6d736b7eb73d021fe53b853a0b82bbb2c17322\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:44.565135 containerd[2014]: time="2026-03-07T00:55:44.564916042Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.9\" with image id \"sha256:fb4f3cb8cccaec5975890c2ee802236a557e3f108da9c3c66ebec335ac73dcc9\", repo tag \"registry.k8s.io/kube-proxy:v1.33.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:079ba0e77e457dbf755e78bf3a6d736b7eb73d021fe53b853a0b82bbb2c17322\", size \"28147889\" in 1.968345106s"
Mar 7 00:55:44.565135 containerd[2014]: time="2026-03-07T00:55:44.564969250Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.9\" returns image reference \"sha256:fb4f3cb8cccaec5975890c2ee802236a557e3f108da9c3c66ebec335ac73dcc9\""
Mar 7 00:55:44.566246 containerd[2014]: time="2026-03-07T00:55:44.565676314Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\""
Mar 7 00:55:45.163733 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1808676753.mount: Deactivated successfully.
Mar 7 00:55:46.511223 containerd[2014]: time="2026-03-07T00:55:46.509586384Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:46.514790 containerd[2014]: time="2026-03-07T00:55:46.514719396Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152117"
Mar 7 00:55:46.516982 containerd[2014]: time="2026-03-07T00:55:46.516935124Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:46.527455 containerd[2014]: time="2026-03-07T00:55:46.527400696Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:46.531737 containerd[2014]: time="2026-03-07T00:55:46.531683424Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.965960322s"
Mar 7 00:55:46.532473 containerd[2014]: time="2026-03-07T00:55:46.532439028Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\""
Mar 7 00:55:46.533687 containerd[2014]: time="2026-03-07T00:55:46.533553876Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Mar 7 00:55:47.054620 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2849712237.mount: Deactivated successfully.
Mar 7 00:55:47.068523 containerd[2014]: time="2026-03-07T00:55:47.068442946Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:47.070687 containerd[2014]: time="2026-03-07T00:55:47.070346266Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703"
Mar 7 00:55:47.074138 containerd[2014]: time="2026-03-07T00:55:47.072830410Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:47.078686 containerd[2014]: time="2026-03-07T00:55:47.078001174Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:47.080240 containerd[2014]: time="2026-03-07T00:55:47.079754146Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 545.891906ms"
Mar 7 00:55:47.080240 containerd[2014]: time="2026-03-07T00:55:47.079814758Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Mar 7 00:55:47.081677 containerd[2014]: time="2026-03-07T00:55:47.081408706Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\""
Mar 7 00:55:47.671894 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount137616306.mount: Deactivated successfully.
Mar 7 00:55:49.172287 containerd[2014]: time="2026-03-07T00:55:49.172179205Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.24-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:49.174619 containerd[2014]: time="2026-03-07T00:55:49.174509425Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.24-0: active requests=0, bytes read=21885780"
Mar 7 00:55:49.177907 containerd[2014]: time="2026-03-07T00:55:49.176713801Z" level=info msg="ImageCreate event name:\"sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:49.185357 containerd[2014]: time="2026-03-07T00:55:49.185276521Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:55:49.188231 containerd[2014]: time="2026-03-07T00:55:49.187781197Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.24-0\" with image id \"sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca\", repo tag \"registry.k8s.io/etcd:3.5.24-0\", repo digest \"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\", size \"21882972\" in 2.106315155s"
Mar 7 00:55:49.188231 containerd[2014]: time="2026-03-07T00:55:49.187843993Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\" returns image reference \"sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca\""
Mar 7 00:55:50.155244 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 7 00:55:50.165690 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 00:55:50.542571 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 00:55:50.545818 (kubelet)[2734]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 7 00:55:50.621249 kubelet[2734]: E0307 00:55:50.620429 2734 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 7 00:55:50.627060 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 7 00:55:50.627594 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 7 00:55:55.117047 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 00:55:55.127729 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 00:55:55.185352 systemd[1]: Reloading requested from client PID 2748 ('systemctl') (unit session-7.scope)... Mar 7 00:55:55.185559 systemd[1]: Reloading... Mar 7 00:55:55.438259 zram_generator::config[2792]: No configuration found. Mar 7 00:55:55.684070 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 7 00:55:55.864572 systemd[1]: Reloading finished in 678 ms. Mar 7 00:55:55.954646 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Mar 7 00:55:55.955020 systemd[1]: kubelet.service: Failed with result 'signal'. Mar 7 00:55:55.955730 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 7 00:55:55.966875 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 00:55:56.252262 systemd[1]: systemd-hostnamed.service: Deactivated successfully. Mar 7 00:55:56.345520 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 00:55:56.350740 (kubelet)[2855]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 7 00:55:56.428071 kubelet[2855]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 7 00:55:56.428071 kubelet[2855]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 7 00:55:56.428071 kubelet[2855]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 7 00:55:56.428650 kubelet[2855]: I0307 00:55:56.428144 2855 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 7 00:55:57.814223 kubelet[2855]: I0307 00:55:57.812400 2855 server.go:530] "Kubelet version" kubeletVersion="v1.33.8" Mar 7 00:55:57.814223 kubelet[2855]: I0307 00:55:57.812443 2855 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 7 00:55:57.814223 kubelet[2855]: I0307 00:55:57.812878 2855 server.go:956] "Client rotation is on, will bootstrap in background" Mar 7 00:55:57.860518 kubelet[2855]: E0307 00:55:57.860446 2855 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://172.31.19.200:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 172.31.19.200:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Mar 7 00:55:57.861862 kubelet[2855]: I0307 00:55:57.861828 2855 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 7 00:55:57.875469 kubelet[2855]: E0307 00:55:57.875416 2855 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Mar 7 00:55:57.875708 kubelet[2855]: I0307 00:55:57.875686 2855 server.go:1423] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Mar 7 00:55:57.882097 kubelet[2855]: I0307 00:55:57.882058 2855 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 7 00:55:57.884651 kubelet[2855]: I0307 00:55:57.884574 2855 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 7 00:55:57.885151 kubelet[2855]: I0307 00:55:57.884838 2855 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-19-200","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 7 00:55:57.885466 kubelet[2855]: I0307 00:55:57.885443 2855 topology_manager.go:138] "Creating topology manager with none policy" Mar 7 
00:55:57.885568 kubelet[2855]: I0307 00:55:57.885550 2855 container_manager_linux.go:303] "Creating device plugin manager" Mar 7 00:55:57.886021 kubelet[2855]: I0307 00:55:57.885999 2855 state_mem.go:36] "Initialized new in-memory state store" Mar 7 00:55:57.892100 kubelet[2855]: I0307 00:55:57.892056 2855 kubelet.go:480] "Attempting to sync node with API server" Mar 7 00:55:57.892310 kubelet[2855]: I0307 00:55:57.892288 2855 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 7 00:55:57.892453 kubelet[2855]: I0307 00:55:57.892435 2855 kubelet.go:386] "Adding apiserver pod source" Mar 7 00:55:57.892571 kubelet[2855]: I0307 00:55:57.892553 2855 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 7 00:55:57.899360 kubelet[2855]: E0307 00:55:57.899273 2855 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://172.31.19.200:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-19-200&limit=500&resourceVersion=0\": dial tcp 172.31.19.200:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 7 00:55:57.899571 kubelet[2855]: I0307 00:55:57.899532 2855 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Mar 7 00:55:57.900683 kubelet[2855]: I0307 00:55:57.900639 2855 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 7 00:55:57.900919 kubelet[2855]: W0307 00:55:57.900891 2855 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Mar 7 00:55:57.906881 kubelet[2855]: I0307 00:55:57.906831 2855 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 7 00:55:57.907024 kubelet[2855]: I0307 00:55:57.906906 2855 server.go:1289] "Started kubelet" Mar 7 00:55:57.915833 kubelet[2855]: E0307 00:55:57.914960 2855 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://172.31.19.200:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.31.19.200:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 7 00:55:57.920511 kubelet[2855]: I0307 00:55:57.919949 2855 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 7 00:55:57.923675 kubelet[2855]: E0307 00:55:57.921483 2855 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.31.19.200:6443/api/v1/namespaces/default/events\": dial tcp 172.31.19.200:6443: connect: connection refused" event="&Event{ObjectMeta:{ip-172-31-19-200.189a6918a5e182c4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-172-31-19-200,UID:ip-172-31-19-200,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-172-31-19-200,},FirstTimestamp:2026-03-07 00:55:57.906862788 +0000 UTC m=+1.547404893,LastTimestamp:2026-03-07 00:55:57.906862788 +0000 UTC m=+1.547404893,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-172-31-19-200,}" Mar 7 00:55:57.925910 kubelet[2855]: I0307 00:55:57.925838 2855 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 7 00:55:57.929710 kubelet[2855]: I0307 00:55:57.929608 2855 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 7 00:55:57.930595 kubelet[2855]: 
I0307 00:55:57.930522 2855 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 7 00:55:57.934961 kubelet[2855]: I0307 00:55:57.934887 2855 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 7 00:55:57.941033 kubelet[2855]: I0307 00:55:57.940178 2855 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 7 00:55:57.941033 kubelet[2855]: E0307 00:55:57.940509 2855 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ip-172-31-19-200\" not found" Mar 7 00:55:57.941033 kubelet[2855]: I0307 00:55:57.940860 2855 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Mar 7 00:55:57.941033 kubelet[2855]: I0307 00:55:57.940935 2855 reconciler.go:26] "Reconciler: start to sync state" Mar 7 00:55:57.941731 kubelet[2855]: E0307 00:55:57.941678 2855 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://172.31.19.200:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.31.19.200:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 7 00:55:57.941978 kubelet[2855]: E0307 00:55:57.941888 2855 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.19.200:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-19-200?timeout=10s\": dial tcp 172.31.19.200:6443: connect: connection refused" interval="200ms" Mar 7 00:55:57.943220 kubelet[2855]: I0307 00:55:57.943139 2855 server.go:317] "Adding debug handlers to kubelet server" Mar 7 00:55:57.945395 kubelet[2855]: E0307 00:55:57.945324 2855 kubelet.go:1600] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 7 00:55:57.946625 kubelet[2855]: I0307 00:55:57.946563 2855 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 7 00:55:57.951241 kubelet[2855]: I0307 00:55:57.949898 2855 factory.go:223] Registration of the containerd container factory successfully Mar 7 00:55:57.951241 kubelet[2855]: I0307 00:55:57.949936 2855 factory.go:223] Registration of the systemd container factory successfully Mar 7 00:55:57.970897 kubelet[2855]: I0307 00:55:57.970836 2855 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Mar 7 00:55:57.973441 kubelet[2855]: I0307 00:55:57.973400 2855 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Mar 7 00:55:57.973635 kubelet[2855]: I0307 00:55:57.973608 2855 status_manager.go:230] "Starting to sync pod status with apiserver" Mar 7 00:55:57.973782 kubelet[2855]: I0307 00:55:57.973760 2855 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 7 00:55:57.973879 kubelet[2855]: I0307 00:55:57.973861 2855 kubelet.go:2436] "Starting kubelet main sync loop" Mar 7 00:55:57.974048 kubelet[2855]: E0307 00:55:57.974016 2855 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 7 00:55:57.982302 kubelet[2855]: E0307 00:55:57.982172 2855 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://172.31.19.200:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.31.19.200:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 7 00:55:57.996424 kubelet[2855]: I0307 00:55:57.996385 2855 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 7 00:55:57.996424 kubelet[2855]: I0307 00:55:57.996417 2855 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 7 00:55:57.996621 kubelet[2855]: I0307 00:55:57.996451 2855 state_mem.go:36] "Initialized new in-memory state store" Mar 7 00:55:58.001438 kubelet[2855]: I0307 00:55:58.001382 2855 policy_none.go:49] "None policy: Start" Mar 7 00:55:58.001438 kubelet[2855]: I0307 00:55:58.001427 2855 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 7 00:55:58.001619 kubelet[2855]: I0307 00:55:58.001453 2855 state_mem.go:35] "Initializing new in-memory state store" Mar 7 00:55:58.014256 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 7 00:55:58.032813 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 7 00:55:58.041171 kubelet[2855]: E0307 00:55:58.040578 2855 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ip-172-31-19-200\" not found" Mar 7 00:55:58.041217 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Mar 7 00:55:58.054134 kubelet[2855]: E0307 00:55:58.054090 2855 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 7 00:55:58.054645 kubelet[2855]: I0307 00:55:58.054619 2855 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 7 00:55:58.055322 kubelet[2855]: I0307 00:55:58.055061 2855 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 7 00:55:58.055829 kubelet[2855]: I0307 00:55:58.055797 2855 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 7 00:55:58.060299 kubelet[2855]: E0307 00:55:58.059718 2855 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 7 00:55:58.060299 kubelet[2855]: E0307 00:55:58.059782 2855 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-172-31-19-200\" not found" Mar 7 00:55:58.097763 systemd[1]: Created slice kubepods-burstable-podc346fef0e81c2e206c5f68e4330d8deb.slice - libcontainer container kubepods-burstable-podc346fef0e81c2e206c5f68e4330d8deb.slice. Mar 7 00:55:58.108423 kubelet[2855]: E0307 00:55:58.107906 2855 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-19-200\" not found" node="ip-172-31-19-200" Mar 7 00:55:58.113955 systemd[1]: Created slice kubepods-burstable-pod9b398926f378c9312e5b48b6db7d941d.slice - libcontainer container kubepods-burstable-pod9b398926f378c9312e5b48b6db7d941d.slice. 
Mar 7 00:55:58.119798 kubelet[2855]: E0307 00:55:58.119463 2855 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-19-200\" not found" node="ip-172-31-19-200" Mar 7 00:55:58.123048 systemd[1]: Created slice kubepods-burstable-pod24fd06b56c5757f9336606e95f43b576.slice - libcontainer container kubepods-burstable-pod24fd06b56c5757f9336606e95f43b576.slice. Mar 7 00:55:58.127255 kubelet[2855]: E0307 00:55:58.127038 2855 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-19-200\" not found" node="ip-172-31-19-200" Mar 7 00:55:58.142495 kubelet[2855]: E0307 00:55:58.142441 2855 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.19.200:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-19-200?timeout=10s\": dial tcp 172.31.19.200:6443: connect: connection refused" interval="400ms" Mar 7 00:55:58.159000 kubelet[2855]: I0307 00:55:58.158873 2855 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-19-200" Mar 7 00:55:58.159795 kubelet[2855]: E0307 00:55:58.159721 2855 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://172.31.19.200:6443/api/v1/nodes\": dial tcp 172.31.19.200:6443: connect: connection refused" node="ip-172-31-19-200" Mar 7 00:55:58.242252 kubelet[2855]: I0307 00:55:58.242125 2855 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c346fef0e81c2e206c5f68e4330d8deb-k8s-certs\") pod \"kube-apiserver-ip-172-31-19-200\" (UID: \"c346fef0e81c2e206c5f68e4330d8deb\") " pod="kube-system/kube-apiserver-ip-172-31-19-200" Mar 7 00:55:58.242252 kubelet[2855]: I0307 00:55:58.242217 2855 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: 
\"kubernetes.io/host-path/9b398926f378c9312e5b48b6db7d941d-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-19-200\" (UID: \"9b398926f378c9312e5b48b6db7d941d\") " pod="kube-system/kube-controller-manager-ip-172-31-19-200" Mar 7 00:55:58.242426 kubelet[2855]: I0307 00:55:58.242263 2855 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9b398926f378c9312e5b48b6db7d941d-k8s-certs\") pod \"kube-controller-manager-ip-172-31-19-200\" (UID: \"9b398926f378c9312e5b48b6db7d941d\") " pod="kube-system/kube-controller-manager-ip-172-31-19-200" Mar 7 00:55:58.242426 kubelet[2855]: I0307 00:55:58.242300 2855 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/9b398926f378c9312e5b48b6db7d941d-kubeconfig\") pod \"kube-controller-manager-ip-172-31-19-200\" (UID: \"9b398926f378c9312e5b48b6db7d941d\") " pod="kube-system/kube-controller-manager-ip-172-31-19-200" Mar 7 00:55:58.242426 kubelet[2855]: I0307 00:55:58.242339 2855 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9b398926f378c9312e5b48b6db7d941d-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-19-200\" (UID: \"9b398926f378c9312e5b48b6db7d941d\") " pod="kube-system/kube-controller-manager-ip-172-31-19-200" Mar 7 00:55:58.242426 kubelet[2855]: I0307 00:55:58.242376 2855 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/24fd06b56c5757f9336606e95f43b576-kubeconfig\") pod \"kube-scheduler-ip-172-31-19-200\" (UID: \"24fd06b56c5757f9336606e95f43b576\") " pod="kube-system/kube-scheduler-ip-172-31-19-200" Mar 7 00:55:58.242426 kubelet[2855]: I0307 00:55:58.242409 2855 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c346fef0e81c2e206c5f68e4330d8deb-ca-certs\") pod \"kube-apiserver-ip-172-31-19-200\" (UID: \"c346fef0e81c2e206c5f68e4330d8deb\") " pod="kube-system/kube-apiserver-ip-172-31-19-200" Mar 7 00:55:58.242669 kubelet[2855]: I0307 00:55:58.242447 2855 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c346fef0e81c2e206c5f68e4330d8deb-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-19-200\" (UID: \"c346fef0e81c2e206c5f68e4330d8deb\") " pod="kube-system/kube-apiserver-ip-172-31-19-200" Mar 7 00:55:58.242669 kubelet[2855]: I0307 00:55:58.242485 2855 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9b398926f378c9312e5b48b6db7d941d-ca-certs\") pod \"kube-controller-manager-ip-172-31-19-200\" (UID: \"9b398926f378c9312e5b48b6db7d941d\") " pod="kube-system/kube-controller-manager-ip-172-31-19-200" Mar 7 00:55:58.362308 kubelet[2855]: I0307 00:55:58.362136 2855 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-19-200" Mar 7 00:55:58.362694 kubelet[2855]: E0307 00:55:58.362636 2855 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://172.31.19.200:6443/api/v1/nodes\": dial tcp 172.31.19.200:6443: connect: connection refused" node="ip-172-31-19-200" Mar 7 00:55:58.410074 containerd[2014]: time="2026-03-07T00:55:58.409683275Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-19-200,Uid:c346fef0e81c2e206c5f68e4330d8deb,Namespace:kube-system,Attempt:0,}" Mar 7 00:55:58.421309 containerd[2014]: time="2026-03-07T00:55:58.421036139Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-19-200,Uid:9b398926f378c9312e5b48b6db7d941d,Namespace:kube-system,Attempt:0,}" Mar 7 00:55:58.428738 containerd[2014]: time="2026-03-07T00:55:58.428670107Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-19-200,Uid:24fd06b56c5757f9336606e95f43b576,Namespace:kube-system,Attempt:0,}" Mar 7 00:55:58.543639 kubelet[2855]: E0307 00:55:58.543569 2855 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.19.200:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-19-200?timeout=10s\": dial tcp 172.31.19.200:6443: connect: connection refused" interval="800ms" Mar 7 00:55:58.765466 kubelet[2855]: I0307 00:55:58.765407 2855 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-19-200" Mar 7 00:55:58.765947 kubelet[2855]: E0307 00:55:58.765882 2855 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://172.31.19.200:6443/api/v1/nodes\": dial tcp 172.31.19.200:6443: connect: connection refused" node="ip-172-31-19-200" Mar 7 00:55:58.783893 kubelet[2855]: E0307 00:55:58.783825 2855 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://172.31.19.200:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.31.19.200:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 7 00:55:58.960437 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1849056405.mount: Deactivated successfully. 
Mar 7 00:55:58.971046 containerd[2014]: time="2026-03-07T00:55:58.970967497Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 00:55:58.978128 containerd[2014]: time="2026-03-07T00:55:58.978028310Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269173" Mar 7 00:55:58.979925 containerd[2014]: time="2026-03-07T00:55:58.979831202Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 00:55:58.982974 containerd[2014]: time="2026-03-07T00:55:58.982883222Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 7 00:55:58.985560 containerd[2014]: time="2026-03-07T00:55:58.985397906Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 00:55:58.989229 containerd[2014]: time="2026-03-07T00:55:58.988524266Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 00:55:58.989634 containerd[2014]: time="2026-03-07T00:55:58.989577794Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 7 00:55:58.997903 containerd[2014]: time="2026-03-07T00:55:58.997846934Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 00:55:59.003612 
containerd[2014]: time="2026-03-07T00:55:59.003538522Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 593.748303ms" Mar 7 00:55:59.005737 containerd[2014]: time="2026-03-07T00:55:59.005689222Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 576.911343ms" Mar 7 00:55:59.008950 containerd[2014]: time="2026-03-07T00:55:59.008547562Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 587.403843ms" Mar 7 00:55:59.201102 kubelet[2855]: E0307 00:55:59.201044 2855 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://172.31.19.200:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-19-200&limit=500&resourceVersion=0\": dial tcp 172.31.19.200:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 7 00:55:59.222101 containerd[2014]: time="2026-03-07T00:55:59.221503523Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:55:59.222101 containerd[2014]: time="2026-03-07T00:55:59.221621411Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:55:59.222101 containerd[2014]: time="2026-03-07T00:55:59.221658623Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:55:59.222101 containerd[2014]: time="2026-03-07T00:55:59.221812043Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:55:59.229797 containerd[2014]: time="2026-03-07T00:55:59.229625255Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:55:59.229797 containerd[2014]: time="2026-03-07T00:55:59.229735307Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:55:59.230343 containerd[2014]: time="2026-03-07T00:55:59.230029691Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:55:59.230343 containerd[2014]: time="2026-03-07T00:55:59.230258327Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:55:59.233644 containerd[2014]: time="2026-03-07T00:55:59.233464859Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:55:59.233834 containerd[2014]: time="2026-03-07T00:55:59.233618135Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:55:59.233834 containerd[2014]: time="2026-03-07T00:55:59.233658083Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:55:59.233979 containerd[2014]: time="2026-03-07T00:55:59.233888771Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:55:59.254495 kubelet[2855]: E0307 00:55:59.254360 2855 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://172.31.19.200:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.31.19.200:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 7 00:55:59.280533 systemd[1]: Started cri-containerd-031d882b5d55255424a6c280977affb8306e33e49cac0d4f862d0c29b089670a.scope - libcontainer container 031d882b5d55255424a6c280977affb8306e33e49cac0d4f862d0c29b089670a. Mar 7 00:55:59.292097 systemd[1]: Started cri-containerd-be43b923b4d2ae8c624944b7e6249c44ea07effa0f494364739b347631b2ad29.scope - libcontainer container be43b923b4d2ae8c624944b7e6249c44ea07effa0f494364739b347631b2ad29. Mar 7 00:55:59.324962 systemd[1]: Started cri-containerd-4ed4c945d98a42545f7fd207957d549fec01b452aa30d9e9e6f59c9400cd3e39.scope - libcontainer container 4ed4c945d98a42545f7fd207957d549fec01b452aa30d9e9e6f59c9400cd3e39. 
Mar 7 00:55:59.348362 kubelet[2855]: E0307 00:55:59.348254 2855 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.19.200:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-19-200?timeout=10s\": dial tcp 172.31.19.200:6443: connect: connection refused" interval="1.6s" Mar 7 00:55:59.406305 containerd[2014]: time="2026-03-07T00:55:59.406134084Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-19-200,Uid:c346fef0e81c2e206c5f68e4330d8deb,Namespace:kube-system,Attempt:0,} returns sandbox id \"031d882b5d55255424a6c280977affb8306e33e49cac0d4f862d0c29b089670a\"" Mar 7 00:55:59.408408 kubelet[2855]: E0307 00:55:59.408302 2855 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://172.31.19.200:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.31.19.200:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 7 00:55:59.422551 containerd[2014]: time="2026-03-07T00:55:59.421480284Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-19-200,Uid:9b398926f378c9312e5b48b6db7d941d,Namespace:kube-system,Attempt:0,} returns sandbox id \"be43b923b4d2ae8c624944b7e6249c44ea07effa0f494364739b347631b2ad29\"" Mar 7 00:55:59.428373 containerd[2014]: time="2026-03-07T00:55:59.427910616Z" level=info msg="CreateContainer within sandbox \"031d882b5d55255424a6c280977affb8306e33e49cac0d4f862d0c29b089670a\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 7 00:55:59.434966 containerd[2014]: time="2026-03-07T00:55:59.434901120Z" level=info msg="CreateContainer within sandbox \"be43b923b4d2ae8c624944b7e6249c44ea07effa0f494364739b347631b2ad29\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 7 00:55:59.462283 containerd[2014]: 
time="2026-03-07T00:55:59.461893416Z" level=info msg="CreateContainer within sandbox \"031d882b5d55255424a6c280977affb8306e33e49cac0d4f862d0c29b089670a\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"3e0c28f1365044a6cc66629ff4acffdb339e7392c2087639e64c9062006732ab\"" Mar 7 00:55:59.465129 containerd[2014]: time="2026-03-07T00:55:59.465081612Z" level=info msg="StartContainer for \"3e0c28f1365044a6cc66629ff4acffdb339e7392c2087639e64c9062006732ab\"" Mar 7 00:55:59.471951 containerd[2014]: time="2026-03-07T00:55:59.471781776Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-19-200,Uid:24fd06b56c5757f9336606e95f43b576,Namespace:kube-system,Attempt:0,} returns sandbox id \"4ed4c945d98a42545f7fd207957d549fec01b452aa30d9e9e6f59c9400cd3e39\"" Mar 7 00:55:59.484450 containerd[2014]: time="2026-03-07T00:55:59.484398720Z" level=info msg="CreateContainer within sandbox \"4ed4c945d98a42545f7fd207957d549fec01b452aa30d9e9e6f59c9400cd3e39\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 7 00:55:59.489943 containerd[2014]: time="2026-03-07T00:55:59.489863100Z" level=info msg="CreateContainer within sandbox \"be43b923b4d2ae8c624944b7e6249c44ea07effa0f494364739b347631b2ad29\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"01f965f5e800ed760cd7c084f88ffec9d105e49ca7ac33df9d281c78bc135e09\"" Mar 7 00:55:59.490727 containerd[2014]: time="2026-03-07T00:55:59.490649748Z" level=info msg="StartContainer for \"01f965f5e800ed760cd7c084f88ffec9d105e49ca7ac33df9d281c78bc135e09\"" Mar 7 00:55:59.522033 containerd[2014]: time="2026-03-07T00:55:59.521822712Z" level=info msg="CreateContainer within sandbox \"4ed4c945d98a42545f7fd207957d549fec01b452aa30d9e9e6f59c9400cd3e39\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"ccbc9fa623a6545e82e9bbd1569367a8cc364779606a34a348b2e5982de3ea0e\"" Mar 7 00:55:59.526241 containerd[2014]: 
time="2026-03-07T00:55:59.526051908Z" level=info msg="StartContainer for \"ccbc9fa623a6545e82e9bbd1569367a8cc364779606a34a348b2e5982de3ea0e\"" Mar 7 00:55:59.532346 systemd[1]: Started cri-containerd-3e0c28f1365044a6cc66629ff4acffdb339e7392c2087639e64c9062006732ab.scope - libcontainer container 3e0c28f1365044a6cc66629ff4acffdb339e7392c2087639e64c9062006732ab. Mar 7 00:55:59.569887 kubelet[2855]: I0307 00:55:59.569502 2855 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-19-200" Mar 7 00:55:59.570920 kubelet[2855]: E0307 00:55:59.570768 2855 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://172.31.19.200:6443/api/v1/nodes\": dial tcp 172.31.19.200:6443: connect: connection refused" node="ip-172-31-19-200" Mar 7 00:55:59.579838 systemd[1]: Started cri-containerd-01f965f5e800ed760cd7c084f88ffec9d105e49ca7ac33df9d281c78bc135e09.scope - libcontainer container 01f965f5e800ed760cd7c084f88ffec9d105e49ca7ac33df9d281c78bc135e09. Mar 7 00:55:59.622600 systemd[1]: Started cri-containerd-ccbc9fa623a6545e82e9bbd1569367a8cc364779606a34a348b2e5982de3ea0e.scope - libcontainer container ccbc9fa623a6545e82e9bbd1569367a8cc364779606a34a348b2e5982de3ea0e. 
Mar 7 00:55:59.661989 containerd[2014]: time="2026-03-07T00:55:59.661925413Z" level=info msg="StartContainer for \"3e0c28f1365044a6cc66629ff4acffdb339e7392c2087639e64c9062006732ab\" returns successfully" Mar 7 00:55:59.709817 containerd[2014]: time="2026-03-07T00:55:59.709746397Z" level=info msg="StartContainer for \"01f965f5e800ed760cd7c084f88ffec9d105e49ca7ac33df9d281c78bc135e09\" returns successfully" Mar 7 00:55:59.781641 containerd[2014]: time="2026-03-07T00:55:59.781358149Z" level=info msg="StartContainer for \"ccbc9fa623a6545e82e9bbd1569367a8cc364779606a34a348b2e5982de3ea0e\" returns successfully" Mar 7 00:56:00.010749 kubelet[2855]: E0307 00:56:00.010632 2855 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-19-200\" not found" node="ip-172-31-19-200" Mar 7 00:56:00.018827 kubelet[2855]: E0307 00:56:00.018494 2855 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-19-200\" not found" node="ip-172-31-19-200" Mar 7 00:56:00.022846 kubelet[2855]: E0307 00:56:00.022780 2855 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-19-200\" not found" node="ip-172-31-19-200" Mar 7 00:56:01.025316 kubelet[2855]: E0307 00:56:01.023974 2855 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-19-200\" not found" node="ip-172-31-19-200" Mar 7 00:56:01.025316 kubelet[2855]: E0307 00:56:01.024582 2855 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-19-200\" not found" node="ip-172-31-19-200" Mar 7 00:56:01.173536 kubelet[2855]: I0307 00:56:01.173491 2855 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-19-200" Mar 7 00:56:02.027388 kubelet[2855]: E0307 00:56:02.027037 2855 kubelet.go:3305] "No need to create a 
mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-19-200\" not found" node="ip-172-31-19-200" Mar 7 00:56:03.714354 kubelet[2855]: E0307 00:56:03.714292 2855 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-172-31-19-200\" not found" node="ip-172-31-19-200" Mar 7 00:56:03.799220 kubelet[2855]: I0307 00:56:03.799147 2855 kubelet_node_status.go:78] "Successfully registered node" node="ip-172-31-19-200" Mar 7 00:56:03.799374 kubelet[2855]: E0307 00:56:03.799230 2855 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ip-172-31-19-200\": node \"ip-172-31-19-200\" not found" Mar 7 00:56:03.841220 kubelet[2855]: I0307 00:56:03.840446 2855 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-19-200" Mar 7 00:56:03.913406 kubelet[2855]: I0307 00:56:03.913349 2855 apiserver.go:52] "Watching apiserver" Mar 7 00:56:03.922380 kubelet[2855]: E0307 00:56:03.922312 2855 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ip-172-31-19-200\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ip-172-31-19-200" Mar 7 00:56:03.922380 kubelet[2855]: I0307 00:56:03.922372 2855 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ip-172-31-19-200" Mar 7 00:56:03.944265 kubelet[2855]: E0307 00:56:03.942438 2855 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ip-172-31-19-200\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ip-172-31-19-200" Mar 7 00:56:03.944265 kubelet[2855]: I0307 00:56:03.942611 2855 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ip-172-31-19-200" Mar 7 00:56:03.991491 kubelet[2855]: E0307 00:56:03.991340 2855 kubelet.go:3311] "Failed 
creating a mirror pod" err="pods \"kube-scheduler-ip-172-31-19-200\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ip-172-31-19-200" Mar 7 00:56:04.041342 kubelet[2855]: I0307 00:56:04.041278 2855 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 7 00:56:06.002119 kubelet[2855]: I0307 00:56:06.001895 2855 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ip-172-31-19-200" Mar 7 00:56:06.350314 systemd[1]: Reloading requested from client PID 3147 ('systemctl') (unit session-7.scope)... Mar 7 00:56:06.350340 systemd[1]: Reloading... Mar 7 00:56:06.550298 zram_generator::config[3193]: No configuration found. Mar 7 00:56:06.777680 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 7 00:56:06.995339 systemd[1]: Reloading finished in 644 ms. Mar 7 00:56:07.071543 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 00:56:07.092168 systemd[1]: kubelet.service: Deactivated successfully. Mar 7 00:56:07.092723 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 00:56:07.092817 systemd[1]: kubelet.service: Consumed 2.322s CPU time, 127.0M memory peak, 0B memory swap peak. Mar 7 00:56:07.102908 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 00:56:07.451285 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 00:56:07.472805 (kubelet)[3248]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 7 00:56:07.573080 kubelet[3248]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 7 00:56:07.575371 kubelet[3248]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 7 00:56:07.575371 kubelet[3248]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 7 00:56:07.575371 kubelet[3248]: I0307 00:56:07.573275 3248 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 7 00:56:07.589894 kubelet[3248]: I0307 00:56:07.589838 3248 server.go:530] "Kubelet version" kubeletVersion="v1.33.8" Mar 7 00:56:07.591662 kubelet[3248]: I0307 00:56:07.590090 3248 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 7 00:56:07.591662 kubelet[3248]: I0307 00:56:07.590542 3248 server.go:956] "Client rotation is on, will bootstrap in background" Mar 7 00:56:07.594787 kubelet[3248]: I0307 00:56:07.594748 3248 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Mar 7 00:56:07.599597 kubelet[3248]: I0307 00:56:07.599552 3248 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 7 00:56:07.612296 kubelet[3248]: E0307 00:56:07.612217 3248 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Mar 7 00:56:07.612296 kubelet[3248]: I0307 00:56:07.612291 3248 server.go:1423] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. 
Falling back to using cgroupDriver from kubelet config." Mar 7 00:56:07.620759 kubelet[3248]: I0307 00:56:07.620719 3248 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Mar 7 00:56:07.621486 kubelet[3248]: I0307 00:56:07.621436 3248 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 7 00:56:07.621848 kubelet[3248]: I0307 00:56:07.621604 3248 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-19-200","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","Topology
ManagerPolicyOptions":null,"CgroupVersion":2} Mar 7 00:56:07.622153 kubelet[3248]: I0307 00:56:07.622126 3248 topology_manager.go:138] "Creating topology manager with none policy" Mar 7 00:56:07.622294 kubelet[3248]: I0307 00:56:07.622275 3248 container_manager_linux.go:303] "Creating device plugin manager" Mar 7 00:56:07.622483 kubelet[3248]: I0307 00:56:07.622464 3248 state_mem.go:36] "Initialized new in-memory state store" Mar 7 00:56:07.622895 kubelet[3248]: I0307 00:56:07.622873 3248 kubelet.go:480] "Attempting to sync node with API server" Mar 7 00:56:07.623004 kubelet[3248]: I0307 00:56:07.622985 3248 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 7 00:56:07.623149 kubelet[3248]: I0307 00:56:07.623128 3248 kubelet.go:386] "Adding apiserver pod source" Mar 7 00:56:07.623352 kubelet[3248]: I0307 00:56:07.623330 3248 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 7 00:56:07.626832 kubelet[3248]: I0307 00:56:07.626763 3248 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Mar 7 00:56:07.628677 kubelet[3248]: I0307 00:56:07.628615 3248 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 7 00:56:07.636536 kubelet[3248]: I0307 00:56:07.636494 3248 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 7 00:56:07.640217 kubelet[3248]: I0307 00:56:07.637158 3248 server.go:1289] "Started kubelet" Mar 7 00:56:07.650282 kubelet[3248]: I0307 00:56:07.650243 3248 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 7 00:56:07.659422 kubelet[3248]: I0307 00:56:07.659353 3248 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 7 00:56:07.661115 kubelet[3248]: I0307 00:56:07.661081 3248 server.go:317] "Adding debug handlers to kubelet server" Mar 7 00:56:07.693975 kubelet[3248]: I0307 00:56:07.661298 3248 
ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 7 00:56:07.694943 kubelet[3248]: I0307 00:56:07.694892 3248 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 7 00:56:07.698513 kubelet[3248]: I0307 00:56:07.688280 3248 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 7 00:56:07.698700 kubelet[3248]: I0307 00:56:07.676504 3248 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 7 00:56:07.698792 kubelet[3248]: I0307 00:56:07.688301 3248 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Mar 7 00:56:07.699600 kubelet[3248]: I0307 00:56:07.693903 3248 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Mar 7 00:56:07.705698 kubelet[3248]: I0307 00:56:07.700677 3248 factory.go:223] Registration of the systemd container factory successfully Mar 7 00:56:07.705698 kubelet[3248]: I0307 00:56:07.700837 3248 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 7 00:56:07.705698 kubelet[3248]: E0307 00:56:07.688578 3248 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ip-172-31-19-200\" not found" Mar 7 00:56:07.705698 kubelet[3248]: I0307 00:56:07.702003 3248 reconciler.go:26] "Reconciler: start to sync state" Mar 7 00:56:07.730295 kubelet[3248]: E0307 00:56:07.730251 3248 kubelet.go:1600] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 7 00:56:07.731048 kubelet[3248]: I0307 00:56:07.731010 3248 factory.go:223] Registration of the containerd container factory successfully Mar 7 00:56:07.775186 kubelet[3248]: I0307 00:56:07.775115 3248 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Mar 7 00:56:07.775575 kubelet[3248]: I0307 00:56:07.775506 3248 status_manager.go:230] "Starting to sync pod status with apiserver" Mar 7 00:56:07.775697 kubelet[3248]: I0307 00:56:07.775583 3248 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 7 00:56:07.775697 kubelet[3248]: I0307 00:56:07.775602 3248 kubelet.go:2436] "Starting kubelet main sync loop" Mar 7 00:56:07.775820 kubelet[3248]: E0307 00:56:07.775689 3248 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 7 00:56:07.864932 kubelet[3248]: I0307 00:56:07.864888 3248 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 7 00:56:07.864932 kubelet[3248]: I0307 00:56:07.864920 3248 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 7 00:56:07.865135 kubelet[3248]: I0307 00:56:07.864958 3248 state_mem.go:36] "Initialized new in-memory state store" Mar 7 00:56:07.865226 kubelet[3248]: I0307 00:56:07.865176 3248 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 7 00:56:07.865486 kubelet[3248]: I0307 00:56:07.865216 3248 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 7 00:56:07.865486 kubelet[3248]: I0307 00:56:07.865476 3248 policy_none.go:49] "None policy: Start" Mar 7 00:56:07.865654 kubelet[3248]: I0307 00:56:07.865501 3248 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 7 00:56:07.865654 kubelet[3248]: I0307 00:56:07.865524 3248 state_mem.go:35] "Initializing new in-memory state store" Mar 
7 00:56:07.865767 kubelet[3248]: I0307 00:56:07.865705 3248 state_mem.go:75] "Updated machine memory state" Mar 7 00:56:07.874218 kubelet[3248]: E0307 00:56:07.874149 3248 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 7 00:56:07.874581 kubelet[3248]: I0307 00:56:07.874464 3248 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 7 00:56:07.874581 kubelet[3248]: I0307 00:56:07.874503 3248 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 7 00:56:07.878888 kubelet[3248]: I0307 00:56:07.878838 3248 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 7 00:56:07.883187 kubelet[3248]: I0307 00:56:07.883065 3248 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-19-200" Mar 7 00:56:07.891946 kubelet[3248]: E0307 00:56:07.891902 3248 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Mar 7 00:56:07.895236 kubelet[3248]: I0307 00:56:07.892297 3248 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ip-172-31-19-200" Mar 7 00:56:07.895236 kubelet[3248]: I0307 00:56:07.891972 3248 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ip-172-31-19-200" Mar 7 00:56:07.902878 kubelet[3248]: I0307 00:56:07.902828 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c346fef0e81c2e206c5f68e4330d8deb-ca-certs\") pod \"kube-apiserver-ip-172-31-19-200\" (UID: \"c346fef0e81c2e206c5f68e4330d8deb\") " pod="kube-system/kube-apiserver-ip-172-31-19-200" Mar 7 00:56:07.903108 kubelet[3248]: I0307 00:56:07.903082 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c346fef0e81c2e206c5f68e4330d8deb-k8s-certs\") pod \"kube-apiserver-ip-172-31-19-200\" (UID: \"c346fef0e81c2e206c5f68e4330d8deb\") " pod="kube-system/kube-apiserver-ip-172-31-19-200" Mar 7 00:56:07.903274 kubelet[3248]: I0307 00:56:07.903246 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c346fef0e81c2e206c5f68e4330d8deb-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-19-200\" (UID: \"c346fef0e81c2e206c5f68e4330d8deb\") " pod="kube-system/kube-apiserver-ip-172-31-19-200" Mar 7 00:56:07.903402 kubelet[3248]: I0307 00:56:07.903377 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/9b398926f378c9312e5b48b6db7d941d-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-19-200\" (UID: \"9b398926f378c9312e5b48b6db7d941d\") " 
pod="kube-system/kube-controller-manager-ip-172-31-19-200" Mar 7 00:56:07.903543 kubelet[3248]: I0307 00:56:07.903516 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9b398926f378c9312e5b48b6db7d941d-k8s-certs\") pod \"kube-controller-manager-ip-172-31-19-200\" (UID: \"9b398926f378c9312e5b48b6db7d941d\") " pod="kube-system/kube-controller-manager-ip-172-31-19-200" Mar 7 00:56:07.903814 kubelet[3248]: I0307 00:56:07.903775 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/9b398926f378c9312e5b48b6db7d941d-kubeconfig\") pod \"kube-controller-manager-ip-172-31-19-200\" (UID: \"9b398926f378c9312e5b48b6db7d941d\") " pod="kube-system/kube-controller-manager-ip-172-31-19-200" Mar 7 00:56:07.904162 kubelet[3248]: I0307 00:56:07.904037 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9b398926f378c9312e5b48b6db7d941d-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-19-200\" (UID: \"9b398926f378c9312e5b48b6db7d941d\") " pod="kube-system/kube-controller-manager-ip-172-31-19-200" Mar 7 00:56:07.904580 kubelet[3248]: I0307 00:56:07.904451 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9b398926f378c9312e5b48b6db7d941d-ca-certs\") pod \"kube-controller-manager-ip-172-31-19-200\" (UID: \"9b398926f378c9312e5b48b6db7d941d\") " pod="kube-system/kube-controller-manager-ip-172-31-19-200" Mar 7 00:56:07.906999 kubelet[3248]: I0307 00:56:07.906843 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/24fd06b56c5757f9336606e95f43b576-kubeconfig\") 
pod \"kube-scheduler-ip-172-31-19-200\" (UID: \"24fd06b56c5757f9336606e95f43b576\") " pod="kube-system/kube-scheduler-ip-172-31-19-200" Mar 7 00:56:07.931972 kubelet[3248]: E0307 00:56:07.931796 3248 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ip-172-31-19-200\" already exists" pod="kube-system/kube-controller-manager-ip-172-31-19-200" Mar 7 00:56:07.998462 kubelet[3248]: I0307 00:56:07.995659 3248 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-19-200" Mar 7 00:56:08.016450 kubelet[3248]: I0307 00:56:08.016392 3248 kubelet_node_status.go:124] "Node was previously registered" node="ip-172-31-19-200" Mar 7 00:56:08.017182 kubelet[3248]: I0307 00:56:08.016752 3248 kubelet_node_status.go:78] "Successfully registered node" node="ip-172-31-19-200" Mar 7 00:56:08.626622 kubelet[3248]: I0307 00:56:08.624840 3248 apiserver.go:52] "Watching apiserver" Mar 7 00:56:08.698975 kubelet[3248]: I0307 00:56:08.698873 3248 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 7 00:56:08.791897 kubelet[3248]: I0307 00:56:08.791349 3248 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ip-172-31-19-200" podStartSLOduration=1.791018338 podStartE2EDuration="1.791018338s" podCreationTimestamp="2026-03-07 00:56:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:56:08.772221634 +0000 UTC m=+1.290247771" watchObservedRunningTime="2026-03-07 00:56:08.791018338 +0000 UTC m=+1.309044475" Mar 7 00:56:08.810523 kubelet[3248]: I0307 00:56:08.810103 3248 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ip-172-31-19-200" podStartSLOduration=2.810082714 podStartE2EDuration="2.810082714s" podCreationTimestamp="2026-03-07 00:56:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:56:08.792925726 +0000 UTC m=+1.310951863" watchObservedRunningTime="2026-03-07 00:56:08.810082714 +0000 UTC m=+1.328108839" Mar 7 00:56:08.839148 kubelet[3248]: I0307 00:56:08.838871 3248 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ip-172-31-19-200" podStartSLOduration=1.838826554 podStartE2EDuration="1.838826554s" podCreationTimestamp="2026-03-07 00:56:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:56:08.81273277 +0000 UTC m=+1.330758919" watchObservedRunningTime="2026-03-07 00:56:08.838826554 +0000 UTC m=+1.356852691" Mar 7 00:56:10.165775 update_engine[2005]: I20260307 00:56:10.165413 2005 update_attempter.cc:509] Updating boot flags... Mar 7 00:56:10.266239 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 36 scanned by (udev-worker) (3311) Mar 7 00:56:10.642236 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 36 scanned by (udev-worker) (3313) Mar 7 00:56:10.694124 kubelet[3248]: I0307 00:56:10.694071 3248 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 7 00:56:10.696179 kubelet[3248]: I0307 00:56:10.695359 3248 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 7 00:56:10.696421 containerd[2014]: time="2026-03-07T00:56:10.694723020Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 7 00:56:11.673225 systemd[1]: Created slice kubepods-besteffort-pod6f1bd94c_d2fa_4897_a2fa_47c9691829e1.slice - libcontainer container kubepods-besteffort-pod6f1bd94c_d2fa_4897_a2fa_47c9691829e1.slice. 
Mar 7 00:56:11.741417 kubelet[3248]: I0307 00:56:11.741088 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/6f1bd94c-d2fa-4897-a2fa-47c9691829e1-kube-proxy\") pod \"kube-proxy-hfclr\" (UID: \"6f1bd94c-d2fa-4897-a2fa-47c9691829e1\") " pod="kube-system/kube-proxy-hfclr" Mar 7 00:56:11.741417 kubelet[3248]: I0307 00:56:11.741154 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6f1bd94c-d2fa-4897-a2fa-47c9691829e1-xtables-lock\") pod \"kube-proxy-hfclr\" (UID: \"6f1bd94c-d2fa-4897-a2fa-47c9691829e1\") " pod="kube-system/kube-proxy-hfclr" Mar 7 00:56:11.741417 kubelet[3248]: I0307 00:56:11.741213 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6f1bd94c-d2fa-4897-a2fa-47c9691829e1-lib-modules\") pod \"kube-proxy-hfclr\" (UID: \"6f1bd94c-d2fa-4897-a2fa-47c9691829e1\") " pod="kube-system/kube-proxy-hfclr" Mar 7 00:56:11.741417 kubelet[3248]: I0307 00:56:11.741254 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xk6r\" (UniqueName: \"kubernetes.io/projected/6f1bd94c-d2fa-4897-a2fa-47c9691829e1-kube-api-access-9xk6r\") pod \"kube-proxy-hfclr\" (UID: \"6f1bd94c-d2fa-4897-a2fa-47c9691829e1\") " pod="kube-system/kube-proxy-hfclr" Mar 7 00:56:11.782714 systemd[1]: Created slice kubepods-besteffort-podd616e452_89c8_4902_a4f6_09b65a643998.slice - libcontainer container kubepods-besteffort-podd616e452_89c8_4902_a4f6_09b65a643998.slice. 
Mar 7 00:56:11.841975 kubelet[3248]: I0307 00:56:11.841659 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d616e452-89c8-4902-a4f6-09b65a643998-var-lib-calico\") pod \"tigera-operator-6bf85f8dd-7tlcl\" (UID: \"d616e452-89c8-4902-a4f6-09b65a643998\") " pod="tigera-operator/tigera-operator-6bf85f8dd-7tlcl" Mar 7 00:56:11.841975 kubelet[3248]: I0307 00:56:11.841745 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xg46\" (UniqueName: \"kubernetes.io/projected/d616e452-89c8-4902-a4f6-09b65a643998-kube-api-access-6xg46\") pod \"tigera-operator-6bf85f8dd-7tlcl\" (UID: \"d616e452-89c8-4902-a4f6-09b65a643998\") " pod="tigera-operator/tigera-operator-6bf85f8dd-7tlcl" Mar 7 00:56:11.990497 containerd[2014]: time="2026-03-07T00:56:11.990344750Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hfclr,Uid:6f1bd94c-d2fa-4897-a2fa-47c9691829e1,Namespace:kube-system,Attempt:0,}" Mar 7 00:56:12.044627 containerd[2014]: time="2026-03-07T00:56:12.044327926Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:56:12.044627 containerd[2014]: time="2026-03-07T00:56:12.044520262Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:56:12.044857 containerd[2014]: time="2026-03-07T00:56:12.044599570Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:12.045371 containerd[2014]: time="2026-03-07T00:56:12.045183802Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:12.089513 systemd[1]: Started cri-containerd-8e84350788d2eddda2fd85368b4623bec66928284bd070aa1679ae94d10923a6.scope - libcontainer container 8e84350788d2eddda2fd85368b4623bec66928284bd070aa1679ae94d10923a6. Mar 7 00:56:12.095253 containerd[2014]: time="2026-03-07T00:56:12.094861487Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6bf85f8dd-7tlcl,Uid:d616e452-89c8-4902-a4f6-09b65a643998,Namespace:tigera-operator,Attempt:0,}" Mar 7 00:56:12.139819 containerd[2014]: time="2026-03-07T00:56:12.139769003Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hfclr,Uid:6f1bd94c-d2fa-4897-a2fa-47c9691829e1,Namespace:kube-system,Attempt:0,} returns sandbox id \"8e84350788d2eddda2fd85368b4623bec66928284bd070aa1679ae94d10923a6\"" Mar 7 00:56:12.152031 containerd[2014]: time="2026-03-07T00:56:12.151972139Z" level=info msg="CreateContainer within sandbox \"8e84350788d2eddda2fd85368b4623bec66928284bd070aa1679ae94d10923a6\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 7 00:56:12.163647 containerd[2014]: time="2026-03-07T00:56:12.163323695Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:56:12.164884 containerd[2014]: time="2026-03-07T00:56:12.164788331Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:56:12.165159 containerd[2014]: time="2026-03-07T00:56:12.165079991Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:12.166425 containerd[2014]: time="2026-03-07T00:56:12.166144175Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:12.192841 containerd[2014]: time="2026-03-07T00:56:12.192770915Z" level=info msg="CreateContainer within sandbox \"8e84350788d2eddda2fd85368b4623bec66928284bd070aa1679ae94d10923a6\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"d1479ee467365c5511a4e98efc4da607a3367425e40dda75d0f2424788a192e2\"" Mar 7 00:56:12.195708 systemd[1]: Started cri-containerd-8e41c291c8a9288492431138c39cb56934e87b6919a4bcfe03b426cd5d165155.scope - libcontainer container 8e41c291c8a9288492431138c39cb56934e87b6919a4bcfe03b426cd5d165155. Mar 7 00:56:12.198274 containerd[2014]: time="2026-03-07T00:56:12.197976827Z" level=info msg="StartContainer for \"d1479ee467365c5511a4e98efc4da607a3367425e40dda75d0f2424788a192e2\"" Mar 7 00:56:12.273524 systemd[1]: Started cri-containerd-d1479ee467365c5511a4e98efc4da607a3367425e40dda75d0f2424788a192e2.scope - libcontainer container d1479ee467365c5511a4e98efc4da607a3367425e40dda75d0f2424788a192e2. Mar 7 00:56:12.304603 containerd[2014]: time="2026-03-07T00:56:12.304535184Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6bf85f8dd-7tlcl,Uid:d616e452-89c8-4902-a4f6-09b65a643998,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"8e41c291c8a9288492431138c39cb56934e87b6919a4bcfe03b426cd5d165155\"" Mar 7 00:56:12.309427 containerd[2014]: time="2026-03-07T00:56:12.308947440Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Mar 7 00:56:12.349137 containerd[2014]: time="2026-03-07T00:56:12.348966540Z" level=info msg="StartContainer for \"d1479ee467365c5511a4e98efc4da607a3367425e40dda75d0f2424788a192e2\" returns successfully" Mar 7 00:56:13.496615 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1270613376.mount: Deactivated successfully. 
Mar 7 00:56:14.746558 containerd[2014]: time="2026-03-07T00:56:14.746477380Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:14.748760 containerd[2014]: time="2026-03-07T00:56:14.748348708Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=25071565" Mar 7 00:56:14.751018 containerd[2014]: time="2026-03-07T00:56:14.750955924Z" level=info msg="ImageCreate event name:\"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:14.756304 containerd[2014]: time="2026-03-07T00:56:14.756224932Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:14.758264 containerd[2014]: time="2026-03-07T00:56:14.757781836Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"25067560\" in 2.448776244s" Mar 7 00:56:14.758264 containerd[2014]: time="2026-03-07T00:56:14.757841524Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\"" Mar 7 00:56:14.770931 containerd[2014]: time="2026-03-07T00:56:14.770835256Z" level=info msg="CreateContainer within sandbox \"8e41c291c8a9288492431138c39cb56934e87b6919a4bcfe03b426cd5d165155\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 7 00:56:14.812605 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1890766621.mount: Deactivated successfully. 
Mar 7 00:56:14.816321 containerd[2014]: time="2026-03-07T00:56:14.816258400Z" level=info msg="CreateContainer within sandbox \"8e41c291c8a9288492431138c39cb56934e87b6919a4bcfe03b426cd5d165155\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"183bdac68c5b3022aab87767c5bab814d4f7a8138fbfd3b571668c5beba1e1e5\"" Mar 7 00:56:14.817714 containerd[2014]: time="2026-03-07T00:56:14.817601920Z" level=info msg="StartContainer for \"183bdac68c5b3022aab87767c5bab814d4f7a8138fbfd3b571668c5beba1e1e5\"" Mar 7 00:56:14.875523 systemd[1]: Started cri-containerd-183bdac68c5b3022aab87767c5bab814d4f7a8138fbfd3b571668c5beba1e1e5.scope - libcontainer container 183bdac68c5b3022aab87767c5bab814d4f7a8138fbfd3b571668c5beba1e1e5. Mar 7 00:56:14.925668 containerd[2014]: time="2026-03-07T00:56:14.925543529Z" level=info msg="StartContainer for \"183bdac68c5b3022aab87767c5bab814d4f7a8138fbfd3b571668c5beba1e1e5\" returns successfully" Mar 7 00:56:15.103667 kubelet[3248]: I0307 00:56:15.103261 3248 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-hfclr" podStartSLOduration=4.103177706 podStartE2EDuration="4.103177706s" podCreationTimestamp="2026-03-07 00:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:56:12.86675597 +0000 UTC m=+5.384782107" watchObservedRunningTime="2026-03-07 00:56:15.103177706 +0000 UTC m=+7.621203831" Mar 7 00:56:15.872381 kubelet[3248]: I0307 00:56:15.871456 3248 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6bf85f8dd-7tlcl" podStartSLOduration=2.419793661 podStartE2EDuration="4.871434341s" podCreationTimestamp="2026-03-07 00:56:11 +0000 UTC" firstStartedPulling="2026-03-07 00:56:12.308237352 +0000 UTC m=+4.826263465" lastFinishedPulling="2026-03-07 00:56:14.759878032 +0000 UTC m=+7.277904145" observedRunningTime="2026-03-07 
00:56:15.871325657 +0000 UTC m=+8.389351782" watchObservedRunningTime="2026-03-07 00:56:15.871434341 +0000 UTC m=+8.389460454" Mar 7 00:56:23.448592 sudo[2344]: pam_unix(sudo:session): session closed for user root Mar 7 00:56:23.529441 sshd[2341]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:23.536217 systemd[1]: sshd@6-172.31.19.200:22-20.161.92.111:39934.service: Deactivated successfully. Mar 7 00:56:23.540101 systemd[1]: session-7.scope: Deactivated successfully. Mar 7 00:56:23.542540 systemd[1]: session-7.scope: Consumed 9.781s CPU time, 156.1M memory peak, 0B memory swap peak. Mar 7 00:56:23.545410 systemd-logind[2003]: Session 7 logged out. Waiting for processes to exit. Mar 7 00:56:23.550147 systemd-logind[2003]: Removed session 7. Mar 7 00:56:33.721957 systemd[1]: Created slice kubepods-besteffort-pod83050cbb_4467_4b1d_86b6_3db5d867aab0.slice - libcontainer container kubepods-besteffort-pod83050cbb_4467_4b1d_86b6_3db5d867aab0.slice. Mar 7 00:56:33.788447 kubelet[3248]: I0307 00:56:33.788377 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83050cbb-4467-4b1d-86b6-3db5d867aab0-tigera-ca-bundle\") pod \"calico-typha-59c5d7b97c-q8fgp\" (UID: \"83050cbb-4467-4b1d-86b6-3db5d867aab0\") " pod="calico-system/calico-typha-59c5d7b97c-q8fgp" Mar 7 00:56:33.788447 kubelet[3248]: I0307 00:56:33.788452 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/83050cbb-4467-4b1d-86b6-3db5d867aab0-typha-certs\") pod \"calico-typha-59c5d7b97c-q8fgp\" (UID: \"83050cbb-4467-4b1d-86b6-3db5d867aab0\") " pod="calico-system/calico-typha-59c5d7b97c-q8fgp" Mar 7 00:56:33.789083 kubelet[3248]: I0307 00:56:33.788501 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q28s\" (UniqueName: 
\"kubernetes.io/projected/83050cbb-4467-4b1d-86b6-3db5d867aab0-kube-api-access-2q28s\") pod \"calico-typha-59c5d7b97c-q8fgp\" (UID: \"83050cbb-4467-4b1d-86b6-3db5d867aab0\") " pod="calico-system/calico-typha-59c5d7b97c-q8fgp" Mar 7 00:56:33.957644 systemd[1]: Created slice kubepods-besteffort-pod34928945_d97f_42e7_9188_1974465be16f.slice - libcontainer container kubepods-besteffort-pod34928945_d97f_42e7_9188_1974465be16f.slice. Mar 7 00:56:33.990608 kubelet[3248]: I0307 00:56:33.990430 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34928945-d97f-42e7-9188-1974465be16f-lib-modules\") pod \"calico-node-h2t96\" (UID: \"34928945-d97f-42e7-9188-1974465be16f\") " pod="calico-system/calico-node-h2t96" Mar 7 00:56:33.990608 kubelet[3248]: I0307 00:56:33.990550 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/34928945-d97f-42e7-9188-1974465be16f-bpffs\") pod \"calico-node-h2t96\" (UID: \"34928945-d97f-42e7-9188-1974465be16f\") " pod="calico-system/calico-node-h2t96" Mar 7 00:56:33.990802 kubelet[3248]: I0307 00:56:33.990629 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/34928945-d97f-42e7-9188-1974465be16f-cni-bin-dir\") pod \"calico-node-h2t96\" (UID: \"34928945-d97f-42e7-9188-1974465be16f\") " pod="calico-system/calico-node-h2t96" Mar 7 00:56:33.990802 kubelet[3248]: I0307 00:56:33.990762 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/34928945-d97f-42e7-9188-1974465be16f-node-certs\") pod \"calico-node-h2t96\" (UID: \"34928945-d97f-42e7-9188-1974465be16f\") " pod="calico-system/calico-node-h2t96" Mar 7 00:56:33.990913 kubelet[3248]: I0307 00:56:33.990818 3248 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/34928945-d97f-42e7-9188-1974465be16f-cni-net-dir\") pod \"calico-node-h2t96\" (UID: \"34928945-d97f-42e7-9188-1974465be16f\") " pod="calico-system/calico-node-h2t96" Mar 7 00:56:33.990971 kubelet[3248]: I0307 00:56:33.990939 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/34928945-d97f-42e7-9188-1974465be16f-nodeproc\") pod \"calico-node-h2t96\" (UID: \"34928945-d97f-42e7-9188-1974465be16f\") " pod="calico-system/calico-node-h2t96" Mar 7 00:56:33.991022 kubelet[3248]: I0307 00:56:33.990991 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/34928945-d97f-42e7-9188-1974465be16f-xtables-lock\") pod \"calico-node-h2t96\" (UID: \"34928945-d97f-42e7-9188-1974465be16f\") " pod="calico-system/calico-node-h2t96" Mar 7 00:56:33.991087 kubelet[3248]: I0307 00:56:33.991039 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/34928945-d97f-42e7-9188-1974465be16f-policysync\") pod \"calico-node-h2t96\" (UID: \"34928945-d97f-42e7-9188-1974465be16f\") " pod="calico-system/calico-node-h2t96" Mar 7 00:56:33.991141 kubelet[3248]: I0307 00:56:33.991105 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34928945-d97f-42e7-9188-1974465be16f-tigera-ca-bundle\") pod \"calico-node-h2t96\" (UID: \"34928945-d97f-42e7-9188-1974465be16f\") " pod="calico-system/calico-node-h2t96" Mar 7 00:56:33.994658 kubelet[3248]: I0307 00:56:33.991167 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/34928945-d97f-42e7-9188-1974465be16f-var-run-calico\") pod \"calico-node-h2t96\" (UID: \"34928945-d97f-42e7-9188-1974465be16f\") " pod="calico-system/calico-node-h2t96" Mar 7 00:56:33.994658 kubelet[3248]: I0307 00:56:33.994328 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqz8z\" (UniqueName: \"kubernetes.io/projected/34928945-d97f-42e7-9188-1974465be16f-kube-api-access-fqz8z\") pod \"calico-node-h2t96\" (UID: \"34928945-d97f-42e7-9188-1974465be16f\") " pod="calico-system/calico-node-h2t96" Mar 7 00:56:33.994658 kubelet[3248]: I0307 00:56:33.994648 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/34928945-d97f-42e7-9188-1974465be16f-sys-fs\") pod \"calico-node-h2t96\" (UID: \"34928945-d97f-42e7-9188-1974465be16f\") " pod="calico-system/calico-node-h2t96" Mar 7 00:56:33.994928 kubelet[3248]: I0307 00:56:33.994827 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/34928945-d97f-42e7-9188-1974465be16f-var-lib-calico\") pod \"calico-node-h2t96\" (UID: \"34928945-d97f-42e7-9188-1974465be16f\") " pod="calico-system/calico-node-h2t96" Mar 7 00:56:33.996241 kubelet[3248]: I0307 00:56:33.994996 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/34928945-d97f-42e7-9188-1974465be16f-cni-log-dir\") pod \"calico-node-h2t96\" (UID: \"34928945-d97f-42e7-9188-1974465be16f\") " pod="calico-system/calico-node-h2t96" Mar 7 00:56:33.996241 kubelet[3248]: I0307 00:56:33.995181 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: 
\"kubernetes.io/host-path/34928945-d97f-42e7-9188-1974465be16f-flexvol-driver-host\") pod \"calico-node-h2t96\" (UID: \"34928945-d97f-42e7-9188-1974465be16f\") " pod="calico-system/calico-node-h2t96" Mar 7 00:56:34.031057 containerd[2014]: time="2026-03-07T00:56:34.030943928Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-59c5d7b97c-q8fgp,Uid:83050cbb-4467-4b1d-86b6-3db5d867aab0,Namespace:calico-system,Attempt:0,}" Mar 7 00:56:34.108628 kubelet[3248]: E0307 00:56:34.108569 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.108628 kubelet[3248]: W0307 00:56:34.108615 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.109410 kubelet[3248]: E0307 00:56:34.108674 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.109410 kubelet[3248]: E0307 00:56:34.109080 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.109410 kubelet[3248]: W0307 00:56:34.109100 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.109410 kubelet[3248]: E0307 00:56:34.109125 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.109822 kubelet[3248]: E0307 00:56:34.109786 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.109822 kubelet[3248]: W0307 00:56:34.109817 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.111231 kubelet[3248]: E0307 00:56:34.109846 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.111231 kubelet[3248]: E0307 00:56:34.111131 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.111231 kubelet[3248]: W0307 00:56:34.111155 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.111231 kubelet[3248]: E0307 00:56:34.111185 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.112813 kubelet[3248]: E0307 00:56:34.112560 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.112813 kubelet[3248]: W0307 00:56:34.112600 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.112813 kubelet[3248]: E0307 00:56:34.112634 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.118237 kubelet[3248]: E0307 00:56:34.116124 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.118237 kubelet[3248]: W0307 00:56:34.116161 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.118237 kubelet[3248]: E0307 00:56:34.116256 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.118237 kubelet[3248]: E0307 00:56:34.116778 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.118237 kubelet[3248]: W0307 00:56:34.116797 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.118237 kubelet[3248]: E0307 00:56:34.116831 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.118237 kubelet[3248]: E0307 00:56:34.117632 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.118237 kubelet[3248]: W0307 00:56:34.117651 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.118237 kubelet[3248]: E0307 00:56:34.117673 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.120447 kubelet[3248]: E0307 00:56:34.120316 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.120447 kubelet[3248]: W0307 00:56:34.120355 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.120447 kubelet[3248]: E0307 00:56:34.120397 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.122244 kubelet[3248]: E0307 00:56:34.121654 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.122244 kubelet[3248]: W0307 00:56:34.121693 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.122244 kubelet[3248]: E0307 00:56:34.121733 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.127295 kubelet[3248]: E0307 00:56:34.122962 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.127295 kubelet[3248]: W0307 00:56:34.123002 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.127295 kubelet[3248]: E0307 00:56:34.123033 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.127295 kubelet[3248]: E0307 00:56:34.126561 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.127295 kubelet[3248]: W0307 00:56:34.126594 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.127295 kubelet[3248]: E0307 00:56:34.126625 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.129029 kubelet[3248]: E0307 00:56:34.128062 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.129029 kubelet[3248]: W0307 00:56:34.128104 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.129029 kubelet[3248]: E0307 00:56:34.128134 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.129475 kubelet[3248]: E0307 00:56:34.129429 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.129475 kubelet[3248]: W0307 00:56:34.129467 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.129584 kubelet[3248]: E0307 00:56:34.129495 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.132484 kubelet[3248]: E0307 00:56:34.132424 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.132484 kubelet[3248]: W0307 00:56:34.132468 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.132650 kubelet[3248]: E0307 00:56:34.132517 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.134051 kubelet[3248]: E0307 00:56:34.133996 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.134051 kubelet[3248]: W0307 00:56:34.134037 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.134251 kubelet[3248]: E0307 00:56:34.134071 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.144506 kubelet[3248]: E0307 00:56:34.137839 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.144506 kubelet[3248]: W0307 00:56:34.137882 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.144506 kubelet[3248]: E0307 00:56:34.137916 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.144506 kubelet[3248]: E0307 00:56:34.139373 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.144506 kubelet[3248]: W0307 00:56:34.139418 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.144506 kubelet[3248]: E0307 00:56:34.139455 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.144506 kubelet[3248]: E0307 00:56:34.142073 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.144506 kubelet[3248]: W0307 00:56:34.142101 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.144506 kubelet[3248]: E0307 00:56:34.142133 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.156918 kubelet[3248]: E0307 00:56:34.145074 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.156918 kubelet[3248]: W0307 00:56:34.145114 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.156918 kubelet[3248]: E0307 00:56:34.145310 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.157124 containerd[2014]: time="2026-03-07T00:56:34.136072532Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:56:34.157124 containerd[2014]: time="2026-03-07T00:56:34.136223768Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:56:34.157124 containerd[2014]: time="2026-03-07T00:56:34.136257632Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:34.157124 containerd[2014]: time="2026-03-07T00:56:34.136417100Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:34.164032 kubelet[3248]: E0307 00:56:34.162440 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.164032 kubelet[3248]: W0307 00:56:34.162477 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.164032 kubelet[3248]: E0307 00:56:34.162531 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.164032 kubelet[3248]: E0307 00:56:34.163667 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.164032 kubelet[3248]: W0307 00:56:34.163693 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.164032 kubelet[3248]: E0307 00:56:34.163724 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.166246 kubelet[3248]: E0307 00:56:34.164861 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.166246 kubelet[3248]: W0307 00:56:34.164898 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.166246 kubelet[3248]: E0307 00:56:34.164934 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.167571 kubelet[3248]: E0307 00:56:34.167031 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.167571 kubelet[3248]: W0307 00:56:34.167074 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.167571 kubelet[3248]: E0307 00:56:34.167109 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.171292 kubelet[3248]: E0307 00:56:34.169762 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.171292 kubelet[3248]: W0307 00:56:34.169807 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.171292 kubelet[3248]: E0307 00:56:34.169841 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.178244 kubelet[3248]: E0307 00:56:34.177654 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.178244 kubelet[3248]: W0307 00:56:34.177694 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.178244 kubelet[3248]: E0307 00:56:34.177726 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.185555 kubelet[3248]: E0307 00:56:34.185309 3248 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6p9l4" podUID="e10fee9c-8cdc-4998-ad44-21b04777940b" Mar 7 00:56:34.212256 kubelet[3248]: E0307 00:56:34.211767 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.212548 kubelet[3248]: W0307 00:56:34.212438 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.212548 kubelet[3248]: E0307 00:56:34.212489 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.247315 systemd[1]: Started cri-containerd-5e3ada98a12709fa01ff36a4403185f261dd11936a8194384294e04fca89e1fe.scope - libcontainer container 5e3ada98a12709fa01ff36a4403185f261dd11936a8194384294e04fca89e1fe. 
Mar 7 00:56:34.269769 containerd[2014]: time="2026-03-07T00:56:34.269659701Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-h2t96,Uid:34928945-d97f-42e7-9188-1974465be16f,Namespace:calico-system,Attempt:0,}" Mar 7 00:56:34.276998 kubelet[3248]: E0307 00:56:34.276521 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.276998 kubelet[3248]: W0307 00:56:34.276556 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.276998 kubelet[3248]: E0307 00:56:34.276588 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.277958 kubelet[3248]: E0307 00:56:34.277541 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.277958 kubelet[3248]: W0307 00:56:34.277572 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.277958 kubelet[3248]: E0307 00:56:34.277648 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.279141 kubelet[3248]: E0307 00:56:34.278987 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.279429 kubelet[3248]: W0307 00:56:34.279320 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.279623 kubelet[3248]: E0307 00:56:34.279563 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.280864 kubelet[3248]: E0307 00:56:34.280493 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.280864 kubelet[3248]: W0307 00:56:34.280526 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.280864 kubelet[3248]: E0307 00:56:34.280564 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.281794 kubelet[3248]: E0307 00:56:34.281458 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.281794 kubelet[3248]: W0307 00:56:34.281490 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.281794 kubelet[3248]: E0307 00:56:34.281520 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.282634 kubelet[3248]: E0307 00:56:34.282576 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.284415 kubelet[3248]: W0307 00:56:34.282810 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.284415 kubelet[3248]: E0307 00:56:34.282848 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.284415 kubelet[3248]: E0307 00:56:34.283423 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.284415 kubelet[3248]: W0307 00:56:34.283447 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.284415 kubelet[3248]: E0307 00:56:34.283472 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.285616 kubelet[3248]: E0307 00:56:34.284180 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.285616 kubelet[3248]: W0307 00:56:34.285328 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.285616 kubelet[3248]: E0307 00:56:34.285373 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.286133 kubelet[3248]: E0307 00:56:34.286106 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.286449 kubelet[3248]: W0307 00:56:34.286261 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.286449 kubelet[3248]: E0307 00:56:34.286296 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.287640 kubelet[3248]: E0307 00:56:34.287386 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.287640 kubelet[3248]: W0307 00:56:34.287419 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.287640 kubelet[3248]: E0307 00:56:34.287450 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.288250 kubelet[3248]: E0307 00:56:34.288182 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.288566 kubelet[3248]: W0307 00:56:34.288348 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.288566 kubelet[3248]: E0307 00:56:34.288384 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.288923 kubelet[3248]: E0307 00:56:34.288899 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.289238 kubelet[3248]: W0307 00:56:34.289062 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.289238 kubelet[3248]: E0307 00:56:34.289096 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.289657 kubelet[3248]: E0307 00:56:34.289633 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.289980 kubelet[3248]: W0307 00:56:34.289744 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.289980 kubelet[3248]: E0307 00:56:34.289810 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.290460 kubelet[3248]: E0307 00:56:34.290433 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.291248 kubelet[3248]: W0307 00:56:34.290611 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.291248 kubelet[3248]: E0307 00:56:34.290649 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.291951 kubelet[3248]: E0307 00:56:34.291700 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.291951 kubelet[3248]: W0307 00:56:34.291732 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.291951 kubelet[3248]: E0307 00:56:34.291768 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.293247 kubelet[3248]: E0307 00:56:34.292481 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.293546 kubelet[3248]: W0307 00:56:34.293505 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.293911 kubelet[3248]: E0307 00:56:34.293667 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.295252 kubelet[3248]: E0307 00:56:34.294470 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.295252 kubelet[3248]: W0307 00:56:34.294520 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.295252 kubelet[3248]: E0307 00:56:34.294554 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.295703 kubelet[3248]: E0307 00:56:34.295674 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.295811 kubelet[3248]: W0307 00:56:34.295786 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.296517 kubelet[3248]: E0307 00:56:34.296473 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.297088 kubelet[3248]: E0307 00:56:34.297057 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.297450 kubelet[3248]: W0307 00:56:34.297250 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.297450 kubelet[3248]: E0307 00:56:34.297292 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.297972 kubelet[3248]: E0307 00:56:34.297812 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.297972 kubelet[3248]: W0307 00:56:34.297838 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.297972 kubelet[3248]: E0307 00:56:34.297862 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.300022 kubelet[3248]: E0307 00:56:34.299972 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.300022 kubelet[3248]: W0307 00:56:34.300008 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.301400 kubelet[3248]: E0307 00:56:34.300041 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.301400 kubelet[3248]: I0307 00:56:34.300097 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s2v8\" (UniqueName: \"kubernetes.io/projected/e10fee9c-8cdc-4998-ad44-21b04777940b-kube-api-access-4s2v8\") pod \"csi-node-driver-6p9l4\" (UID: \"e10fee9c-8cdc-4998-ad44-21b04777940b\") " pod="calico-system/csi-node-driver-6p9l4" Mar 7 00:56:34.301400 kubelet[3248]: E0307 00:56:34.300631 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.301400 kubelet[3248]: W0307 00:56:34.300658 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.301400 kubelet[3248]: E0307 00:56:34.300684 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.301400 kubelet[3248]: I0307 00:56:34.300721 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e10fee9c-8cdc-4998-ad44-21b04777940b-socket-dir\") pod \"csi-node-driver-6p9l4\" (UID: \"e10fee9c-8cdc-4998-ad44-21b04777940b\") " pod="calico-system/csi-node-driver-6p9l4" Mar 7 00:56:34.303605 kubelet[3248]: E0307 00:56:34.303508 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.303605 kubelet[3248]: W0307 00:56:34.303594 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.304087 kubelet[3248]: E0307 00:56:34.303667 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.304087 kubelet[3248]: I0307 00:56:34.303812 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e10fee9c-8cdc-4998-ad44-21b04777940b-kubelet-dir\") pod \"csi-node-driver-6p9l4\" (UID: \"e10fee9c-8cdc-4998-ad44-21b04777940b\") " pod="calico-system/csi-node-driver-6p9l4" Mar 7 00:56:34.304377 kubelet[3248]: E0307 00:56:34.304352 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.306116 kubelet[3248]: W0307 00:56:34.305803 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.306116 kubelet[3248]: E0307 00:56:34.305864 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.307567 kubelet[3248]: E0307 00:56:34.307453 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.307567 kubelet[3248]: W0307 00:56:34.307492 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.307567 kubelet[3248]: E0307 00:56:34.307525 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.311406 kubelet[3248]: E0307 00:56:34.311067 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.311406 kubelet[3248]: W0307 00:56:34.311109 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.311406 kubelet[3248]: E0307 00:56:34.311141 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.312147 kubelet[3248]: E0307 00:56:34.311831 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.312147 kubelet[3248]: W0307 00:56:34.311879 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.312147 kubelet[3248]: E0307 00:56:34.311910 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.312147 kubelet[3248]: I0307 00:56:34.311997 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/e10fee9c-8cdc-4998-ad44-21b04777940b-varrun\") pod \"csi-node-driver-6p9l4\" (UID: \"e10fee9c-8cdc-4998-ad44-21b04777940b\") " pod="calico-system/csi-node-driver-6p9l4" Mar 7 00:56:34.314383 kubelet[3248]: E0307 00:56:34.313383 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.314383 kubelet[3248]: W0307 00:56:34.313531 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.314383 kubelet[3248]: E0307 00:56:34.313567 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.316411 kubelet[3248]: E0307 00:56:34.316238 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.316411 kubelet[3248]: W0307 00:56:34.316278 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.316411 kubelet[3248]: E0307 00:56:34.316310 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.318339 kubelet[3248]: E0307 00:56:34.318272 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.318339 kubelet[3248]: W0307 00:56:34.318321 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.318919 kubelet[3248]: E0307 00:56:34.318358 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.320894 kubelet[3248]: E0307 00:56:34.320447 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.320894 kubelet[3248]: W0307 00:56:34.320508 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.320894 kubelet[3248]: E0307 00:56:34.320544 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.325430 kubelet[3248]: E0307 00:56:34.322819 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.325430 kubelet[3248]: W0307 00:56:34.322856 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.325430 kubelet[3248]: E0307 00:56:34.322901 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.330679 kubelet[3248]: E0307 00:56:34.330632 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.330679 kubelet[3248]: W0307 00:56:34.330667 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.330916 kubelet[3248]: E0307 00:56:34.330702 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.330916 kubelet[3248]: I0307 00:56:34.330761 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e10fee9c-8cdc-4998-ad44-21b04777940b-registration-dir\") pod \"csi-node-driver-6p9l4\" (UID: \"e10fee9c-8cdc-4998-ad44-21b04777940b\") " pod="calico-system/csi-node-driver-6p9l4" Mar 7 00:56:34.332584 kubelet[3248]: E0307 00:56:34.332524 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.332584 kubelet[3248]: W0307 00:56:34.332567 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.332750 kubelet[3248]: E0307 00:56:34.332600 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.334258 kubelet[3248]: E0307 00:56:34.334139 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.334258 kubelet[3248]: W0307 00:56:34.334246 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.334666 kubelet[3248]: E0307 00:56:34.334281 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.355771 containerd[2014]: time="2026-03-07T00:56:34.355248813Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:56:34.355771 containerd[2014]: time="2026-03-07T00:56:34.355370697Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:56:34.356278 containerd[2014]: time="2026-03-07T00:56:34.355423197Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:34.356278 containerd[2014]: time="2026-03-07T00:56:34.355617129Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:34.400523 systemd[1]: Started cri-containerd-988047ffb9d5e0c217ff47171e5407c408bd8081e9671fef6220389325132c47.scope - libcontainer container 988047ffb9d5e0c217ff47171e5407c408bd8081e9671fef6220389325132c47. Mar 7 00:56:34.433133 kubelet[3248]: E0307 00:56:34.432806 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.433133 kubelet[3248]: W0307 00:56:34.432842 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.433133 kubelet[3248]: E0307 00:56:34.432874 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.434562 kubelet[3248]: E0307 00:56:34.434523 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.434900 kubelet[3248]: W0307 00:56:34.434747 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.434900 kubelet[3248]: E0307 00:56:34.434795 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.437823 kubelet[3248]: E0307 00:56:34.437754 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.437823 kubelet[3248]: W0307 00:56:34.437814 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.438004 kubelet[3248]: E0307 00:56:34.437850 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.439040 kubelet[3248]: E0307 00:56:34.438782 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.439040 kubelet[3248]: W0307 00:56:34.438842 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.439040 kubelet[3248]: E0307 00:56:34.438874 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.443308 kubelet[3248]: E0307 00:56:34.443249 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.443584 kubelet[3248]: W0307 00:56:34.443312 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.443584 kubelet[3248]: E0307 00:56:34.443348 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.444062 kubelet[3248]: E0307 00:56:34.443948 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.444062 kubelet[3248]: W0307 00:56:34.444057 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.444376 kubelet[3248]: E0307 00:56:34.444086 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.445849 kubelet[3248]: E0307 00:56:34.445767 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.445849 kubelet[3248]: W0307 00:56:34.445804 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.445849 kubelet[3248]: E0307 00:56:34.445837 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.448424 kubelet[3248]: E0307 00:56:34.447971 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.448424 kubelet[3248]: W0307 00:56:34.448007 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.448424 kubelet[3248]: E0307 00:56:34.448040 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.454702 kubelet[3248]: E0307 00:56:34.450180 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.454702 kubelet[3248]: W0307 00:56:34.450240 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.454702 kubelet[3248]: E0307 00:56:34.450271 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.454702 kubelet[3248]: E0307 00:56:34.452050 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.454702 kubelet[3248]: W0307 00:56:34.452075 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.454702 kubelet[3248]: E0307 00:56:34.452146 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.454702 kubelet[3248]: E0307 00:56:34.452641 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.454702 kubelet[3248]: W0307 00:56:34.452697 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.454702 kubelet[3248]: E0307 00:56:34.452786 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.454702 kubelet[3248]: E0307 00:56:34.453357 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.455301 kubelet[3248]: W0307 00:56:34.453381 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.455301 kubelet[3248]: E0307 00:56:34.453406 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.455301 kubelet[3248]: E0307 00:56:34.453905 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.455301 kubelet[3248]: W0307 00:56:34.453925 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.455301 kubelet[3248]: E0307 00:56:34.453946 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.455301 kubelet[3248]: E0307 00:56:34.454421 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.455301 kubelet[3248]: W0307 00:56:34.454443 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.455301 kubelet[3248]: E0307 00:56:34.454467 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.455301 kubelet[3248]: E0307 00:56:34.454931 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.455301 kubelet[3248]: W0307 00:56:34.454950 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.455787 kubelet[3248]: E0307 00:56:34.454972 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.455787 kubelet[3248]: E0307 00:56:34.455635 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.455787 kubelet[3248]: W0307 00:56:34.455655 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.455787 kubelet[3248]: E0307 00:56:34.455678 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.458255 kubelet[3248]: E0307 00:56:34.456651 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.458255 kubelet[3248]: W0307 00:56:34.456687 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.458255 kubelet[3248]: E0307 00:56:34.456718 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.458255 kubelet[3248]: E0307 00:56:34.457308 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.458255 kubelet[3248]: W0307 00:56:34.457330 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.458255 kubelet[3248]: E0307 00:56:34.457357 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.464417 kubelet[3248]: E0307 00:56:34.464230 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.464417 kubelet[3248]: W0307 00:56:34.464265 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.464417 kubelet[3248]: E0307 00:56:34.464295 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.466045 kubelet[3248]: E0307 00:56:34.465583 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.466045 kubelet[3248]: W0307 00:56:34.465613 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.466045 kubelet[3248]: E0307 00:56:34.465643 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.469094 kubelet[3248]: E0307 00:56:34.468304 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.469094 kubelet[3248]: W0307 00:56:34.468376 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.469094 kubelet[3248]: E0307 00:56:34.468713 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.471319 kubelet[3248]: E0307 00:56:34.471257 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.471677 kubelet[3248]: W0307 00:56:34.471417 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.471677 kubelet[3248]: E0307 00:56:34.471455 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.473163 kubelet[3248]: E0307 00:56:34.472987 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.473163 kubelet[3248]: W0307 00:56:34.473058 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.473693 kubelet[3248]: E0307 00:56:34.473364 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.475533 kubelet[3248]: E0307 00:56:34.475287 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.475533 kubelet[3248]: W0307 00:56:34.475446 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.475533 kubelet[3248]: E0307 00:56:34.475484 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:34.477024 kubelet[3248]: E0307 00:56:34.476912 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.477024 kubelet[3248]: W0307 00:56:34.476941 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.477024 kubelet[3248]: E0307 00:56:34.476970 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.492018 containerd[2014]: time="2026-03-07T00:56:34.491843638Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-59c5d7b97c-q8fgp,Uid:83050cbb-4467-4b1d-86b6-3db5d867aab0,Namespace:calico-system,Attempt:0,} returns sandbox id \"5e3ada98a12709fa01ff36a4403185f261dd11936a8194384294e04fca89e1fe\"" Mar 7 00:56:34.493739 kubelet[3248]: E0307 00:56:34.493610 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:34.493739 kubelet[3248]: W0307 00:56:34.493643 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:34.493739 kubelet[3248]: E0307 00:56:34.493677 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:34.500890 containerd[2014]: time="2026-03-07T00:56:34.500399878Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Mar 7 00:56:34.525969 containerd[2014]: time="2026-03-07T00:56:34.525793234Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-h2t96,Uid:34928945-d97f-42e7-9188-1974465be16f,Namespace:calico-system,Attempt:0,} returns sandbox id \"988047ffb9d5e0c217ff47171e5407c408bd8081e9671fef6220389325132c47\"" Mar 7 00:56:35.778034 kubelet[3248]: E0307 00:56:35.777690 3248 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6p9l4" podUID="e10fee9c-8cdc-4998-ad44-21b04777940b" Mar 7 00:56:35.799617 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3906185429.mount: Deactivated successfully. 
Mar 7 00:56:36.706870 containerd[2014]: time="2026-03-07T00:56:36.706770697Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:36.709558 containerd[2014]: time="2026-03-07T00:56:36.709488517Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=33865174" Mar 7 00:56:36.710822 containerd[2014]: time="2026-03-07T00:56:36.710739961Z" level=info msg="ImageCreate event name:\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:36.718294 containerd[2014]: time="2026-03-07T00:56:36.717800713Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:36.719835 containerd[2014]: time="2026-03-07T00:56:36.719765077Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"33865028\" in 2.219301431s" Mar 7 00:56:36.720003 containerd[2014]: time="2026-03-07T00:56:36.719833549Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\"" Mar 7 00:56:36.724420 containerd[2014]: time="2026-03-07T00:56:36.724346785Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Mar 7 00:56:36.770174 containerd[2014]: time="2026-03-07T00:56:36.769845049Z" level=info msg="CreateContainer within sandbox \"5e3ada98a12709fa01ff36a4403185f261dd11936a8194384294e04fca89e1fe\" for container 
&ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 7 00:56:36.794807 containerd[2014]: time="2026-03-07T00:56:36.794727841Z" level=info msg="CreateContainer within sandbox \"5e3ada98a12709fa01ff36a4403185f261dd11936a8194384294e04fca89e1fe\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"66d06435c7f80df6102d9d61f712887cefa2ab842cb545b20450b1d72aaa6b44\"" Mar 7 00:56:36.795906 containerd[2014]: time="2026-03-07T00:56:36.795846589Z" level=info msg="StartContainer for \"66d06435c7f80df6102d9d61f712887cefa2ab842cb545b20450b1d72aaa6b44\"" Mar 7 00:56:36.861652 systemd[1]: Started cri-containerd-66d06435c7f80df6102d9d61f712887cefa2ab842cb545b20450b1d72aaa6b44.scope - libcontainer container 66d06435c7f80df6102d9d61f712887cefa2ab842cb545b20450b1d72aaa6b44. Mar 7 00:56:36.935134 containerd[2014]: time="2026-03-07T00:56:36.933696758Z" level=info msg="StartContainer for \"66d06435c7f80df6102d9d61f712887cefa2ab842cb545b20450b1d72aaa6b44\" returns successfully" Mar 7 00:56:36.964402 kubelet[3248]: I0307 00:56:36.964173 3248 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-59c5d7b97c-q8fgp" podStartSLOduration=1.7379116190000001 podStartE2EDuration="3.964149638s" podCreationTimestamp="2026-03-07 00:56:33 +0000 UTC" firstStartedPulling="2026-03-07 00:56:34.497766862 +0000 UTC m=+27.015792987" lastFinishedPulling="2026-03-07 00:56:36.724004869 +0000 UTC m=+29.242031006" observedRunningTime="2026-03-07 00:56:36.96398381 +0000 UTC m=+29.482009959" watchObservedRunningTime="2026-03-07 00:56:36.964149638 +0000 UTC m=+29.482175775" Mar 7 00:56:37.017809 kubelet[3248]: E0307 00:56:37.017757 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.017809 kubelet[3248]: W0307 00:56:37.017796 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, 
args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.018031 kubelet[3248]: E0307 00:56:37.017829 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:37.018508 kubelet[3248]: E0307 00:56:37.018454 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.018633 kubelet[3248]: W0307 00:56:37.018508 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.018633 kubelet[3248]: E0307 00:56:37.018561 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:37.019772 kubelet[3248]: E0307 00:56:37.019611 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.019772 kubelet[3248]: W0307 00:56:37.019677 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.019772 kubelet[3248]: E0307 00:56:37.019711 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:37.020861 kubelet[3248]: E0307 00:56:37.020809 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.020861 kubelet[3248]: W0307 00:56:37.020858 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.021021 kubelet[3248]: E0307 00:56:37.020887 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:37.021467 kubelet[3248]: E0307 00:56:37.021436 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.021559 kubelet[3248]: W0307 00:56:37.021465 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.021559 kubelet[3248]: E0307 00:56:37.021506 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:37.022697 kubelet[3248]: E0307 00:56:37.022646 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.022697 kubelet[3248]: W0307 00:56:37.022682 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.022881 kubelet[3248]: E0307 00:56:37.022715 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:37.023992 kubelet[3248]: E0307 00:56:37.023937 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.023992 kubelet[3248]: W0307 00:56:37.023975 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.024280 kubelet[3248]: E0307 00:56:37.024008 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:37.024280 kubelet[3248]: E0307 00:56:37.024621 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.024280 kubelet[3248]: W0307 00:56:37.024642 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.026930 kubelet[3248]: E0307 00:56:37.024667 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:37.026930 kubelet[3248]: E0307 00:56:37.025058 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.026930 kubelet[3248]: W0307 00:56:37.025076 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.026930 kubelet[3248]: E0307 00:56:37.025098 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:37.026930 kubelet[3248]: E0307 00:56:37.025479 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.026930 kubelet[3248]: W0307 00:56:37.025497 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.026930 kubelet[3248]: E0307 00:56:37.025518 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:37.026930 kubelet[3248]: E0307 00:56:37.025888 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.026930 kubelet[3248]: W0307 00:56:37.025906 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.026930 kubelet[3248]: E0307 00:56:37.025926 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:37.028118 kubelet[3248]: E0307 00:56:37.026297 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.028118 kubelet[3248]: W0307 00:56:37.026314 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.028118 kubelet[3248]: E0307 00:56:37.026335 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:37.028118 kubelet[3248]: E0307 00:56:37.027276 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.028118 kubelet[3248]: W0307 00:56:37.027305 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.028118 kubelet[3248]: E0307 00:56:37.027333 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:37.029302 kubelet[3248]: E0307 00:56:37.028986 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.029302 kubelet[3248]: W0307 00:56:37.029018 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.029302 kubelet[3248]: E0307 00:56:37.029049 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:37.031255 kubelet[3248]: E0307 00:56:37.029565 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.031255 kubelet[3248]: W0307 00:56:37.029586 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.031255 kubelet[3248]: E0307 00:56:37.029612 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:37.082078 kubelet[3248]: E0307 00:56:37.082042 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.082078 kubelet[3248]: W0307 00:56:37.082133 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.082078 kubelet[3248]: E0307 00:56:37.082167 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:37.083919 kubelet[3248]: E0307 00:56:37.083749 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.083919 kubelet[3248]: W0307 00:56:37.083783 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.083919 kubelet[3248]: E0307 00:56:37.083816 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:37.084606 kubelet[3248]: E0307 00:56:37.084474 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.084606 kubelet[3248]: W0307 00:56:37.084533 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.084606 kubelet[3248]: E0307 00:56:37.084563 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:37.086308 kubelet[3248]: E0307 00:56:37.085017 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.086308 kubelet[3248]: W0307 00:56:37.085037 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.086308 kubelet[3248]: E0307 00:56:37.085061 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:37.086308 kubelet[3248]: E0307 00:56:37.085395 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.086308 kubelet[3248]: W0307 00:56:37.085412 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.086308 kubelet[3248]: E0307 00:56:37.085432 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:37.087333 kubelet[3248]: E0307 00:56:37.087290 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.087333 kubelet[3248]: W0307 00:56:37.087329 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.087467 kubelet[3248]: E0307 00:56:37.087363 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:37.087945 kubelet[3248]: E0307 00:56:37.087908 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.087945 kubelet[3248]: W0307 00:56:37.087939 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.088176 kubelet[3248]: E0307 00:56:37.087966 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:37.088474 kubelet[3248]: E0307 00:56:37.088438 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.088474 kubelet[3248]: W0307 00:56:37.088468 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.088655 kubelet[3248]: E0307 00:56:37.088493 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:37.090454 kubelet[3248]: E0307 00:56:37.090391 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.090454 kubelet[3248]: W0307 00:56:37.090430 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.090887 kubelet[3248]: E0307 00:56:37.090471 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:37.091671 kubelet[3248]: E0307 00:56:37.091621 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.091671 kubelet[3248]: W0307 00:56:37.091659 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.091859 kubelet[3248]: E0307 00:56:37.091691 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:37.093418 kubelet[3248]: E0307 00:56:37.093361 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.093418 kubelet[3248]: W0307 00:56:37.093401 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.093621 kubelet[3248]: E0307 00:56:37.093434 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:37.093979 kubelet[3248]: E0307 00:56:37.093847 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.093979 kubelet[3248]: W0307 00:56:37.093877 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.093979 kubelet[3248]: E0307 00:56:37.093901 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:37.095582 kubelet[3248]: E0307 00:56:37.095396 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.095582 kubelet[3248]: W0307 00:56:37.095465 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.095582 kubelet[3248]: E0307 00:56:37.095500 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:37.097278 kubelet[3248]: E0307 00:56:37.096630 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.097278 kubelet[3248]: W0307 00:56:37.096664 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.097278 kubelet[3248]: E0307 00:56:37.096694 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:37.098401 kubelet[3248]: E0307 00:56:37.097971 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.098401 kubelet[3248]: W0307 00:56:37.098003 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.098401 kubelet[3248]: E0307 00:56:37.098033 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:37.100888 kubelet[3248]: E0307 00:56:37.100709 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.100888 kubelet[3248]: W0307 00:56:37.100743 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.100888 kubelet[3248]: E0307 00:56:37.100774 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:37.105570 kubelet[3248]: E0307 00:56:37.105159 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.105570 kubelet[3248]: W0307 00:56:37.105244 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.105570 kubelet[3248]: E0307 00:56:37.105297 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:37.107382 kubelet[3248]: E0307 00:56:37.105806 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:37.107382 kubelet[3248]: W0307 00:56:37.105830 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:37.107382 kubelet[3248]: E0307 00:56:37.105854 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:37.776934 kubelet[3248]: E0307 00:56:37.776846 3248 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6p9l4" podUID="e10fee9c-8cdc-4998-ad44-21b04777940b" Mar 7 00:56:37.945162 kubelet[3248]: I0307 00:56:37.945119 3248 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 7 00:56:37.947636 containerd[2014]: time="2026-03-07T00:56:37.947165691Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:37.949829 containerd[2014]: time="2026-03-07T00:56:37.949578351Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4457682" Mar 7 00:56:37.953230 containerd[2014]: time="2026-03-07T00:56:37.951903987Z" level=info msg="ImageCreate event name:\"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:37.958713 containerd[2014]: time="2026-03-07T00:56:37.957655875Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:37.960903 containerd[2014]: time="2026-03-07T00:56:37.960828231Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size 
\"5855167\" in 1.236406782s" Mar 7 00:56:37.960903 containerd[2014]: time="2026-03-07T00:56:37.960893451Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\"" Mar 7 00:56:37.969290 containerd[2014]: time="2026-03-07T00:56:37.969216231Z" level=info msg="CreateContainer within sandbox \"988047ffb9d5e0c217ff47171e5407c408bd8081e9671fef6220389325132c47\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 7 00:56:37.998580 containerd[2014]: time="2026-03-07T00:56:37.998446107Z" level=info msg="CreateContainer within sandbox \"988047ffb9d5e0c217ff47171e5407c408bd8081e9671fef6220389325132c47\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"ea84a4cb5b1756591bb91727a1d4f55a9055e11035d456e864f23e071d572ebb\"" Mar 7 00:56:38.000333 containerd[2014]: time="2026-03-07T00:56:38.000262679Z" level=info msg="StartContainer for \"ea84a4cb5b1756591bb91727a1d4f55a9055e11035d456e864f23e071d572ebb\"" Mar 7 00:56:38.035626 kubelet[3248]: E0307 00:56:38.035477 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.035626 kubelet[3248]: W0307 00:56:38.035520 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.035626 kubelet[3248]: E0307 00:56:38.035579 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:38.037778 kubelet[3248]: E0307 00:56:38.037724 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.037943 kubelet[3248]: W0307 00:56:38.037765 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.037943 kubelet[3248]: E0307 00:56:38.037886 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:38.038672 kubelet[3248]: E0307 00:56:38.038604 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.038672 kubelet[3248]: W0307 00:56:38.038642 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.038830 kubelet[3248]: E0307 00:56:38.038696 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:38.039309 kubelet[3248]: E0307 00:56:38.039243 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.039309 kubelet[3248]: W0307 00:56:38.039278 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.039585 kubelet[3248]: E0307 00:56:38.039330 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:38.040735 kubelet[3248]: E0307 00:56:38.040694 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.040883 kubelet[3248]: W0307 00:56:38.040731 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.040883 kubelet[3248]: E0307 00:56:38.040824 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:38.043575 kubelet[3248]: E0307 00:56:38.043370 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.043575 kubelet[3248]: W0307 00:56:38.043432 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.043575 kubelet[3248]: E0307 00:56:38.043469 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:38.044276 kubelet[3248]: E0307 00:56:38.043985 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.044276 kubelet[3248]: W0307 00:56:38.044018 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.044276 kubelet[3248]: E0307 00:56:38.044045 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:38.044862 kubelet[3248]: E0307 00:56:38.044480 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.044862 kubelet[3248]: W0307 00:56:38.044499 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.044862 kubelet[3248]: E0307 00:56:38.044521 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:38.045042 kubelet[3248]: E0307 00:56:38.044932 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.045042 kubelet[3248]: W0307 00:56:38.044949 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.045042 kubelet[3248]: E0307 00:56:38.044970 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:38.045578 kubelet[3248]: E0307 00:56:38.045291 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.045578 kubelet[3248]: W0307 00:56:38.045318 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.045578 kubelet[3248]: E0307 00:56:38.045355 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:38.045937 kubelet[3248]: E0307 00:56:38.045716 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.045937 kubelet[3248]: W0307 00:56:38.045732 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.045937 kubelet[3248]: E0307 00:56:38.045752 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:38.046641 kubelet[3248]: E0307 00:56:38.046166 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.046641 kubelet[3248]: W0307 00:56:38.046225 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.046641 kubelet[3248]: E0307 00:56:38.046275 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:38.047462 kubelet[3248]: E0307 00:56:38.046749 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.047462 kubelet[3248]: W0307 00:56:38.046767 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.047462 kubelet[3248]: E0307 00:56:38.046792 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:38.047462 kubelet[3248]: E0307 00:56:38.047218 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.047462 kubelet[3248]: W0307 00:56:38.047236 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.047462 kubelet[3248]: E0307 00:56:38.047255 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:38.048356 kubelet[3248]: E0307 00:56:38.047633 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.048356 kubelet[3248]: W0307 00:56:38.047651 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.048356 kubelet[3248]: E0307 00:56:38.047671 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:38.073110 systemd[1]: Started cri-containerd-ea84a4cb5b1756591bb91727a1d4f55a9055e11035d456e864f23e071d572ebb.scope - libcontainer container ea84a4cb5b1756591bb91727a1d4f55a9055e11035d456e864f23e071d572ebb. 
Mar 7 00:56:38.099781 kubelet[3248]: E0307 00:56:38.099706 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.099781 kubelet[3248]: W0307 00:56:38.099767 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.099987 kubelet[3248]: E0307 00:56:38.099821 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:38.100935 kubelet[3248]: E0307 00:56:38.100434 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.100935 kubelet[3248]: W0307 00:56:38.100463 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.100935 kubelet[3248]: E0307 00:56:38.100511 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:38.101140 kubelet[3248]: E0307 00:56:38.101019 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.101140 kubelet[3248]: W0307 00:56:38.101061 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.101140 kubelet[3248]: E0307 00:56:38.101086 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:38.101595 kubelet[3248]: E0307 00:56:38.101557 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.101595 kubelet[3248]: W0307 00:56:38.101586 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.101708 kubelet[3248]: E0307 00:56:38.101609 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:38.102235 kubelet[3248]: E0307 00:56:38.102080 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.102235 kubelet[3248]: W0307 00:56:38.102109 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.102235 kubelet[3248]: E0307 00:56:38.102132 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:38.103288 kubelet[3248]: E0307 00:56:38.102662 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.103288 kubelet[3248]: W0307 00:56:38.102717 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.103288 kubelet[3248]: E0307 00:56:38.102802 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:38.103498 kubelet[3248]: E0307 00:56:38.103472 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.103498 kubelet[3248]: W0307 00:56:38.103492 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.103654 kubelet[3248]: E0307 00:56:38.103517 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:38.104063 kubelet[3248]: E0307 00:56:38.104023 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.104063 kubelet[3248]: W0307 00:56:38.104053 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.104247 kubelet[3248]: E0307 00:56:38.104077 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:38.105258 kubelet[3248]: E0307 00:56:38.105155 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.105258 kubelet[3248]: W0307 00:56:38.105222 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.105258 kubelet[3248]: E0307 00:56:38.105255 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:38.106779 kubelet[3248]: E0307 00:56:38.106250 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.106779 kubelet[3248]: W0307 00:56:38.106433 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.106779 kubelet[3248]: E0307 00:56:38.106509 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:38.107519 kubelet[3248]: E0307 00:56:38.107179 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.107519 kubelet[3248]: W0307 00:56:38.107242 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.107519 kubelet[3248]: E0307 00:56:38.107270 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:38.108151 kubelet[3248]: E0307 00:56:38.107851 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.108151 kubelet[3248]: W0307 00:56:38.107885 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.108151 kubelet[3248]: E0307 00:56:38.107945 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:38.109286 kubelet[3248]: E0307 00:56:38.109242 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.109286 kubelet[3248]: W0307 00:56:38.109277 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.109472 kubelet[3248]: E0307 00:56:38.109308 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:38.110263 kubelet[3248]: E0307 00:56:38.109916 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.110263 kubelet[3248]: W0307 00:56:38.109968 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.110263 kubelet[3248]: E0307 00:56:38.109992 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:38.111569 kubelet[3248]: E0307 00:56:38.111519 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.111698 kubelet[3248]: W0307 00:56:38.111610 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.111698 kubelet[3248]: E0307 00:56:38.111650 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:38.113267 kubelet[3248]: E0307 00:56:38.112790 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.113267 kubelet[3248]: W0307 00:56:38.112879 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.113267 kubelet[3248]: E0307 00:56:38.112911 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:38.113922 kubelet[3248]: E0307 00:56:38.113848 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.113922 kubelet[3248]: W0307 00:56:38.113911 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.114084 kubelet[3248]: E0307 00:56:38.113981 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:56:38.115759 kubelet[3248]: E0307 00:56:38.115494 3248 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:56:38.115759 kubelet[3248]: W0307 00:56:38.115533 3248 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:56:38.115759 kubelet[3248]: E0307 00:56:38.115590 3248 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:56:38.155861 containerd[2014]: time="2026-03-07T00:56:38.155808036Z" level=info msg="StartContainer for \"ea84a4cb5b1756591bb91727a1d4f55a9055e11035d456e864f23e071d572ebb\" returns successfully" Mar 7 00:56:38.190949 systemd[1]: cri-containerd-ea84a4cb5b1756591bb91727a1d4f55a9055e11035d456e864f23e071d572ebb.scope: Deactivated successfully. Mar 7 00:56:38.682983 containerd[2014]: time="2026-03-07T00:56:38.682881387Z" level=info msg="shim disconnected" id=ea84a4cb5b1756591bb91727a1d4f55a9055e11035d456e864f23e071d572ebb namespace=k8s.io Mar 7 00:56:38.683249 containerd[2014]: time="2026-03-07T00:56:38.683000679Z" level=warning msg="cleaning up after shim disconnected" id=ea84a4cb5b1756591bb91727a1d4f55a9055e11035d456e864f23e071d572ebb namespace=k8s.io Mar 7 00:56:38.683249 containerd[2014]: time="2026-03-07T00:56:38.683023707Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 00:56:38.736288 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ea84a4cb5b1756591bb91727a1d4f55a9055e11035d456e864f23e071d572ebb-rootfs.mount: Deactivated successfully. 
Mar 7 00:56:38.952132 containerd[2014]: time="2026-03-07T00:56:38.951688264Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 7 00:56:39.781294 kubelet[3248]: E0307 00:56:39.780781 3248 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6p9l4" podUID="e10fee9c-8cdc-4998-ad44-21b04777940b" Mar 7 00:56:41.778939 kubelet[3248]: E0307 00:56:41.778861 3248 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6p9l4" podUID="e10fee9c-8cdc-4998-ad44-21b04777940b" Mar 7 00:56:43.778571 kubelet[3248]: E0307 00:56:43.777867 3248 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6p9l4" podUID="e10fee9c-8cdc-4998-ad44-21b04777940b" Mar 7 00:56:45.302630 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1495680882.mount: Deactivated successfully. 
Mar 7 00:56:45.367260 containerd[2014]: time="2026-03-07T00:56:45.366872996Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:45.369165 containerd[2014]: time="2026-03-07T00:56:45.368893592Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=153921674" Mar 7 00:56:45.372461 containerd[2014]: time="2026-03-07T00:56:45.371097920Z" level=info msg="ImageCreate event name:\"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:45.376313 containerd[2014]: time="2026-03-07T00:56:45.376239620Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:45.378090 containerd[2014]: time="2026-03-07T00:56:45.377908472Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"153921536\" in 6.426162752s" Mar 7 00:56:45.378090 containerd[2014]: time="2026-03-07T00:56:45.377964404Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\"" Mar 7 00:56:45.388183 containerd[2014]: time="2026-03-07T00:56:45.388059272Z" level=info msg="CreateContainer within sandbox \"988047ffb9d5e0c217ff47171e5407c408bd8081e9671fef6220389325132c47\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 7 00:56:45.421875 containerd[2014]: time="2026-03-07T00:56:45.421622852Z" level=info msg="CreateContainer 
within sandbox \"988047ffb9d5e0c217ff47171e5407c408bd8081e9671fef6220389325132c47\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"ae15179eed4b698e4f8aafb1dd2253a3c588eb11cc107682e454367de5521431\"" Mar 7 00:56:45.425295 containerd[2014]: time="2026-03-07T00:56:45.424386236Z" level=info msg="StartContainer for \"ae15179eed4b698e4f8aafb1dd2253a3c588eb11cc107682e454367de5521431\"" Mar 7 00:56:45.483576 systemd[1]: Started cri-containerd-ae15179eed4b698e4f8aafb1dd2253a3c588eb11cc107682e454367de5521431.scope - libcontainer container ae15179eed4b698e4f8aafb1dd2253a3c588eb11cc107682e454367de5521431. Mar 7 00:56:45.537371 containerd[2014]: time="2026-03-07T00:56:45.537313797Z" level=info msg="StartContainer for \"ae15179eed4b698e4f8aafb1dd2253a3c588eb11cc107682e454367de5521431\" returns successfully" Mar 7 00:56:45.743771 systemd[1]: cri-containerd-ae15179eed4b698e4f8aafb1dd2253a3c588eb11cc107682e454367de5521431.scope: Deactivated successfully. Mar 7 00:56:45.786698 kubelet[3248]: E0307 00:56:45.784796 3248 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6p9l4" podUID="e10fee9c-8cdc-4998-ad44-21b04777940b" Mar 7 00:56:46.301644 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ae15179eed4b698e4f8aafb1dd2253a3c588eb11cc107682e454367de5521431-rootfs.mount: Deactivated successfully. 
Mar 7 00:56:46.308688 containerd[2014]: time="2026-03-07T00:56:46.308482209Z" level=info msg="shim disconnected" id=ae15179eed4b698e4f8aafb1dd2253a3c588eb11cc107682e454367de5521431 namespace=k8s.io Mar 7 00:56:46.308688 containerd[2014]: time="2026-03-07T00:56:46.308618361Z" level=warning msg="cleaning up after shim disconnected" id=ae15179eed4b698e4f8aafb1dd2253a3c588eb11cc107682e454367de5521431 namespace=k8s.io Mar 7 00:56:46.308688 containerd[2014]: time="2026-03-07T00:56:46.308647389Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 00:56:46.329262 containerd[2014]: time="2026-03-07T00:56:46.328946397Z" level=warning msg="cleanup warnings time=\"2026-03-07T00:56:46Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Mar 7 00:56:46.983298 containerd[2014]: time="2026-03-07T00:56:46.982895556Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Mar 7 00:56:47.778415 kubelet[3248]: E0307 00:56:47.777333 3248 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6p9l4" podUID="e10fee9c-8cdc-4998-ad44-21b04777940b" Mar 7 00:56:49.776359 kubelet[3248]: E0307 00:56:49.776285 3248 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6p9l4" podUID="e10fee9c-8cdc-4998-ad44-21b04777940b" Mar 7 00:56:49.983229 containerd[2014]: time="2026-03-07T00:56:49.981694671Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 
00:56:49.984006 containerd[2014]: time="2026-03-07T00:56:49.983960163Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=66009216" Mar 7 00:56:49.986078 containerd[2014]: time="2026-03-07T00:56:49.986036271Z" level=info msg="ImageCreate event name:\"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:49.992862 containerd[2014]: time="2026-03-07T00:56:49.992794707Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:49.996490 containerd[2014]: time="2026-03-07T00:56:49.995517951Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"67406741\" in 3.012541179s" Mar 7 00:56:49.996490 containerd[2014]: time="2026-03-07T00:56:49.995602083Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\"" Mar 7 00:56:50.006027 containerd[2014]: time="2026-03-07T00:56:50.005951615Z" level=info msg="CreateContainer within sandbox \"988047ffb9d5e0c217ff47171e5407c408bd8081e9671fef6220389325132c47\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 7 00:56:50.032739 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount716383844.mount: Deactivated successfully. 
Mar 7 00:56:50.040315 containerd[2014]: time="2026-03-07T00:56:50.040242455Z" level=info msg="CreateContainer within sandbox \"988047ffb9d5e0c217ff47171e5407c408bd8081e9671fef6220389325132c47\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"f801a6e02b9c3a3bc35830a5ff012f714dd4aab47bcd5a744ce670ac41edf916\"" Mar 7 00:56:50.043455 containerd[2014]: time="2026-03-07T00:56:50.043159667Z" level=info msg="StartContainer for \"f801a6e02b9c3a3bc35830a5ff012f714dd4aab47bcd5a744ce670ac41edf916\"" Mar 7 00:56:50.106581 systemd[1]: run-containerd-runc-k8s.io-f801a6e02b9c3a3bc35830a5ff012f714dd4aab47bcd5a744ce670ac41edf916-runc.87t2YA.mount: Deactivated successfully. Mar 7 00:56:50.119514 systemd[1]: Started cri-containerd-f801a6e02b9c3a3bc35830a5ff012f714dd4aab47bcd5a744ce670ac41edf916.scope - libcontainer container f801a6e02b9c3a3bc35830a5ff012f714dd4aab47bcd5a744ce670ac41edf916. Mar 7 00:56:50.177529 containerd[2014]: time="2026-03-07T00:56:50.177263400Z" level=info msg="StartContainer for \"f801a6e02b9c3a3bc35830a5ff012f714dd4aab47bcd5a744ce670ac41edf916\" returns successfully" Mar 7 00:56:51.657949 kubelet[3248]: I0307 00:56:51.657783 3248 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 7 00:56:51.778311 kubelet[3248]: E0307 00:56:51.776062 3248 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6p9l4" podUID="e10fee9c-8cdc-4998-ad44-21b04777940b" Mar 7 00:56:52.044431 containerd[2014]: time="2026-03-07T00:56:52.043875877Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 7 00:56:52.047976 
systemd[1]: cri-containerd-f801a6e02b9c3a3bc35830a5ff012f714dd4aab47bcd5a744ce670ac41edf916.scope: Deactivated successfully. Mar 7 00:56:52.048988 systemd[1]: cri-containerd-f801a6e02b9c3a3bc35830a5ff012f714dd4aab47bcd5a744ce670ac41edf916.scope: Consumed 1.021s CPU time. Mar 7 00:56:52.069743 kubelet[3248]: I0307 00:56:52.068848 3248 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Mar 7 00:56:52.139825 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f801a6e02b9c3a3bc35830a5ff012f714dd4aab47bcd5a744ce670ac41edf916-rootfs.mount: Deactivated successfully. Mar 7 00:56:52.187455 systemd[1]: Created slice kubepods-burstable-podd7c61432_fbce_407b_bac9_6d0016af3dc6.slice - libcontainer container kubepods-burstable-podd7c61432_fbce_407b_bac9_6d0016af3dc6.slice. Mar 7 00:56:52.199186 containerd[2014]: time="2026-03-07T00:56:52.198735626Z" level=info msg="shim disconnected" id=f801a6e02b9c3a3bc35830a5ff012f714dd4aab47bcd5a744ce670ac41edf916 namespace=k8s.io Mar 7 00:56:52.199186 containerd[2014]: time="2026-03-07T00:56:52.198833978Z" level=warning msg="cleaning up after shim disconnected" id=f801a6e02b9c3a3bc35830a5ff012f714dd4aab47bcd5a744ce670ac41edf916 namespace=k8s.io Mar 7 00:56:52.199186 containerd[2014]: time="2026-03-07T00:56:52.198855002Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 00:56:52.211660 systemd[1]: Created slice kubepods-besteffort-pod2f36088b_c8bd_4e60_bd21_33bf72d4d138.slice - libcontainer container kubepods-besteffort-pod2f36088b_c8bd_4e60_bd21_33bf72d4d138.slice. 
Mar 7 00:56:52.215134 kubelet[3248]: I0307 00:56:52.214973 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2f36088b-c8bd-4e60-bd21-33bf72d4d138-whisker-backend-key-pair\") pod \"whisker-6b786f9cf8-kdqrt\" (UID: \"2f36088b-c8bd-4e60-bd21-33bf72d4d138\") " pod="calico-system/whisker-6b786f9cf8-kdqrt" Mar 7 00:56:52.215134 kubelet[3248]: I0307 00:56:52.215059 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxl6k\" (UniqueName: \"kubernetes.io/projected/d7c61432-fbce-407b-bac9-6d0016af3dc6-kube-api-access-gxl6k\") pod \"coredns-674b8bbfcf-cxswj\" (UID: \"d7c61432-fbce-407b-bac9-6d0016af3dc6\") " pod="kube-system/coredns-674b8bbfcf-cxswj" Mar 7 00:56:52.215134 kubelet[3248]: I0307 00:56:52.215107 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/2f36088b-c8bd-4e60-bd21-33bf72d4d138-nginx-config\") pod \"whisker-6b786f9cf8-kdqrt\" (UID: \"2f36088b-c8bd-4e60-bd21-33bf72d4d138\") " pod="calico-system/whisker-6b786f9cf8-kdqrt" Mar 7 00:56:52.217590 kubelet[3248]: I0307 00:56:52.215144 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f36088b-c8bd-4e60-bd21-33bf72d4d138-whisker-ca-bundle\") pod \"whisker-6b786f9cf8-kdqrt\" (UID: \"2f36088b-c8bd-4e60-bd21-33bf72d4d138\") " pod="calico-system/whisker-6b786f9cf8-kdqrt" Mar 7 00:56:52.217590 kubelet[3248]: I0307 00:56:52.215211 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7c61432-fbce-407b-bac9-6d0016af3dc6-config-volume\") pod \"coredns-674b8bbfcf-cxswj\" (UID: \"d7c61432-fbce-407b-bac9-6d0016af3dc6\") " 
pod="kube-system/coredns-674b8bbfcf-cxswj" Mar 7 00:56:52.217590 kubelet[3248]: I0307 00:56:52.215269 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dpws\" (UniqueName: \"kubernetes.io/projected/2f36088b-c8bd-4e60-bd21-33bf72d4d138-kube-api-access-2dpws\") pod \"whisker-6b786f9cf8-kdqrt\" (UID: \"2f36088b-c8bd-4e60-bd21-33bf72d4d138\") " pod="calico-system/whisker-6b786f9cf8-kdqrt" Mar 7 00:56:52.240920 systemd[1]: Created slice kubepods-besteffort-pod2a289909_3bfe_4f98_9e7f_8f97be86ab84.slice - libcontainer container kubepods-besteffort-pod2a289909_3bfe_4f98_9e7f_8f97be86ab84.slice. Mar 7 00:56:52.265498 systemd[1]: Created slice kubepods-besteffort-pod4134feb0_74d5_456d_b147_4d748b45b820.slice - libcontainer container kubepods-besteffort-pod4134feb0_74d5_456d_b147_4d748b45b820.slice. Mar 7 00:56:52.292515 systemd[1]: Created slice kubepods-besteffort-podd29570d8_449a_4a48_af9f_3d0d19534b53.slice - libcontainer container kubepods-besteffort-podd29570d8_449a_4a48_af9f_3d0d19534b53.slice. Mar 7 00:56:52.308040 systemd[1]: Created slice kubepods-burstable-pod293cb0c8_aea0_4a14_b36c_9f3b77cd23c5.slice - libcontainer container kubepods-burstable-pod293cb0c8_aea0_4a14_b36c_9f3b77cd23c5.slice. 
Mar 7 00:56:52.315973 kubelet[3248]: I0307 00:56:52.315857 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4134feb0-74d5-456d-b147-4d748b45b820-calico-apiserver-certs\") pod \"calico-apiserver-5f7f748cc6-267tw\" (UID: \"4134feb0-74d5-456d-b147-4d748b45b820\") " pod="calico-system/calico-apiserver-5f7f748cc6-267tw" Mar 7 00:56:52.316547 kubelet[3248]: I0307 00:56:52.316353 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2jkx\" (UniqueName: \"kubernetes.io/projected/4134feb0-74d5-456d-b147-4d748b45b820-kube-api-access-d2jkx\") pod \"calico-apiserver-5f7f748cc6-267tw\" (UID: \"4134feb0-74d5-456d-b147-4d748b45b820\") " pod="calico-system/calico-apiserver-5f7f748cc6-267tw" Mar 7 00:56:52.318728 kubelet[3248]: I0307 00:56:52.318459 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkc7b\" (UniqueName: \"kubernetes.io/projected/293cb0c8-aea0-4a14-b36c-9f3b77cd23c5-kube-api-access-xkc7b\") pod \"coredns-674b8bbfcf-6s4mf\" (UID: \"293cb0c8-aea0-4a14-b36c-9f3b77cd23c5\") " pod="kube-system/coredns-674b8bbfcf-6s4mf" Mar 7 00:56:52.319013 kubelet[3248]: I0307 00:56:52.318780 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft7tf\" (UniqueName: \"kubernetes.io/projected/d29570d8-449a-4a48-af9f-3d0d19534b53-kube-api-access-ft7tf\") pod \"calico-kube-controllers-69f5fcbdf5-swnh8\" (UID: \"d29570d8-449a-4a48-af9f-3d0d19534b53\") " pod="calico-system/calico-kube-controllers-69f5fcbdf5-swnh8" Mar 7 00:56:52.319292 kubelet[3248]: I0307 00:56:52.319109 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: 
\"kubernetes.io/secret/2a289909-3bfe-4f98-9e7f-8f97be86ab84-calico-apiserver-certs\") pod \"calico-apiserver-5f7f748cc6-fd6fv\" (UID: \"2a289909-3bfe-4f98-9e7f-8f97be86ab84\") " pod="calico-system/calico-apiserver-5f7f748cc6-fd6fv" Mar 7 00:56:52.319292 kubelet[3248]: I0307 00:56:52.319214 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d29570d8-449a-4a48-af9f-3d0d19534b53-tigera-ca-bundle\") pod \"calico-kube-controllers-69f5fcbdf5-swnh8\" (UID: \"d29570d8-449a-4a48-af9f-3d0d19534b53\") " pod="calico-system/calico-kube-controllers-69f5fcbdf5-swnh8" Mar 7 00:56:52.319292 kubelet[3248]: I0307 00:56:52.319257 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab0a4538-d7e9-47a6-8763-f14115b5894f-config\") pod \"goldmane-5b85766d88-nbqjk\" (UID: \"ab0a4538-d7e9-47a6-8763-f14115b5894f\") " pod="calico-system/goldmane-5b85766d88-nbqjk" Mar 7 00:56:52.319770 kubelet[3248]: I0307 00:56:52.319571 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/293cb0c8-aea0-4a14-b36c-9f3b77cd23c5-config-volume\") pod \"coredns-674b8bbfcf-6s4mf\" (UID: \"293cb0c8-aea0-4a14-b36c-9f3b77cd23c5\") " pod="kube-system/coredns-674b8bbfcf-6s4mf" Mar 7 00:56:52.319770 kubelet[3248]: I0307 00:56:52.319655 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm459\" (UniqueName: \"kubernetes.io/projected/2a289909-3bfe-4f98-9e7f-8f97be86ab84-kube-api-access-cm459\") pod \"calico-apiserver-5f7f748cc6-fd6fv\" (UID: \"2a289909-3bfe-4f98-9e7f-8f97be86ab84\") " pod="calico-system/calico-apiserver-5f7f748cc6-fd6fv" Mar 7 00:56:52.319770 kubelet[3248]: I0307 00:56:52.319726 3248 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br77k\" (UniqueName: \"kubernetes.io/projected/ab0a4538-d7e9-47a6-8763-f14115b5894f-kube-api-access-br77k\") pod \"goldmane-5b85766d88-nbqjk\" (UID: \"ab0a4538-d7e9-47a6-8763-f14115b5894f\") " pod="calico-system/goldmane-5b85766d88-nbqjk" Mar 7 00:56:52.320172 kubelet[3248]: I0307 00:56:52.320021 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab0a4538-d7e9-47a6-8763-f14115b5894f-goldmane-ca-bundle\") pod \"goldmane-5b85766d88-nbqjk\" (UID: \"ab0a4538-d7e9-47a6-8763-f14115b5894f\") " pod="calico-system/goldmane-5b85766d88-nbqjk" Mar 7 00:56:52.320172 kubelet[3248]: I0307 00:56:52.320103 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/ab0a4538-d7e9-47a6-8763-f14115b5894f-goldmane-key-pair\") pod \"goldmane-5b85766d88-nbqjk\" (UID: \"ab0a4538-d7e9-47a6-8763-f14115b5894f\") " pod="calico-system/goldmane-5b85766d88-nbqjk" Mar 7 00:56:52.330237 systemd[1]: Created slice kubepods-besteffort-podab0a4538_d7e9_47a6_8763_f14115b5894f.slice - libcontainer container kubepods-besteffort-podab0a4538_d7e9_47a6_8763_f14115b5894f.slice. 
Mar 7 00:56:52.532267 containerd[2014]: time="2026-03-07T00:56:52.531568672Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-cxswj,Uid:d7c61432-fbce-407b-bac9-6d0016af3dc6,Namespace:kube-system,Attempt:0,}" Mar 7 00:56:52.537695 containerd[2014]: time="2026-03-07T00:56:52.537641248Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6b786f9cf8-kdqrt,Uid:2f36088b-c8bd-4e60-bd21-33bf72d4d138,Namespace:calico-system,Attempt:0,}" Mar 7 00:56:52.561348 containerd[2014]: time="2026-03-07T00:56:52.560777356Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f7f748cc6-fd6fv,Uid:2a289909-3bfe-4f98-9e7f-8f97be86ab84,Namespace:calico-system,Attempt:0,}" Mar 7 00:56:52.598858 containerd[2014]: time="2026-03-07T00:56:52.598439476Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f7f748cc6-267tw,Uid:4134feb0-74d5-456d-b147-4d748b45b820,Namespace:calico-system,Attempt:0,}" Mar 7 00:56:52.611308 containerd[2014]: time="2026-03-07T00:56:52.610963264Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69f5fcbdf5-swnh8,Uid:d29570d8-449a-4a48-af9f-3d0d19534b53,Namespace:calico-system,Attempt:0,}" Mar 7 00:56:52.633088 containerd[2014]: time="2026-03-07T00:56:52.631616704Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-6s4mf,Uid:293cb0c8-aea0-4a14-b36c-9f3b77cd23c5,Namespace:kube-system,Attempt:0,}" Mar 7 00:56:52.646672 containerd[2014]: time="2026-03-07T00:56:52.646591852Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-nbqjk,Uid:ab0a4538-d7e9-47a6-8763-f14115b5894f,Namespace:calico-system,Attempt:0,}" Mar 7 00:56:53.084680 containerd[2014]: time="2026-03-07T00:56:53.084569570Z" level=info msg="CreateContainer within sandbox \"988047ffb9d5e0c217ff47171e5407c408bd8081e9671fef6220389325132c47\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 7 00:56:53.098708 
containerd[2014]: time="2026-03-07T00:56:53.098493698Z" level=error msg="Failed to destroy network for sandbox \"10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.101727 containerd[2014]: time="2026-03-07T00:56:53.101634806Z" level=error msg="encountered an error cleaning up failed sandbox \"10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.101892 containerd[2014]: time="2026-03-07T00:56:53.101752502Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f7f748cc6-fd6fv,Uid:2a289909-3bfe-4f98-9e7f-8f97be86ab84,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.102123 kubelet[3248]: E0307 00:56:53.102044 3248 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.106090 kubelet[3248]: E0307 00:56:53.102156 3248 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-5f7f748cc6-fd6fv" Mar 7 00:56:53.106090 kubelet[3248]: E0307 00:56:53.102212 3248 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-5f7f748cc6-fd6fv" Mar 7 00:56:53.106090 kubelet[3248]: E0307 00:56:53.102311 3248 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5f7f748cc6-fd6fv_calico-system(2a289909-3bfe-4f98-9e7f-8f97be86ab84)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5f7f748cc6-fd6fv_calico-system(2a289909-3bfe-4f98-9e7f-8f97be86ab84)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-5f7f748cc6-fd6fv" podUID="2a289909-3bfe-4f98-9e7f-8f97be86ab84" Mar 7 00:56:53.246270 containerd[2014]: time="2026-03-07T00:56:53.246183339Z" level=info msg="CreateContainer within sandbox \"988047ffb9d5e0c217ff47171e5407c408bd8081e9671fef6220389325132c47\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"1fe17442355ceb5173451e0735ae890d944018c871c49d6d9d38ba4452d0c9ae\"" Mar 7 00:56:53.251952 
containerd[2014]: time="2026-03-07T00:56:53.251892195Z" level=info msg="StartContainer for \"1fe17442355ceb5173451e0735ae890d944018c871c49d6d9d38ba4452d0c9ae\"" Mar 7 00:56:53.256609 containerd[2014]: time="2026-03-07T00:56:53.253853691Z" level=error msg="Failed to destroy network for sandbox \"2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.257796 containerd[2014]: time="2026-03-07T00:56:53.257716263Z" level=error msg="encountered an error cleaning up failed sandbox \"2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.258458 containerd[2014]: time="2026-03-07T00:56:53.258381219Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-cxswj,Uid:d7c61432-fbce-407b-bac9-6d0016af3dc6,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.262484 kubelet[3248]: E0307 00:56:53.262408 3248 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.262633 
kubelet[3248]: E0307 00:56:53.262502 3248 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-cxswj" Mar 7 00:56:53.262633 kubelet[3248]: E0307 00:56:53.262538 3248 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-cxswj" Mar 7 00:56:53.265921 kubelet[3248]: E0307 00:56:53.262614 3248 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-cxswj_kube-system(d7c61432-fbce-407b-bac9-6d0016af3dc6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-cxswj_kube-system(d7c61432-fbce-407b-bac9-6d0016af3dc6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-cxswj" podUID="d7c61432-fbce-407b-bac9-6d0016af3dc6" Mar 7 00:56:53.270451 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597-shm.mount: Deactivated successfully. 
Mar 7 00:56:53.294412 containerd[2014]: time="2026-03-07T00:56:53.293914659Z" level=error msg="Failed to destroy network for sandbox \"aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.301894 containerd[2014]: time="2026-03-07T00:56:53.301709907Z" level=error msg="encountered an error cleaning up failed sandbox \"aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.301894 containerd[2014]: time="2026-03-07T00:56:53.301812771Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6b786f9cf8-kdqrt,Uid:2f36088b-c8bd-4e60-bd21-33bf72d4d138,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.302510 containerd[2014]: time="2026-03-07T00:56:53.302241987Z" level=error msg="Failed to destroy network for sandbox \"0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.304735 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5-shm.mount: Deactivated successfully. 
Mar 7 00:56:53.308483 kubelet[3248]: E0307 00:56:53.308396 3248 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.308660 kubelet[3248]: E0307 00:56:53.308491 3248 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6b786f9cf8-kdqrt" Mar 7 00:56:53.308660 kubelet[3248]: E0307 00:56:53.308526 3248 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6b786f9cf8-kdqrt" Mar 7 00:56:53.308777 kubelet[3248]: E0307 00:56:53.308670 3248 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6b786f9cf8-kdqrt_calico-system(2f36088b-c8bd-4e60-bd21-33bf72d4d138)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6b786f9cf8-kdqrt_calico-system(2f36088b-c8bd-4e60-bd21-33bf72d4d138)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6b786f9cf8-kdqrt" podUID="2f36088b-c8bd-4e60-bd21-33bf72d4d138" Mar 7 00:56:53.313730 containerd[2014]: time="2026-03-07T00:56:53.313141911Z" level=error msg="encountered an error cleaning up failed sandbox \"0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.316707 containerd[2014]: time="2026-03-07T00:56:53.316592235Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69f5fcbdf5-swnh8,Uid:d29570d8-449a-4a48-af9f-3d0d19534b53,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.317530 kubelet[3248]: E0307 00:56:53.317478 3248 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.317738 kubelet[3248]: E0307 00:56:53.317706 3248 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69f5fcbdf5-swnh8" Mar 7 00:56:53.319176 kubelet[3248]: E0307 00:56:53.317859 3248 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69f5fcbdf5-swnh8" Mar 7 00:56:53.319176 kubelet[3248]: E0307 00:56:53.317951 3248 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-69f5fcbdf5-swnh8_calico-system(d29570d8-449a-4a48-af9f-3d0d19534b53)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-69f5fcbdf5-swnh8_calico-system(d29570d8-449a-4a48-af9f-3d0d19534b53)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-69f5fcbdf5-swnh8" podUID="d29570d8-449a-4a48-af9f-3d0d19534b53" Mar 7 00:56:53.321291 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0-shm.mount: Deactivated successfully. 
Mar 7 00:56:53.350620 containerd[2014]: time="2026-03-07T00:56:53.349471960Z" level=error msg="Failed to destroy network for sandbox \"e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.350620 containerd[2014]: time="2026-03-07T00:56:53.350353576Z" level=error msg="encountered an error cleaning up failed sandbox \"e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.350620 containerd[2014]: time="2026-03-07T00:56:53.350455732Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-6s4mf,Uid:293cb0c8-aea0-4a14-b36c-9f3b77cd23c5,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.351281 kubelet[3248]: E0307 00:56:53.351064 3248 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.353173 kubelet[3248]: E0307 00:56:53.353114 3248 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-6s4mf" Mar 7 00:56:53.353857 kubelet[3248]: E0307 00:56:53.353361 3248 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-6s4mf" Mar 7 00:56:53.353857 kubelet[3248]: E0307 00:56:53.353462 3248 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-6s4mf_kube-system(293cb0c8-aea0-4a14-b36c-9f3b77cd23c5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-6s4mf_kube-system(293cb0c8-aea0-4a14-b36c-9f3b77cd23c5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-6s4mf" podUID="293cb0c8-aea0-4a14-b36c-9f3b77cd23c5" Mar 7 00:56:53.361831 containerd[2014]: time="2026-03-07T00:56:53.361632880Z" level=error msg="Failed to destroy network for sandbox \"220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 
00:56:53.363104 containerd[2014]: time="2026-03-07T00:56:53.362846812Z" level=error msg="encountered an error cleaning up failed sandbox \"220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.363104 containerd[2014]: time="2026-03-07T00:56:53.362953924Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-nbqjk,Uid:ab0a4538-d7e9-47a6-8763-f14115b5894f,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.363432 kubelet[3248]: E0307 00:56:53.363275 3248 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.363432 kubelet[3248]: E0307 00:56:53.363413 3248 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-5b85766d88-nbqjk" Mar 7 00:56:53.363595 kubelet[3248]: E0307 00:56:53.363474 3248 
kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-5b85766d88-nbqjk" Mar 7 00:56:53.363662 kubelet[3248]: E0307 00:56:53.363584 3248 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-5b85766d88-nbqjk_calico-system(ab0a4538-d7e9-47a6-8763-f14115b5894f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-5b85766d88-nbqjk_calico-system(ab0a4538-d7e9-47a6-8763-f14115b5894f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-5b85766d88-nbqjk" podUID="ab0a4538-d7e9-47a6-8763-f14115b5894f" Mar 7 00:56:53.402794 containerd[2014]: time="2026-03-07T00:56:53.402736864Z" level=error msg="Failed to destroy network for sandbox \"8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.403883 containerd[2014]: time="2026-03-07T00:56:53.403628428Z" level=error msg="encountered an error cleaning up failed sandbox \"8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.403883 containerd[2014]: time="2026-03-07T00:56:53.403714012Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f7f748cc6-267tw,Uid:4134feb0-74d5-456d-b147-4d748b45b820,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.405298 kubelet[3248]: E0307 00:56:53.404289 3248 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:56:53.405298 kubelet[3248]: E0307 00:56:53.404373 3248 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-5f7f748cc6-267tw" Mar 7 00:56:53.405298 kubelet[3248]: E0307 00:56:53.404409 3248 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-system/calico-apiserver-5f7f748cc6-267tw" Mar 7 00:56:53.405897 kubelet[3248]: E0307 00:56:53.404484 3248 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5f7f748cc6-267tw_calico-system(4134feb0-74d5-456d-b147-4d748b45b820)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5f7f748cc6-267tw_calico-system(4134feb0-74d5-456d-b147-4d748b45b820)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-5f7f748cc6-267tw" podUID="4134feb0-74d5-456d-b147-4d748b45b820" Mar 7 00:56:53.415517 systemd[1]: Started cri-containerd-1fe17442355ceb5173451e0735ae890d944018c871c49d6d9d38ba4452d0c9ae.scope - libcontainer container 1fe17442355ceb5173451e0735ae890d944018c871c49d6d9d38ba4452d0c9ae. Mar 7 00:56:53.474459 containerd[2014]: time="2026-03-07T00:56:53.474261088Z" level=info msg="StartContainer for \"1fe17442355ceb5173451e0735ae890d944018c871c49d6d9d38ba4452d0c9ae\" returns successfully" Mar 7 00:56:53.798884 systemd[1]: Created slice kubepods-besteffort-pode10fee9c_8cdc_4998_ad44_21b04777940b.slice - libcontainer container kubepods-besteffort-pode10fee9c_8cdc_4998_ad44_21b04777940b.slice. 
Mar 7 00:56:53.807538 containerd[2014]: time="2026-03-07T00:56:53.805905342Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6p9l4,Uid:e10fee9c-8cdc-4998-ad44-21b04777940b,Namespace:calico-system,Attempt:0,}" Mar 7 00:56:54.025681 kubelet[3248]: I0307 00:56:54.025600 3248 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Mar 7 00:56:54.028524 containerd[2014]: time="2026-03-07T00:56:54.028451871Z" level=info msg="StopPodSandbox for \"aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5\"" Mar 7 00:56:54.030835 containerd[2014]: time="2026-03-07T00:56:54.030770535Z" level=info msg="Ensure that sandbox aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5 in task-service has been cleanup successfully" Mar 7 00:56:54.033156 kubelet[3248]: I0307 00:56:54.032851 3248 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Mar 7 00:56:54.037078 containerd[2014]: time="2026-03-07T00:56:54.036452859Z" level=info msg="StopPodSandbox for \"2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597\"" Mar 7 00:56:54.037078 containerd[2014]: time="2026-03-07T00:56:54.036749907Z" level=info msg="Ensure that sandbox 2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597 in task-service has been cleanup successfully" Mar 7 00:56:54.066159 kubelet[3248]: I0307 00:56:54.064922 3248 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Mar 7 00:56:54.069342 containerd[2014]: time="2026-03-07T00:56:54.069262971Z" level=info msg="StopPodSandbox for \"0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0\"" Mar 7 00:56:54.069609 containerd[2014]: time="2026-03-07T00:56:54.069555459Z" level=info msg="Ensure that sandbox 
0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0 in task-service has been cleanup successfully" Mar 7 00:56:54.073220 kubelet[3248]: I0307 00:56:54.073039 3248 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Mar 7 00:56:54.080480 containerd[2014]: time="2026-03-07T00:56:54.080257035Z" level=info msg="StopPodSandbox for \"8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378\"" Mar 7 00:56:54.086035 containerd[2014]: time="2026-03-07T00:56:54.085594491Z" level=info msg="Ensure that sandbox 8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378 in task-service has been cleanup successfully" Mar 7 00:56:54.097059 kubelet[3248]: I0307 00:56:54.097022 3248 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Mar 7 00:56:54.098969 containerd[2014]: time="2026-03-07T00:56:54.098238615Z" level=info msg="StopPodSandbox for \"220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887\"" Mar 7 00:56:54.098969 containerd[2014]: time="2026-03-07T00:56:54.098546811Z" level=info msg="Ensure that sandbox 220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887 in task-service has been cleanup successfully" Mar 7 00:56:54.115855 kubelet[3248]: I0307 00:56:54.115793 3248 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Mar 7 00:56:54.122525 containerd[2014]: time="2026-03-07T00:56:54.122448735Z" level=info msg="StopPodSandbox for \"e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281\"" Mar 7 00:56:54.123238 containerd[2014]: time="2026-03-07T00:56:54.122771811Z" level=info msg="Ensure that sandbox e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281 in task-service has been cleanup successfully" Mar 7 
00:56:54.154708 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887-shm.mount: Deactivated successfully. Mar 7 00:56:54.154913 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378-shm.mount: Deactivated successfully. Mar 7 00:56:54.155089 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281-shm.mount: Deactivated successfully. Mar 7 00:56:54.157185 kubelet[3248]: I0307 00:56:54.157105 3248 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-h2t96" podStartSLOduration=5.688103499 podStartE2EDuration="21.1570813s" podCreationTimestamp="2026-03-07 00:56:33 +0000 UTC" firstStartedPulling="2026-03-07 00:56:34.52864135 +0000 UTC m=+27.046667475" lastFinishedPulling="2026-03-07 00:56:49.997619163 +0000 UTC m=+42.515645276" observedRunningTime="2026-03-07 00:56:54.142818988 +0000 UTC m=+46.660845137" watchObservedRunningTime="2026-03-07 00:56:54.1570813 +0000 UTC m=+46.675107425" Mar 7 00:56:54.174772 kubelet[3248]: I0307 00:56:54.174444 3248 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Mar 7 00:56:54.177256 containerd[2014]: time="2026-03-07T00:56:54.176898832Z" level=info msg="StopPodSandbox for \"10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a\"" Mar 7 00:56:54.180792 containerd[2014]: time="2026-03-07T00:56:54.179803528Z" level=info msg="Ensure that sandbox 10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a in task-service has been cleanup successfully" Mar 7 00:56:54.850061 containerd[2014]: 2026-03-07 00:56:54.476 [INFO][4684] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Mar 7 00:56:54.850061 
containerd[2014]: 2026-03-07 00:56:54.479 [INFO][4684] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" iface="eth0" netns="/var/run/netns/cni-b5bf986c-e939-98ac-ce05-0ebc0363c8ba" Mar 7 00:56:54.850061 containerd[2014]: 2026-03-07 00:56:54.480 [INFO][4684] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" iface="eth0" netns="/var/run/netns/cni-b5bf986c-e939-98ac-ce05-0ebc0363c8ba" Mar 7 00:56:54.850061 containerd[2014]: 2026-03-07 00:56:54.481 [INFO][4684] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" iface="eth0" netns="/var/run/netns/cni-b5bf986c-e939-98ac-ce05-0ebc0363c8ba" Mar 7 00:56:54.850061 containerd[2014]: 2026-03-07 00:56:54.481 [INFO][4684] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Mar 7 00:56:54.850061 containerd[2014]: 2026-03-07 00:56:54.481 [INFO][4684] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Mar 7 00:56:54.850061 containerd[2014]: 2026-03-07 00:56:54.735 [INFO][4726] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" HandleID="k8s-pod-network.e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0" Mar 7 00:56:54.850061 containerd[2014]: 2026-03-07 00:56:54.750 [INFO][4726] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:54.850061 containerd[2014]: 2026-03-07 00:56:54.774 [INFO][4726] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:56:54.850061 containerd[2014]: 2026-03-07 00:56:54.805 [WARNING][4726] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" HandleID="k8s-pod-network.e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0" Mar 7 00:56:54.850061 containerd[2014]: 2026-03-07 00:56:54.805 [INFO][4726] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" HandleID="k8s-pod-network.e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0" Mar 7 00:56:54.850061 containerd[2014]: 2026-03-07 00:56:54.815 [INFO][4726] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:54.850061 containerd[2014]: 2026-03-07 00:56:54.831 [INFO][4684] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Mar 7 00:56:54.855651 containerd[2014]: time="2026-03-07T00:56:54.855322231Z" level=info msg="TearDown network for sandbox \"e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281\" successfully" Mar 7 00:56:54.855651 containerd[2014]: time="2026-03-07T00:56:54.855375415Z" level=info msg="StopPodSandbox for \"e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281\" returns successfully" Mar 7 00:56:54.863387 systemd[1]: run-netns-cni\x2db5bf986c\x2de939\x2d98ac\x2dce05\x2d0ebc0363c8ba.mount: Deactivated successfully. 
Mar 7 00:56:54.864849 containerd[2014]: time="2026-03-07T00:56:54.864322015Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-6s4mf,Uid:293cb0c8-aea0-4a14-b36c-9f3b77cd23c5,Namespace:kube-system,Attempt:1,}" Mar 7 00:56:54.877883 (udev-worker)[4777]: Network interface NamePolicy= disabled on kernel command line. Mar 7 00:56:54.886544 systemd-networkd[1864]: cali4abbca1f96f: Link UP Mar 7 00:56:54.888575 systemd-networkd[1864]: cali4abbca1f96f: Gained carrier Mar 7 00:56:54.931933 containerd[2014]: 2026-03-07 00:56:54.580 [INFO][4683] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Mar 7 00:56:54.931933 containerd[2014]: 2026-03-07 00:56:54.581 [INFO][4683] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" iface="eth0" netns="/var/run/netns/cni-8927ae0c-15a2-f064-8e44-e98c8824a280" Mar 7 00:56:54.931933 containerd[2014]: 2026-03-07 00:56:54.581 [INFO][4683] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" iface="eth0" netns="/var/run/netns/cni-8927ae0c-15a2-f064-8e44-e98c8824a280" Mar 7 00:56:54.931933 containerd[2014]: 2026-03-07 00:56:54.583 [INFO][4683] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" iface="eth0" netns="/var/run/netns/cni-8927ae0c-15a2-f064-8e44-e98c8824a280" Mar 7 00:56:54.931933 containerd[2014]: 2026-03-07 00:56:54.583 [INFO][4683] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Mar 7 00:56:54.931933 containerd[2014]: 2026-03-07 00:56:54.583 [INFO][4683] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Mar 7 00:56:54.931933 containerd[2014]: 2026-03-07 00:56:54.797 [INFO][4741] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" HandleID="k8s-pod-network.0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Workload="ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0" Mar 7 00:56:54.931933 containerd[2014]: 2026-03-07 00:56:54.798 [INFO][4741] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:54.931933 containerd[2014]: 2026-03-07 00:56:54.815 [INFO][4741] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:54.931933 containerd[2014]: 2026-03-07 00:56:54.852 [WARNING][4741] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" HandleID="k8s-pod-network.0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Workload="ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0" Mar 7 00:56:54.931933 containerd[2014]: 2026-03-07 00:56:54.852 [INFO][4741] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" HandleID="k8s-pod-network.0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Workload="ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0" Mar 7 00:56:54.931933 containerd[2014]: 2026-03-07 00:56:54.860 [INFO][4741] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:54.931933 containerd[2014]: 2026-03-07 00:56:54.915 [INFO][4683] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Mar 7 00:56:54.938332 containerd[2014]: time="2026-03-07T00:56:54.937332643Z" level=info msg="TearDown network for sandbox \"0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0\" successfully" Mar 7 00:56:54.950758 systemd[1]: run-netns-cni\x2d8927ae0c\x2d15a2\x2df064\x2d8e44\x2de98c8824a280.mount: Deactivated successfully. 
Mar 7 00:56:54.956736 containerd[2014]: time="2026-03-07T00:56:54.941467603Z" level=info msg="StopPodSandbox for \"0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0\" returns successfully" Mar 7 00:56:54.960336 containerd[2014]: time="2026-03-07T00:56:54.960279524Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69f5fcbdf5-swnh8,Uid:d29570d8-449a-4a48-af9f-3d0d19534b53,Namespace:calico-system,Attempt:1,}" Mar 7 00:56:54.999436 containerd[2014]: 2026-03-07 00:56:53.893 [ERROR][4597] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:56:54.999436 containerd[2014]: 2026-03-07 00:56:53.959 [INFO][4597] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--19--200-k8s-csi--node--driver--6p9l4-eth0 csi-node-driver- calico-system e10fee9c-8cdc-4998-ad44-21b04777940b 767 0 2026-03-07 00:56:34 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6d9d697c7c k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ip-172-31-19-200 csi-node-driver-6p9l4 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali4abbca1f96f [] [] }} ContainerID="68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298" Namespace="calico-system" Pod="csi-node-driver-6p9l4" WorkloadEndpoint="ip--172--31--19--200-k8s-csi--node--driver--6p9l4-" Mar 7 00:56:54.999436 containerd[2014]: 2026-03-07 00:56:53.960 [INFO][4597] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298" Namespace="calico-system" Pod="csi-node-driver-6p9l4" 
WorkloadEndpoint="ip--172--31--19--200-k8s-csi--node--driver--6p9l4-eth0" Mar 7 00:56:54.999436 containerd[2014]: 2026-03-07 00:56:54.415 [INFO][4610] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298" HandleID="k8s-pod-network.68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298" Workload="ip--172--31--19--200-k8s-csi--node--driver--6p9l4-eth0" Mar 7 00:56:54.999436 containerd[2014]: 2026-03-07 00:56:54.491 [INFO][4610] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298" HandleID="k8s-pod-network.68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298" Workload="ip--172--31--19--200-k8s-csi--node--driver--6p9l4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fdb30), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-19-200", "pod":"csi-node-driver-6p9l4", "timestamp":"2026-03-07 00:56:54.415034957 +0000 UTC"}, Hostname:"ip-172-31-19-200", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400017a420)} Mar 7 00:56:54.999436 containerd[2014]: 2026-03-07 00:56:54.491 [INFO][4610] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:54.999436 containerd[2014]: 2026-03-07 00:56:54.492 [INFO][4610] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:56:54.999436 containerd[2014]: 2026-03-07 00:56:54.495 [INFO][4610] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-19-200' Mar 7 00:56:54.999436 containerd[2014]: 2026-03-07 00:56:54.512 [INFO][4610] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298" host="ip-172-31-19-200" Mar 7 00:56:54.999436 containerd[2014]: 2026-03-07 00:56:54.550 [INFO][4610] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-19-200" Mar 7 00:56:54.999436 containerd[2014]: 2026-03-07 00:56:54.640 [INFO][4610] ipam/ipam.go 526: Trying affinity for 192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:54.999436 containerd[2014]: 2026-03-07 00:56:54.681 [INFO][4610] ipam/ipam.go 160: Attempting to load block cidr=192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:54.999436 containerd[2014]: 2026-03-07 00:56:54.709 [INFO][4610] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:54.999436 containerd[2014]: 2026-03-07 00:56:54.711 [INFO][4610] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.31.128/26 handle="k8s-pod-network.68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298" host="ip-172-31-19-200" Mar 7 00:56:54.999436 containerd[2014]: 2026-03-07 00:56:54.716 [INFO][4610] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298 Mar 7 00:56:54.999436 containerd[2014]: 2026-03-07 00:56:54.745 [INFO][4610] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.31.128/26 handle="k8s-pod-network.68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298" host="ip-172-31-19-200" Mar 7 00:56:54.999436 containerd[2014]: 2026-03-07 00:56:54.771 [INFO][4610] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.31.129/26] block=192.168.31.128/26 
handle="k8s-pod-network.68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298" host="ip-172-31-19-200" Mar 7 00:56:54.999436 containerd[2014]: 2026-03-07 00:56:54.771 [INFO][4610] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.31.129/26] handle="k8s-pod-network.68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298" host="ip-172-31-19-200" Mar 7 00:56:54.999436 containerd[2014]: 2026-03-07 00:56:54.771 [INFO][4610] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:54.999436 containerd[2014]: 2026-03-07 00:56:54.771 [INFO][4610] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.31.129/26] IPv6=[] ContainerID="68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298" HandleID="k8s-pod-network.68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298" Workload="ip--172--31--19--200-k8s-csi--node--driver--6p9l4-eth0" Mar 7 00:56:55.002727 containerd[2014]: 2026-03-07 00:56:54.831 [INFO][4597] cni-plugin/k8s.go 418: Populated endpoint ContainerID="68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298" Namespace="calico-system" Pod="csi-node-driver-6p9l4" WorkloadEndpoint="ip--172--31--19--200-k8s-csi--node--driver--6p9l4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-csi--node--driver--6p9l4-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e10fee9c-8cdc-4998-ad44-21b04777940b", ResourceVersion:"767", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"", Pod:"csi-node-driver-6p9l4", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.31.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali4abbca1f96f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:55.002727 containerd[2014]: 2026-03-07 00:56:54.831 [INFO][4597] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.31.129/32] ContainerID="68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298" Namespace="calico-system" Pod="csi-node-driver-6p9l4" WorkloadEndpoint="ip--172--31--19--200-k8s-csi--node--driver--6p9l4-eth0" Mar 7 00:56:55.002727 containerd[2014]: 2026-03-07 00:56:54.831 [INFO][4597] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4abbca1f96f ContainerID="68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298" Namespace="calico-system" Pod="csi-node-driver-6p9l4" WorkloadEndpoint="ip--172--31--19--200-k8s-csi--node--driver--6p9l4-eth0" Mar 7 00:56:55.002727 containerd[2014]: 2026-03-07 00:56:54.913 [INFO][4597] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298" Namespace="calico-system" Pod="csi-node-driver-6p9l4" WorkloadEndpoint="ip--172--31--19--200-k8s-csi--node--driver--6p9l4-eth0" Mar 7 00:56:55.002727 containerd[2014]: 2026-03-07 00:56:54.925 [INFO][4597] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298" Namespace="calico-system" Pod="csi-node-driver-6p9l4" WorkloadEndpoint="ip--172--31--19--200-k8s-csi--node--driver--6p9l4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-csi--node--driver--6p9l4-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e10fee9c-8cdc-4998-ad44-21b04777940b", ResourceVersion:"767", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298", Pod:"csi-node-driver-6p9l4", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.31.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali4abbca1f96f", MAC:"ae:17:e7:0a:74:15", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:55.002727 containerd[2014]: 2026-03-07 00:56:54.948 [INFO][4597] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298" Namespace="calico-system" Pod="csi-node-driver-6p9l4" WorkloadEndpoint="ip--172--31--19--200-k8s-csi--node--driver--6p9l4-eth0" Mar 7 00:56:55.054812 containerd[2014]: 2026-03-07 00:56:54.571 [INFO][4696] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Mar 7 00:56:55.054812 containerd[2014]: 2026-03-07 00:56:54.572 [INFO][4696] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" iface="eth0" netns="/var/run/netns/cni-477fe24d-8129-abe3-d517-a01f065a4034" Mar 7 00:56:55.054812 containerd[2014]: 2026-03-07 00:56:54.573 [INFO][4696] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" iface="eth0" netns="/var/run/netns/cni-477fe24d-8129-abe3-d517-a01f065a4034" Mar 7 00:56:55.054812 containerd[2014]: 2026-03-07 00:56:54.575 [INFO][4696] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" iface="eth0" netns="/var/run/netns/cni-477fe24d-8129-abe3-d517-a01f065a4034" Mar 7 00:56:55.054812 containerd[2014]: 2026-03-07 00:56:54.575 [INFO][4696] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Mar 7 00:56:55.054812 containerd[2014]: 2026-03-07 00:56:54.575 [INFO][4696] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Mar 7 00:56:55.054812 containerd[2014]: 2026-03-07 00:56:54.899 [INFO][4739] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" HandleID="k8s-pod-network.10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0" Mar 7 00:56:55.054812 containerd[2014]: 2026-03-07 00:56:54.900 [INFO][4739] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:55.054812 containerd[2014]: 2026-03-07 00:56:54.900 [INFO][4739] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:55.054812 containerd[2014]: 2026-03-07 00:56:54.981 [WARNING][4739] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" HandleID="k8s-pod-network.10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0" Mar 7 00:56:55.054812 containerd[2014]: 2026-03-07 00:56:54.981 [INFO][4739] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" HandleID="k8s-pod-network.10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0" Mar 7 00:56:55.054812 containerd[2014]: 2026-03-07 00:56:54.992 [INFO][4739] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:55.054812 containerd[2014]: 2026-03-07 00:56:55.047 [INFO][4696] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Mar 7 00:56:55.063289 containerd[2014]: time="2026-03-07T00:56:55.062021344Z" level=info msg="TearDown network for sandbox \"10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a\" successfully" Mar 7 00:56:55.063289 containerd[2014]: time="2026-03-07T00:56:55.062094748Z" level=info msg="StopPodSandbox for \"10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a\" returns successfully" Mar 7 00:56:55.064130 containerd[2014]: time="2026-03-07T00:56:55.063908164Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f7f748cc6-fd6fv,Uid:2a289909-3bfe-4f98-9e7f-8f97be86ab84,Namespace:calico-system,Attempt:1,}" Mar 7 00:56:55.154331 systemd[1]: run-netns-cni\x2d477fe24d\x2d8129\x2dabe3\x2dd517\x2da01f065a4034.mount: Deactivated successfully. 
Mar 7 00:56:55.164243 containerd[2014]: 2026-03-07 00:56:54.649 [INFO][4641] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Mar 7 00:56:55.164243 containerd[2014]: 2026-03-07 00:56:54.650 [INFO][4641] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" iface="eth0" netns="/var/run/netns/cni-5dabfcbc-25bf-a975-3608-80e0041d8066" Mar 7 00:56:55.164243 containerd[2014]: 2026-03-07 00:56:54.651 [INFO][4641] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" iface="eth0" netns="/var/run/netns/cni-5dabfcbc-25bf-a975-3608-80e0041d8066" Mar 7 00:56:55.164243 containerd[2014]: 2026-03-07 00:56:54.651 [INFO][4641] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" iface="eth0" netns="/var/run/netns/cni-5dabfcbc-25bf-a975-3608-80e0041d8066" Mar 7 00:56:55.164243 containerd[2014]: 2026-03-07 00:56:54.652 [INFO][4641] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Mar 7 00:56:55.164243 containerd[2014]: 2026-03-07 00:56:54.652 [INFO][4641] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Mar 7 00:56:55.164243 containerd[2014]: 2026-03-07 00:56:54.897 [INFO][4750] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" HandleID="k8s-pod-network.aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Workload="ip--172--31--19--200-k8s-whisker--6b786f9cf8--kdqrt-eth0" Mar 7 00:56:55.164243 containerd[2014]: 2026-03-07 00:56:54.907 [INFO][4750] 
ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:55.164243 containerd[2014]: 2026-03-07 00:56:54.997 [INFO][4750] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:55.164243 containerd[2014]: 2026-03-07 00:56:55.055 [WARNING][4750] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" HandleID="k8s-pod-network.aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Workload="ip--172--31--19--200-k8s-whisker--6b786f9cf8--kdqrt-eth0" Mar 7 00:56:55.164243 containerd[2014]: 2026-03-07 00:56:55.059 [INFO][4750] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" HandleID="k8s-pod-network.aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Workload="ip--172--31--19--200-k8s-whisker--6b786f9cf8--kdqrt-eth0" Mar 7 00:56:55.164243 containerd[2014]: 2026-03-07 00:56:55.093 [INFO][4750] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:55.164243 containerd[2014]: 2026-03-07 00:56:55.124 [INFO][4641] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Mar 7 00:56:55.166769 containerd[2014]: time="2026-03-07T00:56:55.166502177Z" level=info msg="TearDown network for sandbox \"aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5\" successfully" Mar 7 00:56:55.166769 containerd[2014]: time="2026-03-07T00:56:55.166558697Z" level=info msg="StopPodSandbox for \"aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5\" returns successfully" Mar 7 00:56:55.177351 systemd[1]: run-netns-cni\x2d5dabfcbc\x2d25bf\x2da975\x2d3608\x2d80e0041d8066.mount: Deactivated successfully. 
Mar 7 00:56:55.216025 containerd[2014]: 2026-03-07 00:56:54.685 [INFO][4682] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Mar 7 00:56:55.216025 containerd[2014]: 2026-03-07 00:56:54.694 [INFO][4682] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" iface="eth0" netns="/var/run/netns/cni-fbced3e9-eabe-8a9b-6269-a1584dadfca0" Mar 7 00:56:55.216025 containerd[2014]: 2026-03-07 00:56:54.695 [INFO][4682] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" iface="eth0" netns="/var/run/netns/cni-fbced3e9-eabe-8a9b-6269-a1584dadfca0" Mar 7 00:56:55.216025 containerd[2014]: 2026-03-07 00:56:54.698 [INFO][4682] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" iface="eth0" netns="/var/run/netns/cni-fbced3e9-eabe-8a9b-6269-a1584dadfca0" Mar 7 00:56:55.216025 containerd[2014]: 2026-03-07 00:56:54.699 [INFO][4682] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Mar 7 00:56:55.216025 containerd[2014]: 2026-03-07 00:56:54.700 [INFO][4682] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Mar 7 00:56:55.216025 containerd[2014]: 2026-03-07 00:56:55.044 [INFO][4756] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" HandleID="k8s-pod-network.220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Workload="ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0" Mar 7 00:56:55.216025 containerd[2014]: 2026-03-07 00:56:55.044 [INFO][4756] 
ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:55.216025 containerd[2014]: 2026-03-07 00:56:55.093 [INFO][4756] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:55.216025 containerd[2014]: 2026-03-07 00:56:55.128 [WARNING][4756] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" HandleID="k8s-pod-network.220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Workload="ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0" Mar 7 00:56:55.216025 containerd[2014]: 2026-03-07 00:56:55.128 [INFO][4756] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" HandleID="k8s-pod-network.220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Workload="ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0" Mar 7 00:56:55.216025 containerd[2014]: 2026-03-07 00:56:55.137 [INFO][4756] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:55.216025 containerd[2014]: 2026-03-07 00:56:55.197 [INFO][4682] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Mar 7 00:56:55.218711 containerd[2014]: time="2026-03-07T00:56:55.217440305Z" level=info msg="TearDown network for sandbox \"220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887\" successfully" Mar 7 00:56:55.218711 containerd[2014]: time="2026-03-07T00:56:55.217499765Z" level=info msg="StopPodSandbox for \"220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887\" returns successfully" Mar 7 00:56:55.226468 containerd[2014]: time="2026-03-07T00:56:55.226112453Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-nbqjk,Uid:ab0a4538-d7e9-47a6-8763-f14115b5894f,Namespace:calico-system,Attempt:1,}" Mar 7 00:56:55.228425 systemd[1]: run-netns-cni\x2dfbced3e9\x2deabe\x2d8a9b\x2d6269\x2da1584dadfca0.mount: Deactivated successfully. Mar 7 00:56:55.238306 containerd[2014]: time="2026-03-07T00:56:55.237416441Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:56:55.238306 containerd[2014]: time="2026-03-07T00:56:55.237556001Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:56:55.238306 containerd[2014]: time="2026-03-07T00:56:55.237594857Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:55.238306 containerd[2014]: time="2026-03-07T00:56:55.237787469Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:55.280621 systemd[1]: Started cri-containerd-68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298.scope - libcontainer container 68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298. 
Mar 7 00:56:55.337433 containerd[2014]: 2026-03-07 00:56:54.701 [INFO][4649] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Mar 7 00:56:55.337433 containerd[2014]: 2026-03-07 00:56:54.705 [INFO][4649] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" iface="eth0" netns="/var/run/netns/cni-eecbf6da-c762-df75-9577-3a9947f79f78" Mar 7 00:56:55.337433 containerd[2014]: 2026-03-07 00:56:54.706 [INFO][4649] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" iface="eth0" netns="/var/run/netns/cni-eecbf6da-c762-df75-9577-3a9947f79f78" Mar 7 00:56:55.337433 containerd[2014]: 2026-03-07 00:56:54.717 [INFO][4649] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" iface="eth0" netns="/var/run/netns/cni-eecbf6da-c762-df75-9577-3a9947f79f78" Mar 7 00:56:55.337433 containerd[2014]: 2026-03-07 00:56:54.717 [INFO][4649] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Mar 7 00:56:55.337433 containerd[2014]: 2026-03-07 00:56:54.717 [INFO][4649] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Mar 7 00:56:55.337433 containerd[2014]: 2026-03-07 00:56:55.060 [INFO][4759] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" HandleID="k8s-pod-network.2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0" Mar 7 00:56:55.337433 containerd[2014]: 2026-03-07 00:56:55.061 [INFO][4759] 
ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:55.337433 containerd[2014]: 2026-03-07 00:56:55.164 [INFO][4759] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:55.337433 containerd[2014]: 2026-03-07 00:56:55.230 [WARNING][4759] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" HandleID="k8s-pod-network.2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0" Mar 7 00:56:55.337433 containerd[2014]: 2026-03-07 00:56:55.230 [INFO][4759] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" HandleID="k8s-pod-network.2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0" Mar 7 00:56:55.337433 containerd[2014]: 2026-03-07 00:56:55.251 [INFO][4759] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:55.337433 containerd[2014]: 2026-03-07 00:56:55.318 [INFO][4649] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Mar 7 00:56:55.349604 containerd[2014]: time="2026-03-07T00:56:55.349529286Z" level=info msg="TearDown network for sandbox \"2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597\" successfully" Mar 7 00:56:55.356406 containerd[2014]: time="2026-03-07T00:56:55.356241858Z" level=info msg="StopPodSandbox for \"2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597\" returns successfully" Mar 7 00:56:55.386247 containerd[2014]: time="2026-03-07T00:56:55.385646346Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-cxswj,Uid:d7c61432-fbce-407b-bac9-6d0016af3dc6,Namespace:kube-system,Attempt:1,}" Mar 7 00:56:55.396592 containerd[2014]: 2026-03-07 00:56:54.710 [INFO][4678] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Mar 7 00:56:55.396592 containerd[2014]: 2026-03-07 00:56:54.710 [INFO][4678] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" iface="eth0" netns="/var/run/netns/cni-3b76f183-006b-0d44-f66f-e520df153d95" Mar 7 00:56:55.396592 containerd[2014]: 2026-03-07 00:56:54.711 [INFO][4678] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" iface="eth0" netns="/var/run/netns/cni-3b76f183-006b-0d44-f66f-e520df153d95" Mar 7 00:56:55.396592 containerd[2014]: 2026-03-07 00:56:54.726 [INFO][4678] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" iface="eth0" netns="/var/run/netns/cni-3b76f183-006b-0d44-f66f-e520df153d95" Mar 7 00:56:55.396592 containerd[2014]: 2026-03-07 00:56:54.727 [INFO][4678] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Mar 7 00:56:55.396592 containerd[2014]: 2026-03-07 00:56:54.727 [INFO][4678] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Mar 7 00:56:55.396592 containerd[2014]: 2026-03-07 00:56:55.204 [INFO][4764] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" HandleID="k8s-pod-network.8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0" Mar 7 00:56:55.396592 containerd[2014]: 2026-03-07 00:56:55.215 [INFO][4764] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:55.396592 containerd[2014]: 2026-03-07 00:56:55.265 [INFO][4764] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:55.396592 containerd[2014]: 2026-03-07 00:56:55.320 [WARNING][4764] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" HandleID="k8s-pod-network.8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0" Mar 7 00:56:55.396592 containerd[2014]: 2026-03-07 00:56:55.320 [INFO][4764] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" HandleID="k8s-pod-network.8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0" Mar 7 00:56:55.396592 containerd[2014]: 2026-03-07 00:56:55.329 [INFO][4764] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:55.396592 containerd[2014]: 2026-03-07 00:56:55.360 [INFO][4678] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Mar 7 00:56:55.413073 containerd[2014]: time="2026-03-07T00:56:55.409140174Z" level=info msg="TearDown network for sandbox \"8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378\" successfully" Mar 7 00:56:55.413073 containerd[2014]: time="2026-03-07T00:56:55.409237566Z" level=info msg="StopPodSandbox for \"8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378\" returns successfully" Mar 7 00:56:55.418366 containerd[2014]: time="2026-03-07T00:56:55.418067982Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f7f748cc6-267tw,Uid:4134feb0-74d5-456d-b147-4d748b45b820,Namespace:calico-system,Attempt:1,}" Mar 7 00:56:55.419516 containerd[2014]: time="2026-03-07T00:56:55.419066562Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6p9l4,Uid:e10fee9c-8cdc-4998-ad44-21b04777940b,Namespace:calico-system,Attempt:0,} returns sandbox id \"68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298\"" Mar 7 00:56:55.441930 
containerd[2014]: time="2026-03-07T00:56:55.441602166Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 7 00:56:55.472027 kubelet[3248]: I0307 00:56:55.469983 3248 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f36088b-c8bd-4e60-bd21-33bf72d4d138-whisker-ca-bundle\") pod \"2f36088b-c8bd-4e60-bd21-33bf72d4d138\" (UID: \"2f36088b-c8bd-4e60-bd21-33bf72d4d138\") " Mar 7 00:56:55.473385 kubelet[3248]: I0307 00:56:55.473175 3248 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/2f36088b-c8bd-4e60-bd21-33bf72d4d138-nginx-config\") pod \"2f36088b-c8bd-4e60-bd21-33bf72d4d138\" (UID: \"2f36088b-c8bd-4e60-bd21-33bf72d4d138\") " Mar 7 00:56:55.474217 kubelet[3248]: I0307 00:56:55.473912 3248 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dpws\" (UniqueName: \"kubernetes.io/projected/2f36088b-c8bd-4e60-bd21-33bf72d4d138-kube-api-access-2dpws\") pod \"2f36088b-c8bd-4e60-bd21-33bf72d4d138\" (UID: \"2f36088b-c8bd-4e60-bd21-33bf72d4d138\") " Mar 7 00:56:55.475984 kubelet[3248]: I0307 00:56:55.470738 3248 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f36088b-c8bd-4e60-bd21-33bf72d4d138-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "2f36088b-c8bd-4e60-bd21-33bf72d4d138" (UID: "2f36088b-c8bd-4e60-bd21-33bf72d4d138"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 7 00:56:55.480219 kubelet[3248]: I0307 00:56:55.478047 3248 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2f36088b-c8bd-4e60-bd21-33bf72d4d138-whisker-backend-key-pair\") pod \"2f36088b-c8bd-4e60-bd21-33bf72d4d138\" (UID: \"2f36088b-c8bd-4e60-bd21-33bf72d4d138\") " Mar 7 00:56:55.487914 kubelet[3248]: I0307 00:56:55.487077 3248 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f36088b-c8bd-4e60-bd21-33bf72d4d138-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "2f36088b-c8bd-4e60-bd21-33bf72d4d138" (UID: "2f36088b-c8bd-4e60-bd21-33bf72d4d138"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 7 00:56:55.511684 kubelet[3248]: I0307 00:56:55.511467 3248 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f36088b-c8bd-4e60-bd21-33bf72d4d138-kube-api-access-2dpws" (OuterVolumeSpecName: "kube-api-access-2dpws") pod "2f36088b-c8bd-4e60-bd21-33bf72d4d138" (UID: "2f36088b-c8bd-4e60-bd21-33bf72d4d138"). InnerVolumeSpecName "kube-api-access-2dpws". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 7 00:56:55.520485 kubelet[3248]: I0307 00:56:55.520306 3248 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f36088b-c8bd-4e60-bd21-33bf72d4d138-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "2f36088b-c8bd-4e60-bd21-33bf72d4d138" (UID: "2f36088b-c8bd-4e60-bd21-33bf72d4d138"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 7 00:56:55.582572 kubelet[3248]: I0307 00:56:55.581293 3248 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2dpws\" (UniqueName: \"kubernetes.io/projected/2f36088b-c8bd-4e60-bd21-33bf72d4d138-kube-api-access-2dpws\") on node \"ip-172-31-19-200\" DevicePath \"\"" Mar 7 00:56:55.582572 kubelet[3248]: I0307 00:56:55.581346 3248 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2f36088b-c8bd-4e60-bd21-33bf72d4d138-whisker-backend-key-pair\") on node \"ip-172-31-19-200\" DevicePath \"\"" Mar 7 00:56:55.582572 kubelet[3248]: I0307 00:56:55.581372 3248 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f36088b-c8bd-4e60-bd21-33bf72d4d138-whisker-ca-bundle\") on node \"ip-172-31-19-200\" DevicePath \"\"" Mar 7 00:56:55.582572 kubelet[3248]: I0307 00:56:55.581395 3248 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/2f36088b-c8bd-4e60-bd21-33bf72d4d138-nginx-config\") on node \"ip-172-31-19-200\" DevicePath \"\"" Mar 7 00:56:55.824953 systemd[1]: Removed slice kubepods-besteffort-pod2f36088b_c8bd_4e60_bd21_33bf72d4d138.slice - libcontainer container kubepods-besteffort-pod2f36088b_c8bd_4e60_bd21_33bf72d4d138.slice. Mar 7 00:56:55.979489 (udev-worker)[4776]: Network interface NamePolicy= disabled on kernel command line. 
Mar 7 00:56:56.005977 systemd-networkd[1864]: calide84ecc1233: Link UP Mar 7 00:56:56.025496 systemd-networkd[1864]: calide84ecc1233: Gained carrier Mar 7 00:56:56.141579 containerd[2014]: 2026-03-07 00:56:55.340 [ERROR][4798] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:56:56.141579 containerd[2014]: 2026-03-07 00:56:55.400 [INFO][4798] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0 calico-kube-controllers-69f5fcbdf5- calico-system d29570d8-449a-4a48-af9f-3d0d19534b53 943 0 2026-03-07 00:56:34 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:69f5fcbdf5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ip-172-31-19-200 calico-kube-controllers-69f5fcbdf5-swnh8 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calide84ecc1233 [] [] }} ContainerID="1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877" Namespace="calico-system" Pod="calico-kube-controllers-69f5fcbdf5-swnh8" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-" Mar 7 00:56:56.141579 containerd[2014]: 2026-03-07 00:56:55.400 [INFO][4798] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877" Namespace="calico-system" Pod="calico-kube-controllers-69f5fcbdf5-swnh8" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0" Mar 7 00:56:56.141579 containerd[2014]: 2026-03-07 00:56:55.643 [INFO][4883] ipam/ipam_plugin.go 235: Calico CNI IPAM request count 
IPv4=1 IPv6=0 ContainerID="1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877" HandleID="k8s-pod-network.1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877" Workload="ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0" Mar 7 00:56:56.141579 containerd[2014]: 2026-03-07 00:56:55.715 [INFO][4883] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877" HandleID="k8s-pod-network.1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877" Workload="ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400063c0e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-19-200", "pod":"calico-kube-controllers-69f5fcbdf5-swnh8", "timestamp":"2026-03-07 00:56:55.643106239 +0000 UTC"}, Hostname:"ip-172-31-19-200", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40000e02c0)} Mar 7 00:56:56.141579 containerd[2014]: 2026-03-07 00:56:55.715 [INFO][4883] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:56.141579 containerd[2014]: 2026-03-07 00:56:55.715 [INFO][4883] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:56:56.141579 containerd[2014]: 2026-03-07 00:56:55.715 [INFO][4883] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-19-200' Mar 7 00:56:56.141579 containerd[2014]: 2026-03-07 00:56:55.727 [INFO][4883] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877" host="ip-172-31-19-200" Mar 7 00:56:56.141579 containerd[2014]: 2026-03-07 00:56:55.765 [INFO][4883] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-19-200" Mar 7 00:56:56.141579 containerd[2014]: 2026-03-07 00:56:55.811 [INFO][4883] ipam/ipam.go 526: Trying affinity for 192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:56.141579 containerd[2014]: 2026-03-07 00:56:55.832 [INFO][4883] ipam/ipam.go 160: Attempting to load block cidr=192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:56.141579 containerd[2014]: 2026-03-07 00:56:55.859 [INFO][4883] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:56.141579 containerd[2014]: 2026-03-07 00:56:55.859 [INFO][4883] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.31.128/26 handle="k8s-pod-network.1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877" host="ip-172-31-19-200" Mar 7 00:56:56.141579 containerd[2014]: 2026-03-07 00:56:55.865 [INFO][4883] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877 Mar 7 00:56:56.141579 containerd[2014]: 2026-03-07 00:56:55.920 [INFO][4883] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.31.128/26 handle="k8s-pod-network.1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877" host="ip-172-31-19-200" Mar 7 00:56:56.141579 containerd[2014]: 2026-03-07 00:56:55.952 [INFO][4883] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.31.130/26] block=192.168.31.128/26 
handle="k8s-pod-network.1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877" host="ip-172-31-19-200" Mar 7 00:56:56.141579 containerd[2014]: 2026-03-07 00:56:55.952 [INFO][4883] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.31.130/26] handle="k8s-pod-network.1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877" host="ip-172-31-19-200" Mar 7 00:56:56.141579 containerd[2014]: 2026-03-07 00:56:55.954 [INFO][4883] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:56.141579 containerd[2014]: 2026-03-07 00:56:55.955 [INFO][4883] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.31.130/26] IPv6=[] ContainerID="1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877" HandleID="k8s-pod-network.1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877" Workload="ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0" Mar 7 00:56:56.144639 containerd[2014]: 2026-03-07 00:56:55.968 [INFO][4798] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877" Namespace="calico-system" Pod="calico-kube-controllers-69f5fcbdf5-swnh8" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0", GenerateName:"calico-kube-controllers-69f5fcbdf5-", Namespace:"calico-system", SelfLink:"", UID:"d29570d8-449a-4a48-af9f-3d0d19534b53", ResourceVersion:"943", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"69f5fcbdf5", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"", Pod:"calico-kube-controllers-69f5fcbdf5-swnh8", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.31.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calide84ecc1233", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:56.144639 containerd[2014]: 2026-03-07 00:56:55.969 [INFO][4798] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.31.130/32] ContainerID="1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877" Namespace="calico-system" Pod="calico-kube-controllers-69f5fcbdf5-swnh8" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0" Mar 7 00:56:56.144639 containerd[2014]: 2026-03-07 00:56:55.970 [INFO][4798] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calide84ecc1233 ContainerID="1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877" Namespace="calico-system" Pod="calico-kube-controllers-69f5fcbdf5-swnh8" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0" Mar 7 00:56:56.144639 containerd[2014]: 2026-03-07 00:56:56.036 [INFO][4798] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877" Namespace="calico-system" Pod="calico-kube-controllers-69f5fcbdf5-swnh8" 
WorkloadEndpoint="ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0" Mar 7 00:56:56.144639 containerd[2014]: 2026-03-07 00:56:56.043 [INFO][4798] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877" Namespace="calico-system" Pod="calico-kube-controllers-69f5fcbdf5-swnh8" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0", GenerateName:"calico-kube-controllers-69f5fcbdf5-", Namespace:"calico-system", SelfLink:"", UID:"d29570d8-449a-4a48-af9f-3d0d19534b53", ResourceVersion:"943", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"69f5fcbdf5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877", Pod:"calico-kube-controllers-69f5fcbdf5-swnh8", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.31.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calide84ecc1233", MAC:"66:2c:5f:7b:1d:69", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:56.144639 containerd[2014]: 2026-03-07 00:56:56.094 [INFO][4798] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877" Namespace="calico-system" Pod="calico-kube-controllers-69f5fcbdf5-swnh8" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0" Mar 7 00:56:56.191719 systemd[1]: run-netns-cni\x2d3b76f183\x2d006b\x2d0d44\x2df66f\x2de520df153d95.mount: Deactivated successfully. Mar 7 00:56:56.191913 systemd[1]: run-netns-cni\x2deecbf6da\x2dc762\x2ddf75\x2d9577\x2d3a9947f79f78.mount: Deactivated successfully. Mar 7 00:56:56.192061 systemd[1]: var-lib-kubelet-pods-2f36088b\x2dc8bd\x2d4e60\x2dbd21\x2d33bf72d4d138-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d2dpws.mount: Deactivated successfully. Mar 7 00:56:56.192226 systemd[1]: var-lib-kubelet-pods-2f36088b\x2dc8bd\x2d4e60\x2dbd21\x2d33bf72d4d138-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Mar 7 00:56:56.331543 systemd-networkd[1864]: cali4abbca1f96f: Gained IPv6LL Mar 7 00:56:56.376151 systemd-networkd[1864]: calic6cc9e85ad1: Link UP Mar 7 00:56:56.384739 systemd-networkd[1864]: calic6cc9e85ad1: Gained carrier Mar 7 00:56:56.469486 containerd[2014]: time="2026-03-07T00:56:56.468696391Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:56:56.469486 containerd[2014]: time="2026-03-07T00:56:56.468813979Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:56:56.469486 containerd[2014]: time="2026-03-07T00:56:56.468871795Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:56.474499 containerd[2014]: time="2026-03-07T00:56:56.473255911Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:56.534971 systemd[1]: Created slice kubepods-besteffort-podf837da8f_bc82_4be5_9094_8e73ba2c57a3.slice - libcontainer container kubepods-besteffort-podf837da8f_bc82_4be5_9094_8e73ba2c57a3.slice. Mar 7 00:56:56.570314 containerd[2014]: 2026-03-07 00:56:55.383 [ERROR][4782] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:56:56.570314 containerd[2014]: 2026-03-07 00:56:55.474 [INFO][4782] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0 coredns-674b8bbfcf- kube-system 293cb0c8-aea0-4a14-b36c-9f3b77cd23c5 941 0 2026-03-07 00:56:11 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-19-200 coredns-674b8bbfcf-6s4mf eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calic6cc9e85ad1 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8" Namespace="kube-system" Pod="coredns-674b8bbfcf-6s4mf" WorkloadEndpoint="ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-" Mar 7 00:56:56.570314 containerd[2014]: 2026-03-07 00:56:55.474 [INFO][4782] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8" Namespace="kube-system" Pod="coredns-674b8bbfcf-6s4mf" WorkloadEndpoint="ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0" 
Mar 7 00:56:56.570314 containerd[2014]: 2026-03-07 00:56:55.856 [INFO][4917] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8" HandleID="k8s-pod-network.ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0" Mar 7 00:56:56.570314 containerd[2014]: 2026-03-07 00:56:55.983 [INFO][4917] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8" HandleID="k8s-pod-network.ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d700), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-19-200", "pod":"coredns-674b8bbfcf-6s4mf", "timestamp":"2026-03-07 00:56:55.856186628 +0000 UTC"}, Hostname:"ip-172-31-19-200", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400044da20)} Mar 7 00:56:56.570314 containerd[2014]: 2026-03-07 00:56:55.983 [INFO][4917] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:56.570314 containerd[2014]: 2026-03-07 00:56:55.983 [INFO][4917] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:56:56.570314 containerd[2014]: 2026-03-07 00:56:55.983 [INFO][4917] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-19-200' Mar 7 00:56:56.570314 containerd[2014]: 2026-03-07 00:56:56.020 [INFO][4917] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8" host="ip-172-31-19-200" Mar 7 00:56:56.570314 containerd[2014]: 2026-03-07 00:56:56.058 [INFO][4917] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-19-200" Mar 7 00:56:56.570314 containerd[2014]: 2026-03-07 00:56:56.086 [INFO][4917] ipam/ipam.go 526: Trying affinity for 192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:56.570314 containerd[2014]: 2026-03-07 00:56:56.113 [INFO][4917] ipam/ipam.go 160: Attempting to load block cidr=192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:56.570314 containerd[2014]: 2026-03-07 00:56:56.130 [INFO][4917] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:56.570314 containerd[2014]: 2026-03-07 00:56:56.130 [INFO][4917] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.31.128/26 handle="k8s-pod-network.ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8" host="ip-172-31-19-200" Mar 7 00:56:56.570314 containerd[2014]: 2026-03-07 00:56:56.147 [INFO][4917] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8 Mar 7 00:56:56.570314 containerd[2014]: 2026-03-07 00:56:56.266 [INFO][4917] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.31.128/26 handle="k8s-pod-network.ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8" host="ip-172-31-19-200" Mar 7 00:56:56.570314 containerd[2014]: 2026-03-07 00:56:56.297 [INFO][4917] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.31.131/26] block=192.168.31.128/26 
handle="k8s-pod-network.ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8" host="ip-172-31-19-200" Mar 7 00:56:56.570314 containerd[2014]: 2026-03-07 00:56:56.297 [INFO][4917] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.31.131/26] handle="k8s-pod-network.ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8" host="ip-172-31-19-200" Mar 7 00:56:56.570314 containerd[2014]: 2026-03-07 00:56:56.299 [INFO][4917] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:56.570314 containerd[2014]: 2026-03-07 00:56:56.299 [INFO][4917] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.31.131/26] IPv6=[] ContainerID="ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8" HandleID="k8s-pod-network.ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0" Mar 7 00:56:56.573522 containerd[2014]: 2026-03-07 00:56:56.340 [INFO][4782] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8" Namespace="kube-system" Pod="coredns-674b8bbfcf-6s4mf" WorkloadEndpoint="ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"293cb0c8-aea0-4a14-b36c-9f3b77cd23c5", ResourceVersion:"941", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"", Pod:"coredns-674b8bbfcf-6s4mf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.31.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic6cc9e85ad1", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:56.573522 containerd[2014]: 2026-03-07 00:56:56.356 [INFO][4782] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.31.131/32] ContainerID="ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8" Namespace="kube-system" Pod="coredns-674b8bbfcf-6s4mf" WorkloadEndpoint="ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0" Mar 7 00:56:56.573522 containerd[2014]: 2026-03-07 00:56:56.356 [INFO][4782] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic6cc9e85ad1 ContainerID="ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8" Namespace="kube-system" Pod="coredns-674b8bbfcf-6s4mf" WorkloadEndpoint="ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0" Mar 7 00:56:56.573522 containerd[2014]: 2026-03-07 00:56:56.389 [INFO][4782] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-6s4mf" WorkloadEndpoint="ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0" Mar 7 00:56:56.573522 containerd[2014]: 2026-03-07 00:56:56.402 [INFO][4782] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8" Namespace="kube-system" Pod="coredns-674b8bbfcf-6s4mf" WorkloadEndpoint="ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"293cb0c8-aea0-4a14-b36c-9f3b77cd23c5", ResourceVersion:"941", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8", Pod:"coredns-674b8bbfcf-6s4mf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.31.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic6cc9e85ad1", MAC:"4e:62:90:53:2b:8d", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:56.573522 containerd[2014]: 2026-03-07 00:56:56.522 [INFO][4782] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8" Namespace="kube-system" Pod="coredns-674b8bbfcf-6s4mf" WorkloadEndpoint="ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0" Mar 7 00:56:56.593382 kubelet[3248]: I0307 00:56:56.592555 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/f837da8f-bc82-4be5-9094-8e73ba2c57a3-nginx-config\") pod \"whisker-56f5bf9455-h6wbh\" (UID: \"f837da8f-bc82-4be5-9094-8e73ba2c57a3\") " pod="calico-system/whisker-56f5bf9455-h6wbh" Mar 7 00:56:56.593382 kubelet[3248]: I0307 00:56:56.592621 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f837da8f-bc82-4be5-9094-8e73ba2c57a3-whisker-ca-bundle\") pod \"whisker-56f5bf9455-h6wbh\" (UID: \"f837da8f-bc82-4be5-9094-8e73ba2c57a3\") " pod="calico-system/whisker-56f5bf9455-h6wbh" Mar 7 00:56:56.593382 kubelet[3248]: I0307 00:56:56.592684 3248 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f837da8f-bc82-4be5-9094-8e73ba2c57a3-whisker-backend-key-pair\") pod \"whisker-56f5bf9455-h6wbh\" (UID: \"f837da8f-bc82-4be5-9094-8e73ba2c57a3\") " pod="calico-system/whisker-56f5bf9455-h6wbh" Mar 7 00:56:56.593382 kubelet[3248]: I0307 00:56:56.592730 3248 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d74h\" (UniqueName: \"kubernetes.io/projected/f837da8f-bc82-4be5-9094-8e73ba2c57a3-kube-api-access-4d74h\") pod \"whisker-56f5bf9455-h6wbh\" (UID: \"f837da8f-bc82-4be5-9094-8e73ba2c57a3\") " pod="calico-system/whisker-56f5bf9455-h6wbh" Mar 7 00:56:56.621615 systemd[1]: Started cri-containerd-1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877.scope - libcontainer container 1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877. Mar 7 00:56:56.781814 containerd[2014]: time="2026-03-07T00:56:56.779222637Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:56:56.781814 containerd[2014]: time="2026-03-07T00:56:56.779340669Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:56:56.781814 containerd[2014]: time="2026-03-07T00:56:56.780550497Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:56.791362 containerd[2014]: time="2026-03-07T00:56:56.786932829Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:56.835762 systemd-networkd[1864]: calic752211aee8: Link UP Mar 7 00:56:56.844832 systemd-networkd[1864]: calic752211aee8: Gained carrier Mar 7 00:56:56.861394 containerd[2014]: time="2026-03-07T00:56:56.861317529Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-56f5bf9455-h6wbh,Uid:f837da8f-bc82-4be5-9094-8e73ba2c57a3,Namespace:calico-system,Attempt:0,}" Mar 7 00:56:56.901738 systemd[1]: Started cri-containerd-ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8.scope - libcontainer container ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8. Mar 7 00:56:56.922235 containerd[2014]: 2026-03-07 00:56:55.634 [ERROR][4827] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:56:56.922235 containerd[2014]: 2026-03-07 00:56:55.716 [INFO][4827] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0 calico-apiserver-5f7f748cc6- calico-system 2a289909-3bfe-4f98-9e7f-8f97be86ab84 942 0 2026-03-07 00:56:29 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5f7f748cc6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-19-200 calico-apiserver-5f7f748cc6-fd6fv eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calic752211aee8 [] [] }} ContainerID="ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8" Namespace="calico-system" Pod="calico-apiserver-5f7f748cc6-fd6fv" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-" Mar 7 00:56:56.922235 containerd[2014]: 2026-03-07 
00:56:55.716 [INFO][4827] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8" Namespace="calico-system" Pod="calico-apiserver-5f7f748cc6-fd6fv" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0" Mar 7 00:56:56.922235 containerd[2014]: 2026-03-07 00:56:56.234 [INFO][4975] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8" HandleID="k8s-pod-network.ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0" Mar 7 00:56:56.922235 containerd[2014]: 2026-03-07 00:56:56.298 [INFO][4975] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8" HandleID="k8s-pod-network.ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40004640d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-19-200", "pod":"calico-apiserver-5f7f748cc6-fd6fv", "timestamp":"2026-03-07 00:56:56.234580026 +0000 UTC"}, Hostname:"ip-172-31-19-200", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400029a420)} Mar 7 00:56:56.922235 containerd[2014]: 2026-03-07 00:56:56.298 [INFO][4975] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:56.922235 containerd[2014]: 2026-03-07 00:56:56.300 [INFO][4975] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:56:56.922235 containerd[2014]: 2026-03-07 00:56:56.300 [INFO][4975] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-19-200' Mar 7 00:56:56.922235 containerd[2014]: 2026-03-07 00:56:56.316 [INFO][4975] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8" host="ip-172-31-19-200" Mar 7 00:56:56.922235 containerd[2014]: 2026-03-07 00:56:56.368 [INFO][4975] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-19-200" Mar 7 00:56:56.922235 containerd[2014]: 2026-03-07 00:56:56.510 [INFO][4975] ipam/ipam.go 526: Trying affinity for 192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:56.922235 containerd[2014]: 2026-03-07 00:56:56.563 [INFO][4975] ipam/ipam.go 160: Attempting to load block cidr=192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:56.922235 containerd[2014]: 2026-03-07 00:56:56.639 [INFO][4975] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:56.922235 containerd[2014]: 2026-03-07 00:56:56.641 [INFO][4975] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.31.128/26 handle="k8s-pod-network.ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8" host="ip-172-31-19-200" Mar 7 00:56:56.922235 containerd[2014]: 2026-03-07 00:56:56.675 [INFO][4975] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8 Mar 7 00:56:56.922235 containerd[2014]: 2026-03-07 00:56:56.722 [INFO][4975] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.31.128/26 handle="k8s-pod-network.ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8" host="ip-172-31-19-200" Mar 7 00:56:56.922235 containerd[2014]: 2026-03-07 00:56:56.779 [INFO][4975] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.31.132/26] block=192.168.31.128/26 
handle="k8s-pod-network.ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8" host="ip-172-31-19-200" Mar 7 00:56:56.922235 containerd[2014]: 2026-03-07 00:56:56.779 [INFO][4975] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.31.132/26] handle="k8s-pod-network.ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8" host="ip-172-31-19-200" Mar 7 00:56:56.922235 containerd[2014]: 2026-03-07 00:56:56.780 [INFO][4975] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:56.922235 containerd[2014]: 2026-03-07 00:56:56.782 [INFO][4975] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.31.132/26] IPv6=[] ContainerID="ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8" HandleID="k8s-pod-network.ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0" Mar 7 00:56:56.924906 containerd[2014]: 2026-03-07 00:56:56.802 [INFO][4827] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8" Namespace="calico-system" Pod="calico-apiserver-5f7f748cc6-fd6fv" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0", GenerateName:"calico-apiserver-5f7f748cc6-", Namespace:"calico-system", SelfLink:"", UID:"2a289909-3bfe-4f98-9e7f-8f97be86ab84", ResourceVersion:"942", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f7f748cc6", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"", Pod:"calico-apiserver-5f7f748cc6-fd6fv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.31.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calic752211aee8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:56.924906 containerd[2014]: 2026-03-07 00:56:56.804 [INFO][4827] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.31.132/32] ContainerID="ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8" Namespace="calico-system" Pod="calico-apiserver-5f7f748cc6-fd6fv" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0" Mar 7 00:56:56.924906 containerd[2014]: 2026-03-07 00:56:56.805 [INFO][4827] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic752211aee8 ContainerID="ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8" Namespace="calico-system" Pod="calico-apiserver-5f7f748cc6-fd6fv" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0" Mar 7 00:56:56.924906 containerd[2014]: 2026-03-07 00:56:56.871 [INFO][4827] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8" Namespace="calico-system" Pod="calico-apiserver-5f7f748cc6-fd6fv" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0" Mar 7 00:56:56.924906 containerd[2014]: 2026-03-07 00:56:56.872 [INFO][4827] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8" Namespace="calico-system" Pod="calico-apiserver-5f7f748cc6-fd6fv" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0", GenerateName:"calico-apiserver-5f7f748cc6-", Namespace:"calico-system", SelfLink:"", UID:"2a289909-3bfe-4f98-9e7f-8f97be86ab84", ResourceVersion:"942", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f7f748cc6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8", Pod:"calico-apiserver-5f7f748cc6-fd6fv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.31.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calic752211aee8", MAC:"1e:2d:5e:30:79:b0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:56.924906 containerd[2014]: 2026-03-07 00:56:56.912 [INFO][4827] cni-plugin/k8s.go 532: Wrote 
updated endpoint to datastore ContainerID="ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8" Namespace="calico-system" Pod="calico-apiserver-5f7f748cc6-fd6fv" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0" Mar 7 00:56:57.050392 systemd-networkd[1864]: cali05c656191a8: Link UP Mar 7 00:56:57.063569 systemd-networkd[1864]: cali05c656191a8: Gained carrier Mar 7 00:56:57.220422 containerd[2014]: 2026-03-07 00:56:55.959 [ERROR][4939] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:56:57.220422 containerd[2014]: 2026-03-07 00:56:56.063 [INFO][4939] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0 calico-apiserver-5f7f748cc6- calico-system 4134feb0-74d5-456d-b147-4d748b45b820 947 0 2026-03-07 00:56:29 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5f7f748cc6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-19-200 calico-apiserver-5f7f748cc6-267tw eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali05c656191a8 [] [] }} ContainerID="f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809" Namespace="calico-system" Pod="calico-apiserver-5f7f748cc6-267tw" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-" Mar 7 00:56:57.220422 containerd[2014]: 2026-03-07 00:56:56.064 [INFO][4939] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809" Namespace="calico-system" Pod="calico-apiserver-5f7f748cc6-267tw" 
WorkloadEndpoint="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0" Mar 7 00:56:57.220422 containerd[2014]: 2026-03-07 00:56:56.530 [INFO][5026] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809" HandleID="k8s-pod-network.f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0" Mar 7 00:56:57.220422 containerd[2014]: 2026-03-07 00:56:56.676 [INFO][5026] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809" HandleID="k8s-pod-network.f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000392cd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-19-200", "pod":"calico-apiserver-5f7f748cc6-267tw", "timestamp":"2026-03-07 00:56:56.530079895 +0000 UTC"}, Hostname:"ip-172-31-19-200", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40000e4dc0)} Mar 7 00:56:57.220422 containerd[2014]: 2026-03-07 00:56:56.676 [INFO][5026] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:57.220422 containerd[2014]: 2026-03-07 00:56:56.782 [INFO][5026] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:56:57.220422 containerd[2014]: 2026-03-07 00:56:56.783 [INFO][5026] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-19-200' Mar 7 00:56:57.220422 containerd[2014]: 2026-03-07 00:56:56.796 [INFO][5026] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809" host="ip-172-31-19-200" Mar 7 00:56:57.220422 containerd[2014]: 2026-03-07 00:56:56.843 [INFO][5026] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-19-200" Mar 7 00:56:57.220422 containerd[2014]: 2026-03-07 00:56:56.898 [INFO][5026] ipam/ipam.go 526: Trying affinity for 192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:57.220422 containerd[2014]: 2026-03-07 00:56:56.921 [INFO][5026] ipam/ipam.go 160: Attempting to load block cidr=192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:57.220422 containerd[2014]: 2026-03-07 00:56:56.935 [INFO][5026] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:57.220422 containerd[2014]: 2026-03-07 00:56:56.935 [INFO][5026] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.31.128/26 handle="k8s-pod-network.f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809" host="ip-172-31-19-200" Mar 7 00:56:57.220422 containerd[2014]: 2026-03-07 00:56:56.940 [INFO][5026] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809 Mar 7 00:56:57.220422 containerd[2014]: 2026-03-07 00:56:56.968 [INFO][5026] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.31.128/26 handle="k8s-pod-network.f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809" host="ip-172-31-19-200" Mar 7 00:56:57.220422 containerd[2014]: 2026-03-07 00:56:57.007 [INFO][5026] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.31.133/26] block=192.168.31.128/26 
handle="k8s-pod-network.f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809" host="ip-172-31-19-200" Mar 7 00:56:57.220422 containerd[2014]: 2026-03-07 00:56:57.008 [INFO][5026] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.31.133/26] handle="k8s-pod-network.f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809" host="ip-172-31-19-200" Mar 7 00:56:57.220422 containerd[2014]: 2026-03-07 00:56:57.009 [INFO][5026] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:57.220422 containerd[2014]: 2026-03-07 00:56:57.012 [INFO][5026] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.31.133/26] IPv6=[] ContainerID="f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809" HandleID="k8s-pod-network.f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0" Mar 7 00:56:57.222837 containerd[2014]: 2026-03-07 00:56:57.034 [INFO][4939] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809" Namespace="calico-system" Pod="calico-apiserver-5f7f748cc6-267tw" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0", GenerateName:"calico-apiserver-5f7f748cc6-", Namespace:"calico-system", SelfLink:"", UID:"4134feb0-74d5-456d-b147-4d748b45b820", ResourceVersion:"947", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f7f748cc6", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"", Pod:"calico-apiserver-5f7f748cc6-267tw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.31.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali05c656191a8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:57.222837 containerd[2014]: 2026-03-07 00:56:57.041 [INFO][4939] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.31.133/32] ContainerID="f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809" Namespace="calico-system" Pod="calico-apiserver-5f7f748cc6-267tw" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0" Mar 7 00:56:57.222837 containerd[2014]: 2026-03-07 00:56:57.041 [INFO][4939] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali05c656191a8 ContainerID="f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809" Namespace="calico-system" Pod="calico-apiserver-5f7f748cc6-267tw" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0" Mar 7 00:56:57.222837 containerd[2014]: 2026-03-07 00:56:57.068 [INFO][4939] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809" Namespace="calico-system" Pod="calico-apiserver-5f7f748cc6-267tw" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0" Mar 7 00:56:57.222837 containerd[2014]: 2026-03-07 00:56:57.080 [INFO][4939] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809" Namespace="calico-system" Pod="calico-apiserver-5f7f748cc6-267tw" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0", GenerateName:"calico-apiserver-5f7f748cc6-", Namespace:"calico-system", SelfLink:"", UID:"4134feb0-74d5-456d-b147-4d748b45b820", ResourceVersion:"947", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f7f748cc6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809", Pod:"calico-apiserver-5f7f748cc6-267tw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.31.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali05c656191a8", MAC:"66:e4:a5:95:8e:ac", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:57.222837 containerd[2014]: 2026-03-07 00:56:57.164 [INFO][4939] cni-plugin/k8s.go 532: Wrote 
updated endpoint to datastore ContainerID="f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809" Namespace="calico-system" Pod="calico-apiserver-5f7f748cc6-267tw" WorkloadEndpoint="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0" Mar 7 00:56:57.226852 containerd[2014]: time="2026-03-07T00:56:57.226499191Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:56:57.226852 containerd[2014]: time="2026-03-07T00:56:57.226602175Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:56:57.229682 containerd[2014]: time="2026-03-07T00:56:57.226639351Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:57.229682 containerd[2014]: time="2026-03-07T00:56:57.227327527Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:57.336311 systemd-networkd[1864]: cali5cf34869ee4: Link UP Mar 7 00:56:57.337257 systemd-networkd[1864]: cali5cf34869ee4: Gained carrier Mar 7 00:56:57.366899 containerd[2014]: time="2026-03-07T00:56:57.366018296Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-6s4mf,Uid:293cb0c8-aea0-4a14-b36c-9f3b77cd23c5,Namespace:kube-system,Attempt:1,} returns sandbox id \"ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8\"" Mar 7 00:56:57.375910 containerd[2014]: time="2026-03-07T00:56:57.374678228Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69f5fcbdf5-swnh8,Uid:d29570d8-449a-4a48-af9f-3d0d19534b53,Namespace:calico-system,Attempt:1,} returns sandbox id \"1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877\"" Mar 7 00:56:57.415931 containerd[2014]: time="2026-03-07T00:56:57.414991772Z" level=info msg="CreateContainer within sandbox \"ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 7 00:56:57.440497 containerd[2014]: 2026-03-07 00:56:55.780 [ERROR][4893] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:56:57.440497 containerd[2014]: 2026-03-07 00:56:55.918 [INFO][4893] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0 goldmane-5b85766d88- calico-system ab0a4538-d7e9-47a6-8763-f14115b5894f 945 0 2026-03-07 00:56:30 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:5b85766d88 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ip-172-31-19-200 
goldmane-5b85766d88-nbqjk eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali5cf34869ee4 [] [] }} ContainerID="9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e" Namespace="calico-system" Pod="goldmane-5b85766d88-nbqjk" WorkloadEndpoint="ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-" Mar 7 00:56:57.440497 containerd[2014]: 2026-03-07 00:56:55.918 [INFO][4893] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e" Namespace="calico-system" Pod="goldmane-5b85766d88-nbqjk" WorkloadEndpoint="ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0" Mar 7 00:56:57.440497 containerd[2014]: 2026-03-07 00:56:56.572 [INFO][5009] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e" HandleID="k8s-pod-network.9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e" Workload="ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0" Mar 7 00:56:57.440497 containerd[2014]: 2026-03-07 00:56:56.679 [INFO][5009] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e" HandleID="k8s-pod-network.9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e" Workload="ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000315610), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-19-200", "pod":"goldmane-5b85766d88-nbqjk", "timestamp":"2026-03-07 00:56:56.572298596 +0000 UTC"}, Hostname:"ip-172-31-19-200", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40001d8000)} Mar 7 00:56:57.440497 containerd[2014]: 2026-03-07 
00:56:56.679 [INFO][5009] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:57.440497 containerd[2014]: 2026-03-07 00:56:57.009 [INFO][5009] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:57.440497 containerd[2014]: 2026-03-07 00:56:57.009 [INFO][5009] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-19-200' Mar 7 00:56:57.440497 containerd[2014]: 2026-03-07 00:56:57.021 [INFO][5009] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e" host="ip-172-31-19-200" Mar 7 00:56:57.440497 containerd[2014]: 2026-03-07 00:56:57.049 [INFO][5009] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-19-200" Mar 7 00:56:57.440497 containerd[2014]: 2026-03-07 00:56:57.088 [INFO][5009] ipam/ipam.go 526: Trying affinity for 192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:57.440497 containerd[2014]: 2026-03-07 00:56:57.098 [INFO][5009] ipam/ipam.go 160: Attempting to load block cidr=192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:57.440497 containerd[2014]: 2026-03-07 00:56:57.127 [INFO][5009] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:57.440497 containerd[2014]: 2026-03-07 00:56:57.134 [INFO][5009] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.31.128/26 handle="k8s-pod-network.9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e" host="ip-172-31-19-200" Mar 7 00:56:57.440497 containerd[2014]: 2026-03-07 00:56:57.152 [INFO][5009] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e Mar 7 00:56:57.440497 containerd[2014]: 2026-03-07 00:56:57.202 [INFO][5009] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.31.128/26 
handle="k8s-pod-network.9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e" host="ip-172-31-19-200" Mar 7 00:56:57.440497 containerd[2014]: 2026-03-07 00:56:57.232 [INFO][5009] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.31.134/26] block=192.168.31.128/26 handle="k8s-pod-network.9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e" host="ip-172-31-19-200" Mar 7 00:56:57.440497 containerd[2014]: 2026-03-07 00:56:57.232 [INFO][5009] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.31.134/26] handle="k8s-pod-network.9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e" host="ip-172-31-19-200" Mar 7 00:56:57.440497 containerd[2014]: 2026-03-07 00:56:57.232 [INFO][5009] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:57.440497 containerd[2014]: 2026-03-07 00:56:57.232 [INFO][5009] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.31.134/26] IPv6=[] ContainerID="9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e" HandleID="k8s-pod-network.9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e" Workload="ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0" Mar 7 00:56:57.447433 containerd[2014]: 2026-03-07 00:56:57.266 [INFO][4893] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e" Namespace="calico-system" Pod="goldmane-5b85766d88-nbqjk" WorkloadEndpoint="ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"ab0a4538-d7e9-47a6-8763-f14115b5894f", ResourceVersion:"945", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 30, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"", Pod:"goldmane-5b85766d88-nbqjk", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.31.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali5cf34869ee4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:57.447433 containerd[2014]: 2026-03-07 00:56:57.266 [INFO][4893] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.31.134/32] ContainerID="9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e" Namespace="calico-system" Pod="goldmane-5b85766d88-nbqjk" WorkloadEndpoint="ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0" Mar 7 00:56:57.447433 containerd[2014]: 2026-03-07 00:56:57.266 [INFO][4893] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5cf34869ee4 ContainerID="9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e" Namespace="calico-system" Pod="goldmane-5b85766d88-nbqjk" WorkloadEndpoint="ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0" Mar 7 00:56:57.447433 containerd[2014]: 2026-03-07 00:56:57.340 [INFO][4893] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e" Namespace="calico-system" Pod="goldmane-5b85766d88-nbqjk" 
WorkloadEndpoint="ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0" Mar 7 00:56:57.447433 containerd[2014]: 2026-03-07 00:56:57.360 [INFO][4893] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e" Namespace="calico-system" Pod="goldmane-5b85766d88-nbqjk" WorkloadEndpoint="ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"ab0a4538-d7e9-47a6-8763-f14115b5894f", ResourceVersion:"945", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e", Pod:"goldmane-5b85766d88-nbqjk", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.31.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali5cf34869ee4", MAC:"2e:56:e9:c6:47:3c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:57.447433 containerd[2014]: 2026-03-07 
00:56:57.413 [INFO][4893] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e" Namespace="calico-system" Pod="goldmane-5b85766d88-nbqjk" WorkloadEndpoint="ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0" Mar 7 00:56:57.498664 systemd[1]: Started cri-containerd-ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8.scope - libcontainer container ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8. Mar 7 00:56:57.522020 systemd-networkd[1864]: calibb72ba20bb8: Link UP Mar 7 00:56:57.553751 systemd-networkd[1864]: calide84ecc1233: Gained IPv6LL Mar 7 00:56:57.555148 systemd-networkd[1864]: calibb72ba20bb8: Gained carrier Mar 7 00:56:57.599565 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3454325878.mount: Deactivated successfully. Mar 7 00:56:57.627054 containerd[2014]: time="2026-03-07T00:56:57.626891205Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:56:57.628625 containerd[2014]: time="2026-03-07T00:56:57.628367289Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:56:57.630533 containerd[2014]: time="2026-03-07T00:56:57.629037201Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:57.644892 containerd[2014]: 2026-03-07 00:56:55.946 [ERROR][4933] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:56:57.644892 containerd[2014]: 2026-03-07 00:56:56.012 [INFO][4933] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0 coredns-674b8bbfcf- kube-system d7c61432-fbce-407b-bac9-6d0016af3dc6 946 0 2026-03-07 00:56:11 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-19-200 coredns-674b8bbfcf-cxswj eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calibb72ba20bb8 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5" Namespace="kube-system" Pod="coredns-674b8bbfcf-cxswj" WorkloadEndpoint="ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-" Mar 7 00:56:57.644892 containerd[2014]: 2026-03-07 00:56:56.017 [INFO][4933] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5" Namespace="kube-system" Pod="coredns-674b8bbfcf-cxswj" WorkloadEndpoint="ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0" Mar 7 00:56:57.644892 containerd[2014]: 2026-03-07 00:56:56.678 [INFO][5018] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5" HandleID="k8s-pod-network.f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0" Mar 7 00:56:57.644892 
containerd[2014]: 2026-03-07 00:56:56.724 [INFO][5018] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5" HandleID="k8s-pod-network.f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000373a70), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-19-200", "pod":"coredns-674b8bbfcf-cxswj", "timestamp":"2026-03-07 00:56:56.678792728 +0000 UTC"}, Hostname:"ip-172-31-19-200", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000186840)} Mar 7 00:56:57.644892 containerd[2014]: 2026-03-07 00:56:56.724 [INFO][5018] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:56:57.644892 containerd[2014]: 2026-03-07 00:56:57.236 [INFO][5018] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:56:57.644892 containerd[2014]: 2026-03-07 00:56:57.236 [INFO][5018] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-19-200' Mar 7 00:56:57.644892 containerd[2014]: 2026-03-07 00:56:57.256 [INFO][5018] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5" host="ip-172-31-19-200" Mar 7 00:56:57.644892 containerd[2014]: 2026-03-07 00:56:57.289 [INFO][5018] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-19-200" Mar 7 00:56:57.644892 containerd[2014]: 2026-03-07 00:56:57.333 [INFO][5018] ipam/ipam.go 526: Trying affinity for 192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:57.644892 containerd[2014]: 2026-03-07 00:56:57.380 [INFO][5018] ipam/ipam.go 160: Attempting to load block cidr=192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:57.644892 containerd[2014]: 2026-03-07 00:56:57.394 [INFO][5018] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:57.644892 containerd[2014]: 2026-03-07 00:56:57.399 [INFO][5018] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.31.128/26 handle="k8s-pod-network.f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5" host="ip-172-31-19-200" Mar 7 00:56:57.644892 containerd[2014]: 2026-03-07 00:56:57.415 [INFO][5018] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5 Mar 7 00:56:57.644892 containerd[2014]: 2026-03-07 00:56:57.440 [INFO][5018] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.31.128/26 handle="k8s-pod-network.f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5" host="ip-172-31-19-200" Mar 7 00:56:57.644892 containerd[2014]: 2026-03-07 00:56:57.469 [INFO][5018] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.31.135/26] block=192.168.31.128/26 
handle="k8s-pod-network.f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5" host="ip-172-31-19-200" Mar 7 00:56:57.644892 containerd[2014]: 2026-03-07 00:56:57.469 [INFO][5018] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.31.135/26] handle="k8s-pod-network.f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5" host="ip-172-31-19-200" Mar 7 00:56:57.644892 containerd[2014]: 2026-03-07 00:56:57.469 [INFO][5018] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:57.644892 containerd[2014]: 2026-03-07 00:56:57.469 [INFO][5018] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.31.135/26] IPv6=[] ContainerID="f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5" HandleID="k8s-pod-network.f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0" Mar 7 00:56:57.653563 containerd[2014]: 2026-03-07 00:56:57.499 [INFO][4933] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5" Namespace="kube-system" Pod="coredns-674b8bbfcf-cxswj" WorkloadEndpoint="ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"d7c61432-fbce-407b-bac9-6d0016af3dc6", ResourceVersion:"946", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"", Pod:"coredns-674b8bbfcf-cxswj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.31.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibb72ba20bb8", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:57.653563 containerd[2014]: 2026-03-07 00:56:57.499 [INFO][4933] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.31.135/32] ContainerID="f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5" Namespace="kube-system" Pod="coredns-674b8bbfcf-cxswj" WorkloadEndpoint="ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0" Mar 7 00:56:57.653563 containerd[2014]: 2026-03-07 00:56:57.500 [INFO][4933] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibb72ba20bb8 ContainerID="f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5" Namespace="kube-system" Pod="coredns-674b8bbfcf-cxswj" WorkloadEndpoint="ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0" Mar 7 00:56:57.653563 containerd[2014]: 2026-03-07 00:56:57.561 [INFO][4933] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-cxswj" WorkloadEndpoint="ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0" Mar 7 00:56:57.653563 containerd[2014]: 2026-03-07 00:56:57.587 [INFO][4933] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5" Namespace="kube-system" Pod="coredns-674b8bbfcf-cxswj" WorkloadEndpoint="ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"d7c61432-fbce-407b-bac9-6d0016af3dc6", ResourceVersion:"946", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5", Pod:"coredns-674b8bbfcf-cxswj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.31.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibb72ba20bb8", MAC:"82:12:59:01:b1:a8", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:57.653563 containerd[2014]: 2026-03-07 00:56:57.615 [INFO][4933] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5" Namespace="kube-system" Pod="coredns-674b8bbfcf-cxswj" WorkloadEndpoint="ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0" Mar 7 00:56:57.653563 containerd[2014]: time="2026-03-07T00:56:57.648963081Z" level=info msg="CreateContainer within sandbox \"ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"32351a1961e23f52d41ae02b6a82e3667be485f801628b9e7d8746ebd59a54b3\"" Mar 7 00:56:57.653563 containerd[2014]: time="2026-03-07T00:56:57.650578161Z" level=info msg="StartContainer for \"32351a1961e23f52d41ae02b6a82e3667be485f801628b9e7d8746ebd59a54b3\"" Mar 7 00:56:57.657414 containerd[2014]: time="2026-03-07T00:56:57.656523189Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:57.723409 containerd[2014]: time="2026-03-07T00:56:57.716940729Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:56:57.723409 containerd[2014]: time="2026-03-07T00:56:57.717992097Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:56:57.723409 containerd[2014]: time="2026-03-07T00:56:57.718041681Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:57.727827 containerd[2014]: time="2026-03-07T00:56:57.721478901Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:57.808085 kubelet[3248]: I0307 00:56:57.806690 3248 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f36088b-c8bd-4e60-bd21-33bf72d4d138" path="/var/lib/kubelet/pods/2f36088b-c8bd-4e60-bd21-33bf72d4d138/volumes" Mar 7 00:56:57.873549 systemd[1]: Started cri-containerd-9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e.scope - libcontainer container 9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e. Mar 7 00:56:57.883120 systemd[1]: Started cri-containerd-f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809.scope - libcontainer container f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809. Mar 7 00:56:57.946907 systemd[1]: Started cri-containerd-32351a1961e23f52d41ae02b6a82e3667be485f801628b9e7d8746ebd59a54b3.scope - libcontainer container 32351a1961e23f52d41ae02b6a82e3667be485f801628b9e7d8746ebd59a54b3. Mar 7 00:56:57.998461 containerd[2014]: time="2026-03-07T00:56:57.997958495Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:56:57.998461 containerd[2014]: time="2026-03-07T00:56:57.998117855Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:56:58.000339 containerd[2014]: time="2026-03-07T00:56:57.999419531Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:58.003095 containerd[2014]: time="2026-03-07T00:56:58.000145555Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:58.093737 systemd[1]: Started cri-containerd-f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5.scope - libcontainer container f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5. Mar 7 00:56:58.127384 containerd[2014]: time="2026-03-07T00:56:58.126014263Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f7f748cc6-fd6fv,Uid:2a289909-3bfe-4f98-9e7f-8f97be86ab84,Namespace:calico-system,Attempt:1,} returns sandbox id \"ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8\"" Mar 7 00:56:58.177169 systemd-networkd[1864]: calib7c31188683: Link UP Mar 7 00:56:58.180792 systemd-networkd[1864]: calib7c31188683: Gained carrier Mar 7 00:56:58.272078 containerd[2014]: time="2026-03-07T00:56:58.271736888Z" level=info msg="StartContainer for \"32351a1961e23f52d41ae02b6a82e3667be485f801628b9e7d8746ebd59a54b3\" returns successfully" Mar 7 00:56:58.295804 containerd[2014]: 2026-03-07 00:56:57.244 [ERROR][5134] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:56:58.295804 containerd[2014]: 2026-03-07 00:56:57.379 [INFO][5134] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--19--200-k8s-whisker--56f5bf9455--h6wbh-eth0 whisker-56f5bf9455- calico-system f837da8f-bc82-4be5-9094-8e73ba2c57a3 976 0 2026-03-07 00:56:56 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:56f5bf9455 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ip-172-31-19-200 whisker-56f5bf9455-h6wbh eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calib7c31188683 [] [] }} 
ContainerID="7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8" Namespace="calico-system" Pod="whisker-56f5bf9455-h6wbh" WorkloadEndpoint="ip--172--31--19--200-k8s-whisker--56f5bf9455--h6wbh-" Mar 7 00:56:58.295804 containerd[2014]: 2026-03-07 00:56:57.385 [INFO][5134] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8" Namespace="calico-system" Pod="whisker-56f5bf9455-h6wbh" WorkloadEndpoint="ip--172--31--19--200-k8s-whisker--56f5bf9455--h6wbh-eth0" Mar 7 00:56:58.295804 containerd[2014]: 2026-03-07 00:56:57.824 [INFO][5206] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8" HandleID="k8s-pod-network.7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8" Workload="ip--172--31--19--200-k8s-whisker--56f5bf9455--h6wbh-eth0" Mar 7 00:56:58.295804 containerd[2014]: 2026-03-07 00:56:57.951 [INFO][5206] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8" HandleID="k8s-pod-network.7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8" Workload="ip--172--31--19--200-k8s-whisker--56f5bf9455--h6wbh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000467010), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-19-200", "pod":"whisker-56f5bf9455-h6wbh", "timestamp":"2026-03-07 00:56:57.824966806 +0000 UTC"}, Hostname:"ip-172-31-19-200", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40002ae420)} Mar 7 00:56:58.295804 containerd[2014]: 2026-03-07 00:56:57.951 [INFO][5206] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 7 00:56:58.295804 containerd[2014]: 2026-03-07 00:56:57.951 [INFO][5206] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:56:58.295804 containerd[2014]: 2026-03-07 00:56:57.952 [INFO][5206] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-19-200' Mar 7 00:56:58.295804 containerd[2014]: 2026-03-07 00:56:57.963 [INFO][5206] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8" host="ip-172-31-19-200" Mar 7 00:56:58.295804 containerd[2014]: 2026-03-07 00:56:57.984 [INFO][5206] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-19-200" Mar 7 00:56:58.295804 containerd[2014]: 2026-03-07 00:56:58.019 [INFO][5206] ipam/ipam.go 526: Trying affinity for 192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:58.295804 containerd[2014]: 2026-03-07 00:56:58.028 [INFO][5206] ipam/ipam.go 160: Attempting to load block cidr=192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:58.295804 containerd[2014]: 2026-03-07 00:56:58.037 [INFO][5206] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.31.128/26 host="ip-172-31-19-200" Mar 7 00:56:58.295804 containerd[2014]: 2026-03-07 00:56:58.037 [INFO][5206] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.31.128/26 handle="k8s-pod-network.7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8" host="ip-172-31-19-200" Mar 7 00:56:58.295804 containerd[2014]: 2026-03-07 00:56:58.041 [INFO][5206] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8 Mar 7 00:56:58.295804 containerd[2014]: 2026-03-07 00:56:58.054 [INFO][5206] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.31.128/26 handle="k8s-pod-network.7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8" host="ip-172-31-19-200" Mar 7 00:56:58.295804 containerd[2014]: 
2026-03-07 00:56:58.098 [INFO][5206] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.31.136/26] block=192.168.31.128/26 handle="k8s-pod-network.7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8" host="ip-172-31-19-200" Mar 7 00:56:58.295804 containerd[2014]: 2026-03-07 00:56:58.098 [INFO][5206] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.31.136/26] handle="k8s-pod-network.7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8" host="ip-172-31-19-200" Mar 7 00:56:58.295804 containerd[2014]: 2026-03-07 00:56:58.099 [INFO][5206] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:56:58.295804 containerd[2014]: 2026-03-07 00:56:58.100 [INFO][5206] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.31.136/26] IPv6=[] ContainerID="7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8" HandleID="k8s-pod-network.7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8" Workload="ip--172--31--19--200-k8s-whisker--56f5bf9455--h6wbh-eth0" Mar 7 00:56:58.299784 containerd[2014]: 2026-03-07 00:56:58.135 [INFO][5134] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8" Namespace="calico-system" Pod="whisker-56f5bf9455-h6wbh" WorkloadEndpoint="ip--172--31--19--200-k8s-whisker--56f5bf9455--h6wbh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-whisker--56f5bf9455--h6wbh-eth0", GenerateName:"whisker-56f5bf9455-", Namespace:"calico-system", SelfLink:"", UID:"f837da8f-bc82-4be5-9094-8e73ba2c57a3", ResourceVersion:"976", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"56f5bf9455", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"", Pod:"whisker-56f5bf9455-h6wbh", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.31.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calib7c31188683", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:58.299784 containerd[2014]: 2026-03-07 00:56:58.135 [INFO][5134] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.31.136/32] ContainerID="7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8" Namespace="calico-system" Pod="whisker-56f5bf9455-h6wbh" WorkloadEndpoint="ip--172--31--19--200-k8s-whisker--56f5bf9455--h6wbh-eth0" Mar 7 00:56:58.299784 containerd[2014]: 2026-03-07 00:56:58.145 [INFO][5134] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib7c31188683 ContainerID="7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8" Namespace="calico-system" Pod="whisker-56f5bf9455-h6wbh" WorkloadEndpoint="ip--172--31--19--200-k8s-whisker--56f5bf9455--h6wbh-eth0" Mar 7 00:56:58.299784 containerd[2014]: 2026-03-07 00:56:58.186 [INFO][5134] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8" Namespace="calico-system" Pod="whisker-56f5bf9455-h6wbh" WorkloadEndpoint="ip--172--31--19--200-k8s-whisker--56f5bf9455--h6wbh-eth0" Mar 7 00:56:58.299784 containerd[2014]: 2026-03-07 00:56:58.209 [INFO][5134] cni-plugin/k8s.go 446: Added Mac, interface name, and 
active container ID to endpoint ContainerID="7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8" Namespace="calico-system" Pod="whisker-56f5bf9455-h6wbh" WorkloadEndpoint="ip--172--31--19--200-k8s-whisker--56f5bf9455--h6wbh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-whisker--56f5bf9455--h6wbh-eth0", GenerateName:"whisker-56f5bf9455-", Namespace:"calico-system", SelfLink:"", UID:"f837da8f-bc82-4be5-9094-8e73ba2c57a3", ResourceVersion:"976", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"56f5bf9455", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8", Pod:"whisker-56f5bf9455-h6wbh", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.31.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calib7c31188683", MAC:"ea:d4:3c:6c:87:8a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:56:58.299784 containerd[2014]: 2026-03-07 00:56:58.257 [INFO][5134] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8" Namespace="calico-system" Pod="whisker-56f5bf9455-h6wbh" 
WorkloadEndpoint="ip--172--31--19--200-k8s-whisker--56f5bf9455--h6wbh-eth0" Mar 7 00:56:58.315128 systemd-networkd[1864]: calic6cc9e85ad1: Gained IPv6LL Mar 7 00:56:58.396971 containerd[2014]: time="2026-03-07T00:56:58.396567705Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-cxswj,Uid:d7c61432-fbce-407b-bac9-6d0016af3dc6,Namespace:kube-system,Attempt:1,} returns sandbox id \"f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5\"" Mar 7 00:56:58.410712 containerd[2014]: time="2026-03-07T00:56:58.410215341Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:56:58.417240 kubelet[3248]: I0307 00:56:58.415649 3248 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-6s4mf" podStartSLOduration=47.415621689 podStartE2EDuration="47.415621689s" podCreationTimestamp="2026-03-07 00:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:56:58.405173349 +0000 UTC m=+50.923199510" watchObservedRunningTime="2026-03-07 00:56:58.415621689 +0000 UTC m=+50.933647802" Mar 7 00:56:58.417426 containerd[2014]: time="2026-03-07T00:56:58.410523369Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:56:58.417426 containerd[2014]: time="2026-03-07T00:56:58.414752889Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:58.417426 containerd[2014]: time="2026-03-07T00:56:58.415061061Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:56:58.431158 containerd[2014]: time="2026-03-07T00:56:58.431068605Z" level=info msg="CreateContainer within sandbox \"f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 7 00:56:58.442919 systemd-networkd[1864]: calic752211aee8: Gained IPv6LL Mar 7 00:56:58.494776 containerd[2014]: time="2026-03-07T00:56:58.494159745Z" level=info msg="CreateContainer within sandbox \"f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"8d68cfc373b2fe1729e00e65e2704ea255bf1a4da17ec325f5079416ffd8e4ad\"" Mar 7 00:56:58.495604 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2421716907.mount: Deactivated successfully. Mar 7 00:56:58.497111 containerd[2014]: time="2026-03-07T00:56:58.497029953Z" level=info msg="StartContainer for \"8d68cfc373b2fe1729e00e65e2704ea255bf1a4da17ec325f5079416ffd8e4ad\"" Mar 7 00:56:58.622731 systemd[1]: Started cri-containerd-7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8.scope - libcontainer container 7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8. Mar 7 00:56:58.695636 systemd[1]: Started cri-containerd-8d68cfc373b2fe1729e00e65e2704ea255bf1a4da17ec325f5079416ffd8e4ad.scope - libcontainer container 8d68cfc373b2fe1729e00e65e2704ea255bf1a4da17ec325f5079416ffd8e4ad. 
Mar 7 00:56:58.710849 containerd[2014]: time="2026-03-07T00:56:58.708003370Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f7f748cc6-267tw,Uid:4134feb0-74d5-456d-b147-4d748b45b820,Namespace:calico-system,Attempt:1,} returns sandbox id \"f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809\"" Mar 7 00:56:58.764132 systemd-networkd[1864]: cali05c656191a8: Gained IPv6LL Mar 7 00:56:58.865699 containerd[2014]: time="2026-03-07T00:56:58.865038239Z" level=info msg="StartContainer for \"8d68cfc373b2fe1729e00e65e2704ea255bf1a4da17ec325f5079416ffd8e4ad\" returns successfully" Mar 7 00:56:58.894226 containerd[2014]: time="2026-03-07T00:56:58.892312991Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-nbqjk,Uid:ab0a4538-d7e9-47a6-8763-f14115b5894f,Namespace:calico-system,Attempt:1,} returns sandbox id \"9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e\"" Mar 7 00:56:59.004976 containerd[2014]: time="2026-03-07T00:56:59.004836992Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:59.009306 containerd[2014]: time="2026-03-07T00:56:59.009250592Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8261497" Mar 7 00:56:59.012338 containerd[2014]: time="2026-03-07T00:56:59.012278072Z" level=info msg="ImageCreate event name:\"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:59.022400 containerd[2014]: time="2026-03-07T00:56:59.021804056Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:56:59.024093 containerd[2014]: time="2026-03-07T00:56:59.024034856Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"9659022\" in 3.581391486s" Mar 7 00:56:59.024341 containerd[2014]: time="2026-03-07T00:56:59.024308900Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\"" Mar 7 00:56:59.031073 containerd[2014]: time="2026-03-07T00:56:59.030502232Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 7 00:56:59.038327 containerd[2014]: time="2026-03-07T00:56:59.038256560Z" level=info msg="CreateContainer within sandbox \"68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 7 00:56:59.125237 containerd[2014]: time="2026-03-07T00:56:59.123561476Z" level=info msg="CreateContainer within sandbox \"68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"97f6c1eed083fa2cc9141c249742a3acde2e39fe8697e78a50d08cc01c28575a\"" Mar 7 00:56:59.135904 containerd[2014]: time="2026-03-07T00:56:59.134500928Z" level=info msg="StartContainer for \"97f6c1eed083fa2cc9141c249742a3acde2e39fe8697e78a50d08cc01c28575a\"" Mar 7 00:56:59.149524 systemd-networkd[1864]: cali5cf34869ee4: Gained IPv6LL Mar 7 00:56:59.170857 containerd[2014]: time="2026-03-07T00:56:59.170783384Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-56f5bf9455-h6wbh,Uid:f837da8f-bc82-4be5-9094-8e73ba2c57a3,Namespace:calico-system,Attempt:0,} returns sandbox id \"7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8\"" Mar 7 00:56:59.199737 kubelet[3248]: E0307 00:56:59.199666 3248 
cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf837da8f_bc82_4be5_9094_8e73ba2c57a3.slice/cri-containerd-7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8.scope\": RecentStats: unable to find data in memory cache]" Mar 7 00:56:59.212034 systemd-networkd[1864]: calib7c31188683: Gained IPv6LL Mar 7 00:56:59.311738 systemd[1]: run-containerd-runc-k8s.io-97f6c1eed083fa2cc9141c249742a3acde2e39fe8697e78a50d08cc01c28575a-runc.HM3Oa6.mount: Deactivated successfully. Mar 7 00:56:59.326540 systemd[1]: Started cri-containerd-97f6c1eed083fa2cc9141c249742a3acde2e39fe8697e78a50d08cc01c28575a.scope - libcontainer container 97f6c1eed083fa2cc9141c249742a3acde2e39fe8697e78a50d08cc01c28575a. Mar 7 00:56:59.338487 systemd-networkd[1864]: calibb72ba20bb8: Gained IPv6LL Mar 7 00:56:59.458822 containerd[2014]: time="2026-03-07T00:56:59.458491870Z" level=info msg="StartContainer for \"97f6c1eed083fa2cc9141c249742a3acde2e39fe8697e78a50d08cc01c28575a\" returns successfully" Mar 7 00:56:59.494725 kubelet[3248]: I0307 00:56:59.494593 3248 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-cxswj" podStartSLOduration=48.49453915 podStartE2EDuration="48.49453915s" podCreationTimestamp="2026-03-07 00:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:56:59.45772849 +0000 UTC m=+51.975754639" watchObservedRunningTime="2026-03-07 00:56:59.49453915 +0000 UTC m=+52.012565287" Mar 7 00:56:59.655238 kernel: calico-node[5511]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 7 00:57:00.399802 systemd-networkd[1864]: vxlan.calico: Link UP Mar 7 00:57:00.399822 systemd-networkd[1864]: vxlan.calico: Gained carrier Mar 7 00:57:01.451034 systemd-networkd[1864]: vxlan.calico: Gained IPv6LL Mar 7 
00:57:02.333764 systemd[1]: Started sshd@7-172.31.19.200:22-20.161.92.111:45512.service - OpenSSH per-connection server daemon (20.161.92.111:45512). Mar 7 00:57:02.804726 kubelet[3248]: I0307 00:57:02.804682 3248 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 7 00:57:02.906388 sshd[5663]: Accepted publickey for core from 20.161.92.111 port 45512 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:57:02.911520 sshd[5663]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:57:02.930310 systemd-logind[2003]: New session 8 of user core. Mar 7 00:57:02.936656 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 7 00:57:02.968223 containerd[2014]: time="2026-03-07T00:57:02.965028939Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:57:02.968784 containerd[2014]: time="2026-03-07T00:57:02.968389023Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=49189955" Mar 7 00:57:02.969690 containerd[2014]: time="2026-03-07T00:57:02.969635523Z" level=info msg="ImageCreate event name:\"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:57:02.976660 containerd[2014]: time="2026-03-07T00:57:02.976564023Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:57:02.978945 containerd[2014]: time="2026-03-07T00:57:02.978884439Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", 
repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"50587448\" in 3.948316123s" Mar 7 00:57:02.979156 containerd[2014]: time="2026-03-07T00:57:02.979125027Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\"" Mar 7 00:57:02.984294 containerd[2014]: time="2026-03-07T00:57:02.984238791Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 7 00:57:03.036657 containerd[2014]: time="2026-03-07T00:57:03.036557376Z" level=info msg="CreateContainer within sandbox \"1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 7 00:57:03.085829 containerd[2014]: time="2026-03-07T00:57:03.085652784Z" level=info msg="CreateContainer within sandbox \"1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"09dc441534efc587a70fa9b2987c2a2dd5bc6bb33e10a40ad1e29281d3359e5b\"" Mar 7 00:57:03.094231 containerd[2014]: time="2026-03-07T00:57:03.091458468Z" level=info msg="StartContainer for \"09dc441534efc587a70fa9b2987c2a2dd5bc6bb33e10a40ad1e29281d3359e5b\"" Mar 7 00:57:03.255057 systemd[1]: Started cri-containerd-09dc441534efc587a70fa9b2987c2a2dd5bc6bb33e10a40ad1e29281d3359e5b.scope - libcontainer container 09dc441534efc587a70fa9b2987c2a2dd5bc6bb33e10a40ad1e29281d3359e5b. Mar 7 00:57:03.468868 containerd[2014]: time="2026-03-07T00:57:03.468664646Z" level=info msg="StartContainer for \"09dc441534efc587a70fa9b2987c2a2dd5bc6bb33e10a40ad1e29281d3359e5b\" returns successfully" Mar 7 00:57:03.518631 sshd[5663]: pam_unix(sshd:session): session closed for user core Mar 7 00:57:03.529614 systemd[1]: sshd@7-172.31.19.200:22-20.161.92.111:45512.service: Deactivated successfully. 
Mar 7 00:57:03.542983 systemd[1]: session-8.scope: Deactivated successfully. Mar 7 00:57:03.550089 systemd-logind[2003]: Session 8 logged out. Waiting for processes to exit. Mar 7 00:57:03.553578 systemd-logind[2003]: Removed session 8. Mar 7 00:57:03.999283 ntpd[1996]: Listen normally on 8 vxlan.calico 192.168.31.128:123 Mar 7 00:57:04.000846 ntpd[1996]: 7 Mar 00:57:03 ntpd[1996]: Listen normally on 8 vxlan.calico 192.168.31.128:123 Mar 7 00:57:04.000846 ntpd[1996]: 7 Mar 00:57:03 ntpd[1996]: Listen normally on 9 cali4abbca1f96f [fe80::ecee:eeff:feee:eeee%4]:123 Mar 7 00:57:04.000846 ntpd[1996]: 7 Mar 00:57:03 ntpd[1996]: Listen normally on 10 calide84ecc1233 [fe80::ecee:eeff:feee:eeee%5]:123 Mar 7 00:57:04.000846 ntpd[1996]: 7 Mar 00:57:03 ntpd[1996]: Listen normally on 11 calic6cc9e85ad1 [fe80::ecee:eeff:feee:eeee%6]:123 Mar 7 00:57:04.000846 ntpd[1996]: 7 Mar 00:57:03 ntpd[1996]: Listen normally on 12 calic752211aee8 [fe80::ecee:eeff:feee:eeee%7]:123 Mar 7 00:57:04.000846 ntpd[1996]: 7 Mar 00:57:03 ntpd[1996]: Listen normally on 13 cali05c656191a8 [fe80::ecee:eeff:feee:eeee%8]:123 Mar 7 00:57:04.000846 ntpd[1996]: 7 Mar 00:57:03 ntpd[1996]: Listen normally on 14 cali5cf34869ee4 [fe80::ecee:eeff:feee:eeee%9]:123 Mar 7 00:57:04.000846 ntpd[1996]: 7 Mar 00:57:03 ntpd[1996]: Listen normally on 15 calibb72ba20bb8 [fe80::ecee:eeff:feee:eeee%10]:123 Mar 7 00:57:04.000846 ntpd[1996]: 7 Mar 00:57:03 ntpd[1996]: Listen normally on 16 calib7c31188683 [fe80::ecee:eeff:feee:eeee%11]:123 Mar 7 00:57:04.000846 ntpd[1996]: 7 Mar 00:57:03 ntpd[1996]: Listen normally on 17 vxlan.calico [fe80::6441:fff:fe5b:5476%12]:123 Mar 7 00:57:03.999444 ntpd[1996]: Listen normally on 9 cali4abbca1f96f [fe80::ecee:eeff:feee:eeee%4]:123 Mar 7 00:57:03.999535 ntpd[1996]: Listen normally on 10 calide84ecc1233 [fe80::ecee:eeff:feee:eeee%5]:123 Mar 7 00:57:03.999607 ntpd[1996]: Listen normally on 11 calic6cc9e85ad1 [fe80::ecee:eeff:feee:eeee%6]:123 Mar 7 00:57:03.999676 ntpd[1996]: Listen normally 
on 12 calic752211aee8 [fe80::ecee:eeff:feee:eeee%7]:123 Mar 7 00:57:03.999745 ntpd[1996]: Listen normally on 13 cali05c656191a8 [fe80::ecee:eeff:feee:eeee%8]:123 Mar 7 00:57:03.999815 ntpd[1996]: Listen normally on 14 cali5cf34869ee4 [fe80::ecee:eeff:feee:eeee%9]:123 Mar 7 00:57:03.999891 ntpd[1996]: Listen normally on 15 calibb72ba20bb8 [fe80::ecee:eeff:feee:eeee%10]:123 Mar 7 00:57:03.999959 ntpd[1996]: Listen normally on 16 calib7c31188683 [fe80::ecee:eeff:feee:eeee%11]:123 Mar 7 00:57:04.000030 ntpd[1996]: Listen normally on 17 vxlan.calico [fe80::6441:fff:fe5b:5476%12]:123 Mar 7 00:57:04.532343 kubelet[3248]: I0307 00:57:04.531564 3248 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-69f5fcbdf5-swnh8" podStartSLOduration=24.93658182 podStartE2EDuration="30.531536343s" podCreationTimestamp="2026-03-07 00:56:34 +0000 UTC" firstStartedPulling="2026-03-07 00:56:57.387998048 +0000 UTC m=+49.906024161" lastFinishedPulling="2026-03-07 00:57:02.982952559 +0000 UTC m=+55.500978684" observedRunningTime="2026-03-07 00:57:04.529498071 +0000 UTC m=+57.047524208" watchObservedRunningTime="2026-03-07 00:57:04.531536343 +0000 UTC m=+57.049562480" Mar 7 00:57:07.454469 containerd[2014]: time="2026-03-07T00:57:07.453589842Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:57:07.457171 containerd[2014]: time="2026-03-07T00:57:07.456857526Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=45552315" Mar 7 00:57:07.459317 containerd[2014]: time="2026-03-07T00:57:07.459245634Z" level=info msg="ImageCreate event name:\"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:57:07.465789 containerd[2014]: time="2026-03-07T00:57:07.465688554Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:57:07.467602 containerd[2014]: time="2026-03-07T00:57:07.467507142Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 4.482774383s" Mar 7 00:57:07.467602 containerd[2014]: time="2026-03-07T00:57:07.467573382Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Mar 7 00:57:07.472460 containerd[2014]: time="2026-03-07T00:57:07.471215670Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 7 00:57:07.478066 containerd[2014]: time="2026-03-07T00:57:07.478009578Z" level=info msg="CreateContainer within sandbox \"ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 7 00:57:07.515133 containerd[2014]: time="2026-03-07T00:57:07.514974210Z" level=info msg="CreateContainer within sandbox \"ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"0f3425c71507eff0470ddd5e8947966209974066236779dba497cb81ae5dc529\"" Mar 7 00:57:07.517331 containerd[2014]: time="2026-03-07T00:57:07.516525150Z" level=info msg="StartContainer for \"0f3425c71507eff0470ddd5e8947966209974066236779dba497cb81ae5dc529\"" Mar 7 00:57:07.584921 systemd[1]: Started cri-containerd-0f3425c71507eff0470ddd5e8947966209974066236779dba497cb81ae5dc529.scope - libcontainer container 
0f3425c71507eff0470ddd5e8947966209974066236779dba497cb81ae5dc529. Mar 7 00:57:07.683010 containerd[2014]: time="2026-03-07T00:57:07.682815763Z" level=info msg="StartContainer for \"0f3425c71507eff0470ddd5e8947966209974066236779dba497cb81ae5dc529\" returns successfully" Mar 7 00:57:07.712491 containerd[2014]: time="2026-03-07T00:57:07.712253359Z" level=info msg="StopPodSandbox for \"0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0\"" Mar 7 00:57:07.836432 containerd[2014]: time="2026-03-07T00:57:07.836346896Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:57:07.841816 containerd[2014]: time="2026-03-07T00:57:07.841708928Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Mar 7 00:57:07.865861 containerd[2014]: time="2026-03-07T00:57:07.864110612Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 392.830166ms" Mar 7 00:57:07.865861 containerd[2014]: time="2026-03-07T00:57:07.864279416Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Mar 7 00:57:07.872080 containerd[2014]: time="2026-03-07T00:57:07.871894628Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 7 00:57:07.890904 containerd[2014]: time="2026-03-07T00:57:07.890258444Z" level=info msg="CreateContainer within sandbox \"f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 7 
00:57:07.981790 containerd[2014]: time="2026-03-07T00:57:07.981483968Z" level=info msg="CreateContainer within sandbox \"f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"30d753094258a7f114879d97a0490ad1fe20228373b316b8ac160cb491b425c2\"" Mar 7 00:57:07.984933 containerd[2014]: time="2026-03-07T00:57:07.983815568Z" level=info msg="StartContainer for \"30d753094258a7f114879d97a0490ad1fe20228373b316b8ac160cb491b425c2\"" Mar 7 00:57:08.042589 containerd[2014]: 2026-03-07 00:57:07.825 [WARNING][5859] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0", GenerateName:"calico-kube-controllers-69f5fcbdf5-", Namespace:"calico-system", SelfLink:"", UID:"d29570d8-449a-4a48-af9f-3d0d19534b53", ResourceVersion:"1089", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"69f5fcbdf5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877", Pod:"calico-kube-controllers-69f5fcbdf5-swnh8", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.31.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calide84ecc1233", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:57:08.042589 containerd[2014]: 2026-03-07 00:57:07.825 [INFO][5859] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Mar 7 00:57:08.042589 containerd[2014]: 2026-03-07 00:57:07.825 [INFO][5859] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" iface="eth0" netns="" Mar 7 00:57:08.042589 containerd[2014]: 2026-03-07 00:57:07.826 [INFO][5859] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Mar 7 00:57:08.042589 containerd[2014]: 2026-03-07 00:57:07.826 [INFO][5859] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Mar 7 00:57:08.042589 containerd[2014]: 2026-03-07 00:57:07.993 [INFO][5868] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" HandleID="k8s-pod-network.0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Workload="ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0" Mar 7 00:57:08.042589 containerd[2014]: 2026-03-07 00:57:07.994 [INFO][5868] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:57:08.042589 containerd[2014]: 2026-03-07 00:57:07.996 [INFO][5868] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:57:08.042589 containerd[2014]: 2026-03-07 00:57:08.022 [WARNING][5868] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" HandleID="k8s-pod-network.0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Workload="ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0" Mar 7 00:57:08.042589 containerd[2014]: 2026-03-07 00:57:08.022 [INFO][5868] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" HandleID="k8s-pod-network.0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Workload="ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0" Mar 7 00:57:08.042589 containerd[2014]: 2026-03-07 00:57:08.026 [INFO][5868] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:57:08.042589 containerd[2014]: 2026-03-07 00:57:08.034 [INFO][5859] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Mar 7 00:57:08.044763 containerd[2014]: time="2026-03-07T00:57:08.042638897Z" level=info msg="TearDown network for sandbox \"0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0\" successfully" Mar 7 00:57:08.044763 containerd[2014]: time="2026-03-07T00:57:08.042676961Z" level=info msg="StopPodSandbox for \"0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0\" returns successfully" Mar 7 00:57:08.044763 containerd[2014]: time="2026-03-07T00:57:08.043843253Z" level=info msg="RemovePodSandbox for \"0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0\"" Mar 7 00:57:08.044763 containerd[2014]: time="2026-03-07T00:57:08.043901453Z" level=info msg="Forcibly stopping sandbox \"0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0\"" Mar 7 00:57:08.076733 systemd[1]: Started cri-containerd-30d753094258a7f114879d97a0490ad1fe20228373b316b8ac160cb491b425c2.scope - libcontainer container 30d753094258a7f114879d97a0490ad1fe20228373b316b8ac160cb491b425c2. Mar 7 00:57:08.246347 containerd[2014]: 2026-03-07 00:57:08.155 [WARNING][5904] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0", GenerateName:"calico-kube-controllers-69f5fcbdf5-", Namespace:"calico-system", SelfLink:"", UID:"d29570d8-449a-4a48-af9f-3d0d19534b53", ResourceVersion:"1089", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"69f5fcbdf5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"1458a1a784c3484e0375f8288d750ae8a605ec07ce7f659a7ba761e291fd6877", Pod:"calico-kube-controllers-69f5fcbdf5-swnh8", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.31.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calide84ecc1233", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:57:08.246347 containerd[2014]: 2026-03-07 00:57:08.155 [INFO][5904] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Mar 7 00:57:08.246347 containerd[2014]: 2026-03-07 00:57:08.155 [INFO][5904] cni-plugin/dataplane_linux.go 555: 
CleanUpNamespace called with no netns name, ignoring. ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" iface="eth0" netns="" Mar 7 00:57:08.246347 containerd[2014]: 2026-03-07 00:57:08.156 [INFO][5904] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Mar 7 00:57:08.246347 containerd[2014]: 2026-03-07 00:57:08.156 [INFO][5904] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Mar 7 00:57:08.246347 containerd[2014]: 2026-03-07 00:57:08.203 [INFO][5919] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" HandleID="k8s-pod-network.0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Workload="ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0" Mar 7 00:57:08.246347 containerd[2014]: 2026-03-07 00:57:08.203 [INFO][5919] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:57:08.246347 containerd[2014]: 2026-03-07 00:57:08.203 [INFO][5919] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:57:08.246347 containerd[2014]: 2026-03-07 00:57:08.223 [WARNING][5919] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" HandleID="k8s-pod-network.0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Workload="ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0" Mar 7 00:57:08.246347 containerd[2014]: 2026-03-07 00:57:08.223 [INFO][5919] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" HandleID="k8s-pod-network.0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Workload="ip--172--31--19--200-k8s-calico--kube--controllers--69f5fcbdf5--swnh8-eth0" Mar 7 00:57:08.246347 containerd[2014]: 2026-03-07 00:57:08.232 [INFO][5919] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:57:08.246347 containerd[2014]: 2026-03-07 00:57:08.237 [INFO][5904] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0" Mar 7 00:57:08.246347 containerd[2014]: time="2026-03-07T00:57:08.245869050Z" level=info msg="TearDown network for sandbox \"0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0\" successfully" Mar 7 00:57:08.265485 containerd[2014]: time="2026-03-07T00:57:08.265412274Z" level=info msg="StartContainer for \"30d753094258a7f114879d97a0490ad1fe20228373b316b8ac160cb491b425c2\" returns successfully" Mar 7 00:57:08.266266 containerd[2014]: time="2026-03-07T00:57:08.266151642Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 00:57:08.266429 containerd[2014]: time="2026-03-07T00:57:08.266294262Z" level=info msg="RemovePodSandbox \"0d367a5e9ed25ae2bac584e97d51d80548e9b77cccff3957eda6ef1baf5dd9e0\" returns successfully" Mar 7 00:57:08.267984 containerd[2014]: time="2026-03-07T00:57:08.267891714Z" level=info msg="StopPodSandbox for \"10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a\"" Mar 7 00:57:08.496330 containerd[2014]: 2026-03-07 00:57:08.393 [WARNING][5943] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0", GenerateName:"calico-apiserver-5f7f748cc6-", Namespace:"calico-system", SelfLink:"", UID:"2a289909-3bfe-4f98-9e7f-8f97be86ab84", ResourceVersion:"981", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f7f748cc6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8", Pod:"calico-apiserver-5f7f748cc6-fd6fv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.31.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calic752211aee8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:57:08.496330 containerd[2014]: 2026-03-07 00:57:08.393 [INFO][5943] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Mar 7 00:57:08.496330 containerd[2014]: 2026-03-07 00:57:08.393 [INFO][5943] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" iface="eth0" netns="" Mar 7 00:57:08.496330 containerd[2014]: 2026-03-07 00:57:08.393 [INFO][5943] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Mar 7 00:57:08.496330 containerd[2014]: 2026-03-07 00:57:08.393 [INFO][5943] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Mar 7 00:57:08.496330 containerd[2014]: 2026-03-07 00:57:08.464 [INFO][5952] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" HandleID="k8s-pod-network.10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0" Mar 7 00:57:08.496330 containerd[2014]: 2026-03-07 00:57:08.464 [INFO][5952] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:57:08.496330 containerd[2014]: 2026-03-07 00:57:08.464 [INFO][5952] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:57:08.496330 containerd[2014]: 2026-03-07 00:57:08.481 [WARNING][5952] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" HandleID="k8s-pod-network.10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0" Mar 7 00:57:08.496330 containerd[2014]: 2026-03-07 00:57:08.481 [INFO][5952] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" HandleID="k8s-pod-network.10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0" Mar 7 00:57:08.496330 containerd[2014]: 2026-03-07 00:57:08.484 [INFO][5952] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:57:08.496330 containerd[2014]: 2026-03-07 00:57:08.488 [INFO][5943] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Mar 7 00:57:08.502534 containerd[2014]: time="2026-03-07T00:57:08.500289631Z" level=info msg="TearDown network for sandbox \"10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a\" successfully" Mar 7 00:57:08.502534 containerd[2014]: time="2026-03-07T00:57:08.502249699Z" level=info msg="StopPodSandbox for \"10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a\" returns successfully" Mar 7 00:57:08.502993 containerd[2014]: time="2026-03-07T00:57:08.502745743Z" level=info msg="RemovePodSandbox for \"10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a\"" Mar 7 00:57:08.502993 containerd[2014]: time="2026-03-07T00:57:08.502791703Z" level=info msg="Forcibly stopping sandbox \"10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a\"" Mar 7 00:57:08.518586 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount389369467.mount: Deactivated successfully. 
Mar 7 00:57:08.594872 kubelet[3248]: I0307 00:57:08.594659 3248 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-5f7f748cc6-267tw" podStartSLOduration=30.439090449 podStartE2EDuration="39.592526755s" podCreationTimestamp="2026-03-07 00:56:29 +0000 UTC" firstStartedPulling="2026-03-07 00:56:58.71472607 +0000 UTC m=+51.232752195" lastFinishedPulling="2026-03-07 00:57:07.868162376 +0000 UTC m=+60.386188501" observedRunningTime="2026-03-07 00:57:08.591990091 +0000 UTC m=+61.110016228" watchObservedRunningTime="2026-03-07 00:57:08.592526755 +0000 UTC m=+61.110552880" Mar 7 00:57:08.596792 kubelet[3248]: I0307 00:57:08.595284 3248 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-5f7f748cc6-fd6fv" podStartSLOduration=30.320600649 podStartE2EDuration="39.595264939s" podCreationTimestamp="2026-03-07 00:56:29 +0000 UTC" firstStartedPulling="2026-03-07 00:56:58.195169556 +0000 UTC m=+50.713195681" lastFinishedPulling="2026-03-07 00:57:07.469833858 +0000 UTC m=+59.987859971" observedRunningTime="2026-03-07 00:57:08.559122463 +0000 UTC m=+61.077148612" watchObservedRunningTime="2026-03-07 00:57:08.595264939 +0000 UTC m=+61.113291064" Mar 7 00:57:08.629308 systemd[1]: Started sshd@8-172.31.19.200:22-20.161.92.111:45520.service - OpenSSH per-connection server daemon (20.161.92.111:45520). Mar 7 00:57:08.832925 containerd[2014]: 2026-03-07 00:57:08.680 [WARNING][5972] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0", GenerateName:"calico-apiserver-5f7f748cc6-", Namespace:"calico-system", SelfLink:"", UID:"2a289909-3bfe-4f98-9e7f-8f97be86ab84", ResourceVersion:"1117", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f7f748cc6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"ecc1ea0af1bb39e9875bd2393a1d3f38a58c131ad7e96c4e8c9c48e6ef35e9c8", Pod:"calico-apiserver-5f7f748cc6-fd6fv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.31.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calic752211aee8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:57:08.832925 containerd[2014]: 2026-03-07 00:57:08.680 [INFO][5972] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Mar 7 00:57:08.832925 containerd[2014]: 2026-03-07 00:57:08.680 [INFO][5972] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns 
name, ignoring. ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" iface="eth0" netns="" Mar 7 00:57:08.832925 containerd[2014]: 2026-03-07 00:57:08.680 [INFO][5972] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Mar 7 00:57:08.832925 containerd[2014]: 2026-03-07 00:57:08.680 [INFO][5972] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Mar 7 00:57:08.832925 containerd[2014]: 2026-03-07 00:57:08.792 [INFO][5983] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" HandleID="k8s-pod-network.10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0" Mar 7 00:57:08.832925 containerd[2014]: 2026-03-07 00:57:08.792 [INFO][5983] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:57:08.832925 containerd[2014]: 2026-03-07 00:57:08.792 [INFO][5983] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:57:08.832925 containerd[2014]: 2026-03-07 00:57:08.813 [WARNING][5983] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" HandleID="k8s-pod-network.10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0" Mar 7 00:57:08.832925 containerd[2014]: 2026-03-07 00:57:08.813 [INFO][5983] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" HandleID="k8s-pod-network.10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--fd6fv-eth0" Mar 7 00:57:08.832925 containerd[2014]: 2026-03-07 00:57:08.817 [INFO][5983] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:57:08.832925 containerd[2014]: 2026-03-07 00:57:08.825 [INFO][5972] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a" Mar 7 00:57:08.832925 containerd[2014]: time="2026-03-07T00:57:08.831523304Z" level=info msg="TearDown network for sandbox \"10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a\" successfully" Mar 7 00:57:08.851236 containerd[2014]: time="2026-03-07T00:57:08.850992573Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 00:57:08.851236 containerd[2014]: time="2026-03-07T00:57:08.851110509Z" level=info msg="RemovePodSandbox \"10f54c1f93642d2efec1b0b3a15c8e34218999449ac3815f42826dadd294c51a\" returns successfully" Mar 7 00:57:08.854823 containerd[2014]: time="2026-03-07T00:57:08.854286681Z" level=info msg="StopPodSandbox for \"aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5\"" Mar 7 00:57:09.096166 containerd[2014]: 2026-03-07 00:57:08.944 [WARNING][6000] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" WorkloadEndpoint="ip--172--31--19--200-k8s-whisker--6b786f9cf8--kdqrt-eth0" Mar 7 00:57:09.096166 containerd[2014]: 2026-03-07 00:57:08.945 [INFO][6000] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Mar 7 00:57:09.096166 containerd[2014]: 2026-03-07 00:57:08.945 [INFO][6000] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" iface="eth0" netns="" Mar 7 00:57:09.096166 containerd[2014]: 2026-03-07 00:57:08.945 [INFO][6000] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Mar 7 00:57:09.096166 containerd[2014]: 2026-03-07 00:57:08.945 [INFO][6000] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Mar 7 00:57:09.096166 containerd[2014]: 2026-03-07 00:57:09.056 [INFO][6008] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" HandleID="k8s-pod-network.aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Workload="ip--172--31--19--200-k8s-whisker--6b786f9cf8--kdqrt-eth0" Mar 7 00:57:09.096166 containerd[2014]: 2026-03-07 00:57:09.056 [INFO][6008] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:57:09.096166 containerd[2014]: 2026-03-07 00:57:09.056 [INFO][6008] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:57:09.096166 containerd[2014]: 2026-03-07 00:57:09.077 [WARNING][6008] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" HandleID="k8s-pod-network.aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Workload="ip--172--31--19--200-k8s-whisker--6b786f9cf8--kdqrt-eth0" Mar 7 00:57:09.096166 containerd[2014]: 2026-03-07 00:57:09.078 [INFO][6008] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" HandleID="k8s-pod-network.aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Workload="ip--172--31--19--200-k8s-whisker--6b786f9cf8--kdqrt-eth0" Mar 7 00:57:09.096166 containerd[2014]: 2026-03-07 00:57:09.082 [INFO][6008] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:57:09.096166 containerd[2014]: 2026-03-07 00:57:09.090 [INFO][6000] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Mar 7 00:57:09.100150 containerd[2014]: time="2026-03-07T00:57:09.097678350Z" level=info msg="TearDown network for sandbox \"aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5\" successfully" Mar 7 00:57:09.100150 containerd[2014]: time="2026-03-07T00:57:09.097723086Z" level=info msg="StopPodSandbox for \"aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5\" returns successfully" Mar 7 00:57:09.102335 containerd[2014]: time="2026-03-07T00:57:09.100635414Z" level=info msg="RemovePodSandbox for \"aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5\"" Mar 7 00:57:09.102335 containerd[2014]: time="2026-03-07T00:57:09.100687566Z" level=info msg="Forcibly stopping sandbox \"aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5\"" Mar 7 00:57:09.192231 sshd[5978]: Accepted publickey for core from 20.161.92.111 port 45520 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:57:09.195687 sshd[5978]: pam_unix(sshd:session): session opened for user core(uid=500) 
by core(uid=0) Mar 7 00:57:09.231717 systemd-logind[2003]: New session 9 of user core. Mar 7 00:57:09.237525 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 7 00:57:09.727558 containerd[2014]: 2026-03-07 00:57:09.353 [WARNING][6022] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" WorkloadEndpoint="ip--172--31--19--200-k8s-whisker--6b786f9cf8--kdqrt-eth0" Mar 7 00:57:09.727558 containerd[2014]: 2026-03-07 00:57:09.353 [INFO][6022] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Mar 7 00:57:09.727558 containerd[2014]: 2026-03-07 00:57:09.353 [INFO][6022] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" iface="eth0" netns="" Mar 7 00:57:09.727558 containerd[2014]: 2026-03-07 00:57:09.353 [INFO][6022] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Mar 7 00:57:09.727558 containerd[2014]: 2026-03-07 00:57:09.353 [INFO][6022] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Mar 7 00:57:09.727558 containerd[2014]: 2026-03-07 00:57:09.601 [INFO][6031] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" HandleID="k8s-pod-network.aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Workload="ip--172--31--19--200-k8s-whisker--6b786f9cf8--kdqrt-eth0" Mar 7 00:57:09.727558 containerd[2014]: 2026-03-07 00:57:09.629 [INFO][6031] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 7 00:57:09.727558 containerd[2014]: 2026-03-07 00:57:09.629 [INFO][6031] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:57:09.727558 containerd[2014]: 2026-03-07 00:57:09.677 [WARNING][6031] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" HandleID="k8s-pod-network.aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Workload="ip--172--31--19--200-k8s-whisker--6b786f9cf8--kdqrt-eth0" Mar 7 00:57:09.727558 containerd[2014]: 2026-03-07 00:57:09.677 [INFO][6031] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" HandleID="k8s-pod-network.aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Workload="ip--172--31--19--200-k8s-whisker--6b786f9cf8--kdqrt-eth0" Mar 7 00:57:09.727558 containerd[2014]: 2026-03-07 00:57:09.682 [INFO][6031] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:57:09.727558 containerd[2014]: 2026-03-07 00:57:09.705 [INFO][6022] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5" Mar 7 00:57:09.729538 containerd[2014]: time="2026-03-07T00:57:09.729480849Z" level=info msg="TearDown network for sandbox \"aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5\" successfully" Mar 7 00:57:09.750743 containerd[2014]: time="2026-03-07T00:57:09.750670869Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 00:57:09.751213 containerd[2014]: time="2026-03-07T00:57:09.750988401Z" level=info msg="RemovePodSandbox \"aee83733f88074c4ebcd935135cf35a790a812459e42f1814ee39d8f233167b5\" returns successfully" Mar 7 00:57:09.755103 containerd[2014]: time="2026-03-07T00:57:09.754600065Z" level=info msg="StopPodSandbox for \"e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281\"" Mar 7 00:57:09.906557 sshd[5978]: pam_unix(sshd:session): session closed for user core Mar 7 00:57:09.921025 systemd[1]: sshd@8-172.31.19.200:22-20.161.92.111:45520.service: Deactivated successfully. Mar 7 00:57:09.921926 systemd-logind[2003]: Session 9 logged out. Waiting for processes to exit. Mar 7 00:57:09.933574 systemd[1]: session-9.scope: Deactivated successfully. Mar 7 00:57:09.941947 systemd-logind[2003]: Removed session 9. Mar 7 00:57:10.100348 containerd[2014]: 2026-03-07 00:57:09.922 [WARNING][6055] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"293cb0c8-aea0-4a14-b36c-9f3b77cd23c5", ResourceVersion:"1020", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8", Pod:"coredns-674b8bbfcf-6s4mf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.31.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic6cc9e85ad1", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:57:10.100348 containerd[2014]: 2026-03-07 00:57:09.926 
[INFO][6055] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Mar 7 00:57:10.100348 containerd[2014]: 2026-03-07 00:57:09.927 [INFO][6055] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" iface="eth0" netns="" Mar 7 00:57:10.100348 containerd[2014]: 2026-03-07 00:57:09.927 [INFO][6055] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Mar 7 00:57:10.100348 containerd[2014]: 2026-03-07 00:57:09.928 [INFO][6055] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Mar 7 00:57:10.100348 containerd[2014]: 2026-03-07 00:57:10.056 [INFO][6064] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" HandleID="k8s-pod-network.e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0" Mar 7 00:57:10.100348 containerd[2014]: 2026-03-07 00:57:10.058 [INFO][6064] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:57:10.100348 containerd[2014]: 2026-03-07 00:57:10.058 [INFO][6064] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:57:10.100348 containerd[2014]: 2026-03-07 00:57:10.077 [WARNING][6064] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" HandleID="k8s-pod-network.e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0" Mar 7 00:57:10.100348 containerd[2014]: 2026-03-07 00:57:10.077 [INFO][6064] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" HandleID="k8s-pod-network.e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0" Mar 7 00:57:10.100348 containerd[2014]: 2026-03-07 00:57:10.082 [INFO][6064] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:57:10.100348 containerd[2014]: 2026-03-07 00:57:10.089 [INFO][6055] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Mar 7 00:57:10.100348 containerd[2014]: time="2026-03-07T00:57:10.100066975Z" level=info msg="TearDown network for sandbox \"e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281\" successfully" Mar 7 00:57:10.100348 containerd[2014]: time="2026-03-07T00:57:10.100103335Z" level=info msg="StopPodSandbox for \"e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281\" returns successfully" Mar 7 00:57:10.103426 containerd[2014]: time="2026-03-07T00:57:10.102813931Z" level=info msg="RemovePodSandbox for \"e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281\"" Mar 7 00:57:10.103426 containerd[2014]: time="2026-03-07T00:57:10.102868039Z" level=info msg="Forcibly stopping sandbox \"e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281\"" Mar 7 00:57:10.510657 containerd[2014]: 2026-03-07 00:57:10.298 [WARNING][6082] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"293cb0c8-aea0-4a14-b36c-9f3b77cd23c5", ResourceVersion:"1020", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"ee6a47fe9342fdc28a738fe4895adfe9a0edf7bc4c2b06b5561158b8a25a93e8", Pod:"coredns-674b8bbfcf-6s4mf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.31.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic6cc9e85ad1", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:57:10.510657 containerd[2014]: 2026-03-07 00:57:10.301 
[INFO][6082] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Mar 7 00:57:10.510657 containerd[2014]: 2026-03-07 00:57:10.301 [INFO][6082] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" iface="eth0" netns="" Mar 7 00:57:10.510657 containerd[2014]: 2026-03-07 00:57:10.301 [INFO][6082] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Mar 7 00:57:10.510657 containerd[2014]: 2026-03-07 00:57:10.301 [INFO][6082] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Mar 7 00:57:10.510657 containerd[2014]: 2026-03-07 00:57:10.441 [INFO][6089] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" HandleID="k8s-pod-network.e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0" Mar 7 00:57:10.510657 containerd[2014]: 2026-03-07 00:57:10.443 [INFO][6089] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:57:10.510657 containerd[2014]: 2026-03-07 00:57:10.443 [INFO][6089] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:57:10.510657 containerd[2014]: 2026-03-07 00:57:10.484 [WARNING][6089] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" HandleID="k8s-pod-network.e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0" Mar 7 00:57:10.510657 containerd[2014]: 2026-03-07 00:57:10.484 [INFO][6089] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" HandleID="k8s-pod-network.e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--6s4mf-eth0" Mar 7 00:57:10.510657 containerd[2014]: 2026-03-07 00:57:10.491 [INFO][6089] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:57:10.510657 containerd[2014]: 2026-03-07 00:57:10.502 [INFO][6082] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281" Mar 7 00:57:10.512871 containerd[2014]: time="2026-03-07T00:57:10.512816553Z" level=info msg="TearDown network for sandbox \"e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281\" successfully" Mar 7 00:57:10.520985 containerd[2014]: time="2026-03-07T00:57:10.520880001Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 00:57:10.521298 containerd[2014]: time="2026-03-07T00:57:10.521257785Z" level=info msg="RemovePodSandbox \"e0cd687b9e4460435d4657ed6d72c2efc26f1f9a71fffa602b30c83f408cf281\" returns successfully" Mar 7 00:57:10.524015 containerd[2014]: time="2026-03-07T00:57:10.523835961Z" level=info msg="StopPodSandbox for \"2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597\"" Mar 7 00:57:10.600624 kubelet[3248]: I0307 00:57:10.600336 3248 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 7 00:57:10.601281 kubelet[3248]: I0307 00:57:10.600531 3248 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 7 00:57:10.897948 containerd[2014]: 2026-03-07 00:57:10.728 [WARNING][6105] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"d7c61432-fbce-407b-bac9-6d0016af3dc6", ResourceVersion:"1024", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5", Pod:"coredns-674b8bbfcf-cxswj", 
Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.31.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibb72ba20bb8", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:57:10.897948 containerd[2014]: 2026-03-07 00:57:10.728 [INFO][6105] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Mar 7 00:57:10.897948 containerd[2014]: 2026-03-07 00:57:10.728 [INFO][6105] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" iface="eth0" netns="" Mar 7 00:57:10.897948 containerd[2014]: 2026-03-07 00:57:10.728 [INFO][6105] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Mar 7 00:57:10.897948 containerd[2014]: 2026-03-07 00:57:10.728 [INFO][6105] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Mar 7 00:57:10.897948 containerd[2014]: 2026-03-07 00:57:10.845 [INFO][6112] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" HandleID="k8s-pod-network.2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0" Mar 7 00:57:10.897948 containerd[2014]: 2026-03-07 00:57:10.848 [INFO][6112] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:57:10.897948 containerd[2014]: 2026-03-07 00:57:10.851 [INFO][6112] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:57:10.897948 containerd[2014]: 2026-03-07 00:57:10.879 [WARNING][6112] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" HandleID="k8s-pod-network.2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0" Mar 7 00:57:10.897948 containerd[2014]: 2026-03-07 00:57:10.880 [INFO][6112] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" HandleID="k8s-pod-network.2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0" Mar 7 00:57:10.897948 containerd[2014]: 2026-03-07 00:57:10.884 [INFO][6112] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:57:10.897948 containerd[2014]: 2026-03-07 00:57:10.892 [INFO][6105] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Mar 7 00:57:10.897948 containerd[2014]: time="2026-03-07T00:57:10.897697559Z" level=info msg="TearDown network for sandbox \"2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597\" successfully" Mar 7 00:57:10.897948 containerd[2014]: time="2026-03-07T00:57:10.897733487Z" level=info msg="StopPodSandbox for \"2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597\" returns successfully" Mar 7 00:57:10.904783 containerd[2014]: time="2026-03-07T00:57:10.899688359Z" level=info msg="RemovePodSandbox for \"2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597\"" Mar 7 00:57:10.904783 containerd[2014]: time="2026-03-07T00:57:10.899737979Z" level=info msg="Forcibly stopping sandbox \"2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597\"" Mar 7 00:57:11.362323 containerd[2014]: 2026-03-07 00:57:11.108 [WARNING][6127] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"d7c61432-fbce-407b-bac9-6d0016af3dc6", ResourceVersion:"1024", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"f7c14da46d31b47dbeef95f4babfa6cd369c25a67d79bce2ac8a09fdd21924f5", Pod:"coredns-674b8bbfcf-cxswj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.31.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibb72ba20bb8", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:57:11.362323 containerd[2014]: 2026-03-07 00:57:11.108 
[INFO][6127] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Mar 7 00:57:11.362323 containerd[2014]: 2026-03-07 00:57:11.108 [INFO][6127] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" iface="eth0" netns="" Mar 7 00:57:11.362323 containerd[2014]: 2026-03-07 00:57:11.108 [INFO][6127] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Mar 7 00:57:11.362323 containerd[2014]: 2026-03-07 00:57:11.109 [INFO][6127] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Mar 7 00:57:11.362323 containerd[2014]: 2026-03-07 00:57:11.286 [INFO][6134] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" HandleID="k8s-pod-network.2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0" Mar 7 00:57:11.362323 containerd[2014]: 2026-03-07 00:57:11.287 [INFO][6134] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:57:11.362323 containerd[2014]: 2026-03-07 00:57:11.287 [INFO][6134] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:57:11.362323 containerd[2014]: 2026-03-07 00:57:11.328 [WARNING][6134] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" HandleID="k8s-pod-network.2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0" Mar 7 00:57:11.362323 containerd[2014]: 2026-03-07 00:57:11.328 [INFO][6134] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" HandleID="k8s-pod-network.2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Workload="ip--172--31--19--200-k8s-coredns--674b8bbfcf--cxswj-eth0" Mar 7 00:57:11.362323 containerd[2014]: 2026-03-07 00:57:11.335 [INFO][6134] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:57:11.362323 containerd[2014]: 2026-03-07 00:57:11.349 [INFO][6127] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597" Mar 7 00:57:11.362323 containerd[2014]: time="2026-03-07T00:57:11.362241885Z" level=info msg="TearDown network for sandbox \"2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597\" successfully" Mar 7 00:57:11.383500 containerd[2014]: time="2026-03-07T00:57:11.383428569Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 00:57:11.383679 containerd[2014]: time="2026-03-07T00:57:11.383532321Z" level=info msg="RemovePodSandbox \"2f666358e380f03f7731a53ea216f623ebb74eaf6b0980cd676cb97841b1a597\" returns successfully" Mar 7 00:57:11.384830 containerd[2014]: time="2026-03-07T00:57:11.384764637Z" level=info msg="StopPodSandbox for \"8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378\"" Mar 7 00:57:11.714385 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount837153522.mount: Deactivated successfully. Mar 7 00:57:11.853869 containerd[2014]: 2026-03-07 00:57:11.604 [WARNING][6152] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0", GenerateName:"calico-apiserver-5f7f748cc6-", Namespace:"calico-system", SelfLink:"", UID:"4134feb0-74d5-456d-b147-4d748b45b820", ResourceVersion:"1121", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f7f748cc6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809", Pod:"calico-apiserver-5f7f748cc6-267tw", Endpoint:"eth0", 
ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.31.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali05c656191a8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:57:11.853869 containerd[2014]: 2026-03-07 00:57:11.605 [INFO][6152] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Mar 7 00:57:11.853869 containerd[2014]: 2026-03-07 00:57:11.605 [INFO][6152] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" iface="eth0" netns="" Mar 7 00:57:11.853869 containerd[2014]: 2026-03-07 00:57:11.607 [INFO][6152] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Mar 7 00:57:11.853869 containerd[2014]: 2026-03-07 00:57:11.607 [INFO][6152] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Mar 7 00:57:11.853869 containerd[2014]: 2026-03-07 00:57:11.807 [INFO][6162] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" HandleID="k8s-pod-network.8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0" Mar 7 00:57:11.853869 containerd[2014]: 2026-03-07 00:57:11.808 [INFO][6162] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:57:11.853869 containerd[2014]: 2026-03-07 00:57:11.809 [INFO][6162] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:57:11.853869 containerd[2014]: 2026-03-07 00:57:11.834 [WARNING][6162] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" HandleID="k8s-pod-network.8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0" Mar 7 00:57:11.853869 containerd[2014]: 2026-03-07 00:57:11.834 [INFO][6162] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" HandleID="k8s-pod-network.8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0" Mar 7 00:57:11.853869 containerd[2014]: 2026-03-07 00:57:11.840 [INFO][6162] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:57:11.853869 containerd[2014]: 2026-03-07 00:57:11.847 [INFO][6152] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Mar 7 00:57:11.855866 containerd[2014]: time="2026-03-07T00:57:11.855146099Z" level=info msg="TearDown network for sandbox \"8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378\" successfully" Mar 7 00:57:11.855866 containerd[2014]: time="2026-03-07T00:57:11.855635735Z" level=info msg="StopPodSandbox for \"8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378\" returns successfully" Mar 7 00:57:11.857611 containerd[2014]: time="2026-03-07T00:57:11.857559324Z" level=info msg="RemovePodSandbox for \"8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378\"" Mar 7 00:57:11.858552 containerd[2014]: time="2026-03-07T00:57:11.858239928Z" level=info msg="Forcibly stopping sandbox \"8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378\"" Mar 7 00:57:12.308967 containerd[2014]: 2026-03-07 00:57:12.089 [WARNING][6186] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0", GenerateName:"calico-apiserver-5f7f748cc6-", Namespace:"calico-system", SelfLink:"", UID:"4134feb0-74d5-456d-b147-4d748b45b820", ResourceVersion:"1121", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f7f748cc6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"f4d70e079c4b2ecfb7b8d32eb417fc0fe1177fcf602e778b12269ea4d6e88809", Pod:"calico-apiserver-5f7f748cc6-267tw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.31.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali05c656191a8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:57:12.308967 containerd[2014]: 2026-03-07 00:57:12.090 [INFO][6186] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Mar 7 00:57:12.308967 containerd[2014]: 2026-03-07 00:57:12.091 [INFO][6186] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns 
name, ignoring. ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" iface="eth0" netns="" Mar 7 00:57:12.308967 containerd[2014]: 2026-03-07 00:57:12.091 [INFO][6186] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Mar 7 00:57:12.308967 containerd[2014]: 2026-03-07 00:57:12.091 [INFO][6186] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Mar 7 00:57:12.308967 containerd[2014]: 2026-03-07 00:57:12.249 [INFO][6194] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" HandleID="k8s-pod-network.8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0" Mar 7 00:57:12.308967 containerd[2014]: 2026-03-07 00:57:12.250 [INFO][6194] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:57:12.308967 containerd[2014]: 2026-03-07 00:57:12.250 [INFO][6194] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:57:12.308967 containerd[2014]: 2026-03-07 00:57:12.283 [WARNING][6194] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" HandleID="k8s-pod-network.8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0" Mar 7 00:57:12.308967 containerd[2014]: 2026-03-07 00:57:12.283 [INFO][6194] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" HandleID="k8s-pod-network.8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Workload="ip--172--31--19--200-k8s-calico--apiserver--5f7f748cc6--267tw-eth0" Mar 7 00:57:12.308967 containerd[2014]: 2026-03-07 00:57:12.291 [INFO][6194] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:57:12.308967 containerd[2014]: 2026-03-07 00:57:12.299 [INFO][6186] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378" Mar 7 00:57:12.313590 containerd[2014]: time="2026-03-07T00:57:12.309025366Z" level=info msg="TearDown network for sandbox \"8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378\" successfully" Mar 7 00:57:12.323652 containerd[2014]: time="2026-03-07T00:57:12.323466250Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 00:57:12.323652 containerd[2014]: time="2026-03-07T00:57:12.323576374Z" level=info msg="RemovePodSandbox \"8797f3d890f6bbd84b88c2ad3da2f9919f3b2a06c27dceb917066120a8a56378\" returns successfully" Mar 7 00:57:12.324317 containerd[2014]: time="2026-03-07T00:57:12.324259102Z" level=info msg="StopPodSandbox for \"220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887\"" Mar 7 00:57:12.688507 containerd[2014]: 2026-03-07 00:57:12.449 [WARNING][6208] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"ab0a4538-d7e9-47a6-8763-f14115b5894f", ResourceVersion:"987", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e", Pod:"goldmane-5b85766d88-nbqjk", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.31.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, 
InterfaceName:"cali5cf34869ee4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:57:12.688507 containerd[2014]: 2026-03-07 00:57:12.451 [INFO][6208] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Mar 7 00:57:12.688507 containerd[2014]: 2026-03-07 00:57:12.453 [INFO][6208] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" iface="eth0" netns="" Mar 7 00:57:12.688507 containerd[2014]: 2026-03-07 00:57:12.453 [INFO][6208] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Mar 7 00:57:12.688507 containerd[2014]: 2026-03-07 00:57:12.453 [INFO][6208] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Mar 7 00:57:12.688507 containerd[2014]: 2026-03-07 00:57:12.606 [INFO][6216] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" HandleID="k8s-pod-network.220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Workload="ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0" Mar 7 00:57:12.688507 containerd[2014]: 2026-03-07 00:57:12.607 [INFO][6216] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:57:12.688507 containerd[2014]: 2026-03-07 00:57:12.607 [INFO][6216] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:57:12.688507 containerd[2014]: 2026-03-07 00:57:12.662 [WARNING][6216] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" HandleID="k8s-pod-network.220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Workload="ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0" Mar 7 00:57:12.688507 containerd[2014]: 2026-03-07 00:57:12.662 [INFO][6216] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" HandleID="k8s-pod-network.220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Workload="ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0" Mar 7 00:57:12.688507 containerd[2014]: 2026-03-07 00:57:12.669 [INFO][6216] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:57:12.688507 containerd[2014]: 2026-03-07 00:57:12.677 [INFO][6208] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Mar 7 00:57:12.688507 containerd[2014]: time="2026-03-07T00:57:12.688386924Z" level=info msg="TearDown network for sandbox \"220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887\" successfully" Mar 7 00:57:12.688507 containerd[2014]: time="2026-03-07T00:57:12.688424592Z" level=info msg="StopPodSandbox for \"220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887\" returns successfully" Mar 7 00:57:12.693415 containerd[2014]: time="2026-03-07T00:57:12.690992172Z" level=info msg="RemovePodSandbox for \"220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887\"" Mar 7 00:57:12.693415 containerd[2014]: time="2026-03-07T00:57:12.691057248Z" level=info msg="Forcibly stopping sandbox \"220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887\"" Mar 7 00:57:13.146009 containerd[2014]: 2026-03-07 00:57:12.939 [WARNING][6233] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"ab0a4538-d7e9-47a6-8763-f14115b5894f", ResourceVersion:"987", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 56, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-19-200", ContainerID:"9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e", Pod:"goldmane-5b85766d88-nbqjk", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.31.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali5cf34869ee4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:57:13.146009 containerd[2014]: 2026-03-07 00:57:12.940 [INFO][6233] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Mar 7 00:57:13.146009 containerd[2014]: 2026-03-07 00:57:12.940 [INFO][6233] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" iface="eth0" netns="" Mar 7 00:57:13.146009 containerd[2014]: 2026-03-07 00:57:12.940 [INFO][6233] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Mar 7 00:57:13.146009 containerd[2014]: 2026-03-07 00:57:12.940 [INFO][6233] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Mar 7 00:57:13.146009 containerd[2014]: 2026-03-07 00:57:13.071 [INFO][6241] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" HandleID="k8s-pod-network.220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Workload="ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0" Mar 7 00:57:13.146009 containerd[2014]: 2026-03-07 00:57:13.076 [INFO][6241] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:57:13.146009 containerd[2014]: 2026-03-07 00:57:13.076 [INFO][6241] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:57:13.146009 containerd[2014]: 2026-03-07 00:57:13.112 [WARNING][6241] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" HandleID="k8s-pod-network.220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Workload="ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0" Mar 7 00:57:13.146009 containerd[2014]: 2026-03-07 00:57:13.112 [INFO][6241] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" HandleID="k8s-pod-network.220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Workload="ip--172--31--19--200-k8s-goldmane--5b85766d88--nbqjk-eth0" Mar 7 00:57:13.146009 containerd[2014]: 2026-03-07 00:57:13.120 [INFO][6241] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:57:13.146009 containerd[2014]: 2026-03-07 00:57:13.129 [INFO][6233] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887" Mar 7 00:57:13.146009 containerd[2014]: time="2026-03-07T00:57:13.144755734Z" level=info msg="TearDown network for sandbox \"220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887\" successfully" Mar 7 00:57:13.163453 containerd[2014]: time="2026-03-07T00:57:13.163004674Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 00:57:13.163453 containerd[2014]: time="2026-03-07T00:57:13.163116874Z" level=info msg="RemovePodSandbox \"220294cbfe63c5abb8316c1b23eb2bd0867ee4c9a28b11f87135b6a7fe7f8887\" returns successfully" Mar 7 00:57:13.275755 containerd[2014]: time="2026-03-07T00:57:13.275664899Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:57:13.278758 containerd[2014]: time="2026-03-07T00:57:13.278696927Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=51613980" Mar 7 00:57:13.281545 containerd[2014]: time="2026-03-07T00:57:13.281479175Z" level=info msg="ImageCreate event name:\"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:57:13.288125 containerd[2014]: time="2026-03-07T00:57:13.288056567Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:57:13.290467 containerd[2014]: time="2026-03-07T00:57:13.290394215Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"51613826\" in 5.418416931s" Mar 7 00:57:13.290467 containerd[2014]: time="2026-03-07T00:57:13.290455991Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\"" Mar 7 00:57:13.293894 containerd[2014]: time="2026-03-07T00:57:13.293590403Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 7 00:57:13.300658 containerd[2014]: time="2026-03-07T00:57:13.300594035Z" level=info msg="CreateContainer within sandbox \"9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Mar 7 00:57:13.325709 containerd[2014]: time="2026-03-07T00:57:13.325650491Z" level=info msg="CreateContainer within sandbox \"9323330514249d12186c1f76d6093fd1dc5c9ef529ce12410c3efedd757c813e\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"b803127db5965c20b451a917a21a9f657e03f3781240302c753447d3c322bd66\"" Mar 7 00:57:13.329325 containerd[2014]: time="2026-03-07T00:57:13.328923719Z" level=info msg="StartContainer for \"b803127db5965c20b451a917a21a9f657e03f3781240302c753447d3c322bd66\"" Mar 7 00:57:13.402518 systemd[1]: Started cri-containerd-b803127db5965c20b451a917a21a9f657e03f3781240302c753447d3c322bd66.scope - libcontainer container b803127db5965c20b451a917a21a9f657e03f3781240302c753447d3c322bd66. 
Mar 7 00:57:13.491555 containerd[2014]: time="2026-03-07T00:57:13.491481300Z" level=info msg="StartContainer for \"b803127db5965c20b451a917a21a9f657e03f3781240302c753447d3c322bd66\" returns successfully" Mar 7 00:57:13.662687 kubelet[3248]: I0307 00:57:13.662517 3248 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-5b85766d88-nbqjk" podStartSLOduration=29.270593936 podStartE2EDuration="43.662494428s" podCreationTimestamp="2026-03-07 00:56:30 +0000 UTC" firstStartedPulling="2026-03-07 00:56:58.901165067 +0000 UTC m=+51.419191180" lastFinishedPulling="2026-03-07 00:57:13.293065547 +0000 UTC m=+65.811091672" observedRunningTime="2026-03-07 00:57:13.656058888 +0000 UTC m=+66.174085025" watchObservedRunningTime="2026-03-07 00:57:13.662494428 +0000 UTC m=+66.180520553" Mar 7 00:57:14.650572 containerd[2014]: time="2026-03-07T00:57:14.650514505Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:57:14.654410 containerd[2014]: time="2026-03-07T00:57:14.654316921Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=5882804" Mar 7 00:57:14.660689 containerd[2014]: time="2026-03-07T00:57:14.657486865Z" level=info msg="ImageCreate event name:\"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:57:14.672157 containerd[2014]: time="2026-03-07T00:57:14.669863185Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:57:14.676485 containerd[2014]: time="2026-03-07T00:57:14.674798077Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id 
\"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7280321\" in 1.381136682s" Mar 7 00:57:14.676485 containerd[2014]: time="2026-03-07T00:57:14.676302170Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\"" Mar 7 00:57:14.695226 containerd[2014]: time="2026-03-07T00:57:14.694102898Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Mar 7 00:57:14.709378 containerd[2014]: time="2026-03-07T00:57:14.709327190Z" level=info msg="CreateContainer within sandbox \"7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 7 00:57:14.751610 containerd[2014]: time="2026-03-07T00:57:14.751547822Z" level=info msg="CreateContainer within sandbox \"7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"af4d792cc2cee4738b3f7abf2924c66cee66f05ae3ba06a679e5a203638910c9\"" Mar 7 00:57:14.752694 containerd[2014]: time="2026-03-07T00:57:14.752634422Z" level=info msg="StartContainer for \"af4d792cc2cee4738b3f7abf2924c66cee66f05ae3ba06a679e5a203638910c9\"" Mar 7 00:57:14.839631 systemd[1]: run-containerd-runc-k8s.io-af4d792cc2cee4738b3f7abf2924c66cee66f05ae3ba06a679e5a203638910c9-runc.00Md2J.mount: Deactivated successfully. Mar 7 00:57:14.857525 systemd[1]: Started cri-containerd-af4d792cc2cee4738b3f7abf2924c66cee66f05ae3ba06a679e5a203638910c9.scope - libcontainer container af4d792cc2cee4738b3f7abf2924c66cee66f05ae3ba06a679e5a203638910c9. 
Mar 7 00:57:14.984776 containerd[2014]: time="2026-03-07T00:57:14.984709431Z" level=info msg="StartContainer for \"af4d792cc2cee4738b3f7abf2924c66cee66f05ae3ba06a679e5a203638910c9\" returns successfully" Mar 7 00:57:15.012734 systemd[1]: Started sshd@9-172.31.19.200:22-20.161.92.111:45592.service - OpenSSH per-connection server daemon (20.161.92.111:45592). Mar 7 00:57:15.544363 sshd[6351]: Accepted publickey for core from 20.161.92.111 port 45592 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:57:15.549264 sshd[6351]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:57:15.559763 systemd-logind[2003]: New session 10 of user core. Mar 7 00:57:15.568737 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 7 00:57:16.191322 sshd[6351]: pam_unix(sshd:session): session closed for user core Mar 7 00:57:16.207526 systemd[1]: sshd@9-172.31.19.200:22-20.161.92.111:45592.service: Deactivated successfully. Mar 7 00:57:16.217257 systemd[1]: session-10.scope: Deactivated successfully. Mar 7 00:57:16.221731 systemd-logind[2003]: Session 10 logged out. Waiting for processes to exit. Mar 7 00:57:16.228873 systemd-logind[2003]: Removed session 10. 
Mar 7 00:57:16.557706 containerd[2014]: time="2026-03-07T00:57:16.557160639Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:57:16.563064 containerd[2014]: time="2026-03-07T00:57:16.562895931Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=13766291" Mar 7 00:57:16.567322 containerd[2014]: time="2026-03-07T00:57:16.567219615Z" level=info msg="ImageCreate event name:\"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:57:16.579224 containerd[2014]: time="2026-03-07T00:57:16.577762875Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:57:16.579871 containerd[2014]: time="2026-03-07T00:57:16.579816879Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"15163768\" in 1.884183597s" Mar 7 00:57:16.580035 containerd[2014]: time="2026-03-07T00:57:16.580001295Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\"" Mar 7 00:57:16.585551 containerd[2014]: time="2026-03-07T00:57:16.585501291Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 7 00:57:16.595695 containerd[2014]: time="2026-03-07T00:57:16.595641135Z" level=info 
msg="CreateContainer within sandbox \"68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 7 00:57:16.640084 containerd[2014]: time="2026-03-07T00:57:16.639966663Z" level=info msg="CreateContainer within sandbox \"68713494437926089a15adad47099024fe5db42e04848de4a6090ffed3f19298\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"1962fe2f7bbc80918e4a8d63d91c4541c0558aa6f192d9b3f059a551f86a9634\"" Mar 7 00:57:16.641152 containerd[2014]: time="2026-03-07T00:57:16.641105427Z" level=info msg="StartContainer for \"1962fe2f7bbc80918e4a8d63d91c4541c0558aa6f192d9b3f059a551f86a9634\"" Mar 7 00:57:16.746566 systemd[1]: Started cri-containerd-1962fe2f7bbc80918e4a8d63d91c4541c0558aa6f192d9b3f059a551f86a9634.scope - libcontainer container 1962fe2f7bbc80918e4a8d63d91c4541c0558aa6f192d9b3f059a551f86a9634. Mar 7 00:57:16.873075 containerd[2014]: time="2026-03-07T00:57:16.872939584Z" level=info msg="StartContainer for \"1962fe2f7bbc80918e4a8d63d91c4541c0558aa6f192d9b3f059a551f86a9634\" returns successfully" Mar 7 00:57:17.688925 kubelet[3248]: I0307 00:57:17.688796 3248 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-6p9l4" podStartSLOduration=22.540509695 podStartE2EDuration="43.688770328s" podCreationTimestamp="2026-03-07 00:56:34 +0000 UTC" firstStartedPulling="2026-03-07 00:56:55.436712202 +0000 UTC m=+47.954738327" lastFinishedPulling="2026-03-07 00:57:16.584972847 +0000 UTC m=+69.102998960" observedRunningTime="2026-03-07 00:57:17.687569836 +0000 UTC m=+70.205596033" watchObservedRunningTime="2026-03-07 00:57:17.688770328 +0000 UTC m=+70.206796465" Mar 7 00:57:17.956413 kubelet[3248]: I0307 00:57:17.956149 3248 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 7 
00:57:17.956413 kubelet[3248]: I0307 00:57:17.956248 3248 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 7 00:57:18.244849 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1513291085.mount: Deactivated successfully. Mar 7 00:57:18.280261 containerd[2014]: time="2026-03-07T00:57:18.279869175Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:57:18.282838 containerd[2014]: time="2026-03-07T00:57:18.282760935Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=16426594" Mar 7 00:57:18.285285 containerd[2014]: time="2026-03-07T00:57:18.284567067Z" level=info msg="ImageCreate event name:\"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:57:18.289744 containerd[2014]: time="2026-03-07T00:57:18.289673559Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:57:18.296124 containerd[2014]: time="2026-03-07T00:57:18.296041503Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"16426424\" in 1.710276068s" Mar 7 00:57:18.296329 containerd[2014]: time="2026-03-07T00:57:18.296120535Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference 
\"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\"" Mar 7 00:57:18.308164 containerd[2014]: time="2026-03-07T00:57:18.307895668Z" level=info msg="CreateContainer within sandbox \"7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 7 00:57:18.345716 containerd[2014]: time="2026-03-07T00:57:18.345641488Z" level=info msg="CreateContainer within sandbox \"7fcd9baf0ef6c5d7a488d93ff1b49895b7b8c9238ed4f0a9fc8709c41bd7a8a8\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"2bb95e89ab5bcb921fc9d26d8b35c96d10e5db7b5662eadf7348d3ed70583f23\"" Mar 7 00:57:18.347363 containerd[2014]: time="2026-03-07T00:57:18.346397404Z" level=info msg="StartContainer for \"2bb95e89ab5bcb921fc9d26d8b35c96d10e5db7b5662eadf7348d3ed70583f23\"" Mar 7 00:57:18.347176 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount654325648.mount: Deactivated successfully. Mar 7 00:57:18.441578 systemd[1]: Started cri-containerd-2bb95e89ab5bcb921fc9d26d8b35c96d10e5db7b5662eadf7348d3ed70583f23.scope - libcontainer container 2bb95e89ab5bcb921fc9d26d8b35c96d10e5db7b5662eadf7348d3ed70583f23. 
Mar 7 00:57:18.602695 containerd[2014]: time="2026-03-07T00:57:18.602548769Z" level=info msg="StartContainer for \"2bb95e89ab5bcb921fc9d26d8b35c96d10e5db7b5662eadf7348d3ed70583f23\" returns successfully" Mar 7 00:57:18.693578 kubelet[3248]: I0307 00:57:18.693250 3248 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-56f5bf9455-h6wbh" podStartSLOduration=3.5898267710000002 podStartE2EDuration="22.693224405s" podCreationTimestamp="2026-03-07 00:56:56 +0000 UTC" firstStartedPulling="2026-03-07 00:56:59.194557413 +0000 UTC m=+51.712583538" lastFinishedPulling="2026-03-07 00:57:18.297955047 +0000 UTC m=+70.815981172" observedRunningTime="2026-03-07 00:57:18.691916777 +0000 UTC m=+71.209942914" watchObservedRunningTime="2026-03-07 00:57:18.693224405 +0000 UTC m=+71.211250566" Mar 7 00:57:21.285751 systemd[1]: Started sshd@10-172.31.19.200:22-20.161.92.111:58246.service - OpenSSH per-connection server daemon (20.161.92.111:58246). Mar 7 00:57:21.811402 sshd[6482]: Accepted publickey for core from 20.161.92.111 port 58246 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:57:21.818849 sshd[6482]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:57:21.833114 systemd-logind[2003]: New session 11 of user core. Mar 7 00:57:21.838580 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 7 00:57:22.306872 sshd[6482]: pam_unix(sshd:session): session closed for user core Mar 7 00:57:22.314448 systemd[1]: sshd@10-172.31.19.200:22-20.161.92.111:58246.service: Deactivated successfully. Mar 7 00:57:22.320810 systemd[1]: session-11.scope: Deactivated successfully. Mar 7 00:57:22.322838 systemd-logind[2003]: Session 11 logged out. Waiting for processes to exit. Mar 7 00:57:22.324956 systemd-logind[2003]: Removed session 11. 
Mar 7 00:57:27.407985 systemd[1]: Started sshd@11-172.31.19.200:22-20.161.92.111:58252.service - OpenSSH per-connection server daemon (20.161.92.111:58252). Mar 7 00:57:27.922179 sshd[6527]: Accepted publickey for core from 20.161.92.111 port 58252 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:57:27.928929 sshd[6527]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:57:27.949131 systemd-logind[2003]: New session 12 of user core. Mar 7 00:57:27.955653 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 7 00:57:28.430061 sshd[6527]: pam_unix(sshd:session): session closed for user core Mar 7 00:57:28.438457 systemd[1]: sshd@11-172.31.19.200:22-20.161.92.111:58252.service: Deactivated successfully. Mar 7 00:57:28.446025 systemd[1]: session-12.scope: Deactivated successfully. Mar 7 00:57:28.448529 systemd-logind[2003]: Session 12 logged out. Waiting for processes to exit. Mar 7 00:57:28.454351 systemd-logind[2003]: Removed session 12. Mar 7 00:57:28.525042 systemd[1]: Started sshd@12-172.31.19.200:22-20.161.92.111:58266.service - OpenSSH per-connection server daemon (20.161.92.111:58266). Mar 7 00:57:29.040489 sshd[6542]: Accepted publickey for core from 20.161.92.111 port 58266 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:57:29.043344 sshd[6542]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:57:29.052634 systemd-logind[2003]: New session 13 of user core. Mar 7 00:57:29.064464 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 7 00:57:29.634837 sshd[6542]: pam_unix(sshd:session): session closed for user core Mar 7 00:57:29.649144 systemd[1]: sshd@12-172.31.19.200:22-20.161.92.111:58266.service: Deactivated successfully. Mar 7 00:57:29.655672 systemd[1]: session-13.scope: Deactivated successfully. Mar 7 00:57:29.658860 systemd-logind[2003]: Session 13 logged out. Waiting for processes to exit. 
Mar 7 00:57:29.663303 systemd-logind[2003]: Removed session 13. Mar 7 00:57:29.730741 systemd[1]: Started sshd@13-172.31.19.200:22-20.161.92.111:58270.service - OpenSSH per-connection server daemon (20.161.92.111:58270). Mar 7 00:57:30.241980 sshd[6553]: Accepted publickey for core from 20.161.92.111 port 58270 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:57:30.243758 sshd[6553]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:57:30.253763 systemd-logind[2003]: New session 14 of user core. Mar 7 00:57:30.259508 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 7 00:57:30.743558 sshd[6553]: pam_unix(sshd:session): session closed for user core Mar 7 00:57:30.751011 systemd[1]: sshd@13-172.31.19.200:22-20.161.92.111:58270.service: Deactivated successfully. Mar 7 00:57:30.756300 systemd[1]: session-14.scope: Deactivated successfully. Mar 7 00:57:30.758097 systemd-logind[2003]: Session 14 logged out. Waiting for processes to exit. Mar 7 00:57:30.760021 systemd-logind[2003]: Removed session 14. Mar 7 00:57:35.847376 systemd[1]: Started sshd@14-172.31.19.200:22-20.161.92.111:60944.service - OpenSSH per-connection server daemon (20.161.92.111:60944). Mar 7 00:57:36.356276 sshd[6612]: Accepted publickey for core from 20.161.92.111 port 60944 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:57:36.358670 sshd[6612]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:57:36.369157 systemd-logind[2003]: New session 15 of user core. Mar 7 00:57:36.375475 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 7 00:57:36.848524 sshd[6612]: pam_unix(sshd:session): session closed for user core Mar 7 00:57:36.854978 systemd[1]: sshd@14-172.31.19.200:22-20.161.92.111:60944.service: Deactivated successfully. Mar 7 00:57:36.860643 systemd[1]: session-15.scope: Deactivated successfully. 
Mar 7 00:57:36.862497 systemd-logind[2003]: Session 15 logged out. Waiting for processes to exit. Mar 7 00:57:36.864700 systemd-logind[2003]: Removed session 15. Mar 7 00:57:36.943786 systemd[1]: Started sshd@15-172.31.19.200:22-20.161.92.111:60960.service - OpenSSH per-connection server daemon (20.161.92.111:60960). Mar 7 00:57:37.446653 sshd[6625]: Accepted publickey for core from 20.161.92.111 port 60960 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:57:37.449491 sshd[6625]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:57:37.458581 systemd-logind[2003]: New session 16 of user core. Mar 7 00:57:37.465491 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 7 00:57:38.284353 sshd[6625]: pam_unix(sshd:session): session closed for user core Mar 7 00:57:38.292813 systemd[1]: sshd@15-172.31.19.200:22-20.161.92.111:60960.service: Deactivated successfully. Mar 7 00:57:38.298423 systemd[1]: session-16.scope: Deactivated successfully. Mar 7 00:57:38.301249 systemd-logind[2003]: Session 16 logged out. Waiting for processes to exit. Mar 7 00:57:38.303802 systemd-logind[2003]: Removed session 16. Mar 7 00:57:38.376707 systemd[1]: Started sshd@16-172.31.19.200:22-20.161.92.111:60970.service - OpenSSH per-connection server daemon (20.161.92.111:60970). Mar 7 00:57:38.902351 sshd[6636]: Accepted publickey for core from 20.161.92.111 port 60970 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:57:38.905119 sshd[6636]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:57:38.914464 systemd-logind[2003]: New session 17 of user core. Mar 7 00:57:38.919460 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 7 00:57:40.235690 sshd[6636]: pam_unix(sshd:session): session closed for user core Mar 7 00:57:40.247896 systemd[1]: sshd@16-172.31.19.200:22-20.161.92.111:60970.service: Deactivated successfully. 
Mar 7 00:57:40.255636 systemd[1]: session-17.scope: Deactivated successfully. Mar 7 00:57:40.263275 systemd-logind[2003]: Session 17 logged out. Waiting for processes to exit. Mar 7 00:57:40.267682 systemd-logind[2003]: Removed session 17. Mar 7 00:57:40.335922 systemd[1]: Started sshd@17-172.31.19.200:22-20.161.92.111:54208.service - OpenSSH per-connection server daemon (20.161.92.111:54208). Mar 7 00:57:40.840031 sshd[6664]: Accepted publickey for core from 20.161.92.111 port 54208 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:57:40.841792 sshd[6664]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:57:40.850912 systemd-logind[2003]: New session 18 of user core. Mar 7 00:57:40.857482 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 7 00:57:41.612503 sshd[6664]: pam_unix(sshd:session): session closed for user core Mar 7 00:57:41.621712 systemd-logind[2003]: Session 18 logged out. Waiting for processes to exit. Mar 7 00:57:41.626010 systemd[1]: sshd@17-172.31.19.200:22-20.161.92.111:54208.service: Deactivated successfully. Mar 7 00:57:41.636449 systemd[1]: session-18.scope: Deactivated successfully. Mar 7 00:57:41.641786 systemd-logind[2003]: Removed session 18. Mar 7 00:57:41.716743 systemd[1]: Started sshd@18-172.31.19.200:22-20.161.92.111:54222.service - OpenSSH per-connection server daemon (20.161.92.111:54222). Mar 7 00:57:42.240266 sshd[6687]: Accepted publickey for core from 20.161.92.111 port 54222 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:57:42.242001 sshd[6687]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:57:42.251535 systemd-logind[2003]: New session 19 of user core. Mar 7 00:57:42.255477 systemd[1]: Started session-19.scope - Session 19 of User core. 
Mar 7 00:57:42.715623 sshd[6687]: pam_unix(sshd:session): session closed for user core Mar 7 00:57:42.722152 systemd[1]: sshd@18-172.31.19.200:22-20.161.92.111:54222.service: Deactivated successfully. Mar 7 00:57:42.730444 systemd[1]: session-19.scope: Deactivated successfully. Mar 7 00:57:42.733530 systemd-logind[2003]: Session 19 logged out. Waiting for processes to exit. Mar 7 00:57:42.738006 systemd-logind[2003]: Removed session 19. Mar 7 00:57:47.812980 systemd[1]: Started sshd@19-172.31.19.200:22-20.161.92.111:54238.service - OpenSSH per-connection server daemon (20.161.92.111:54238). Mar 7 00:57:48.315732 sshd[6724]: Accepted publickey for core from 20.161.92.111 port 54238 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:57:48.318523 sshd[6724]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:57:48.327310 systemd-logind[2003]: New session 20 of user core. Mar 7 00:57:48.334474 systemd[1]: Started session-20.scope - Session 20 of User core. Mar 7 00:57:48.798504 sshd[6724]: pam_unix(sshd:session): session closed for user core Mar 7 00:57:48.804852 systemd[1]: sshd@19-172.31.19.200:22-20.161.92.111:54238.service: Deactivated successfully. Mar 7 00:57:48.810640 systemd[1]: session-20.scope: Deactivated successfully. Mar 7 00:57:48.812392 systemd-logind[2003]: Session 20 logged out. Waiting for processes to exit. Mar 7 00:57:48.815798 systemd-logind[2003]: Removed session 20. Mar 7 00:57:53.901600 systemd[1]: Started sshd@20-172.31.19.200:22-20.161.92.111:35286.service - OpenSSH per-connection server daemon (20.161.92.111:35286). Mar 7 00:57:54.404474 sshd[6736]: Accepted publickey for core from 20.161.92.111 port 35286 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:57:54.407953 sshd[6736]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:57:54.417849 systemd-logind[2003]: New session 21 of user core. 
Mar 7 00:57:54.424448 systemd[1]: Started session-21.scope - Session 21 of User core. Mar 7 00:57:54.898848 sshd[6736]: pam_unix(sshd:session): session closed for user core Mar 7 00:57:54.906452 systemd[1]: sshd@20-172.31.19.200:22-20.161.92.111:35286.service: Deactivated successfully. Mar 7 00:57:54.910558 systemd[1]: session-21.scope: Deactivated successfully. Mar 7 00:57:54.912728 systemd-logind[2003]: Session 21 logged out. Waiting for processes to exit. Mar 7 00:57:54.915379 systemd-logind[2003]: Removed session 21. Mar 7 00:58:00.001508 systemd[1]: Started sshd@21-172.31.19.200:22-20.161.92.111:35292.service - OpenSSH per-connection server daemon (20.161.92.111:35292). Mar 7 00:58:00.529239 sshd[6748]: Accepted publickey for core from 20.161.92.111 port 35292 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:58:00.533080 sshd[6748]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:58:00.544302 systemd-logind[2003]: New session 22 of user core. Mar 7 00:58:00.554527 systemd[1]: Started session-22.scope - Session 22 of User core. Mar 7 00:58:01.067596 sshd[6748]: pam_unix(sshd:session): session closed for user core Mar 7 00:58:01.075496 systemd[1]: session-22.scope: Deactivated successfully. Mar 7 00:58:01.081980 systemd[1]: sshd@21-172.31.19.200:22-20.161.92.111:35292.service: Deactivated successfully. Mar 7 00:58:01.082372 systemd-logind[2003]: Session 22 logged out. Waiting for processes to exit. Mar 7 00:58:01.095998 systemd-logind[2003]: Removed session 22. Mar 7 00:58:03.115293 systemd[1]: run-containerd-runc-k8s.io-1fe17442355ceb5173451e0735ae890d944018c871c49d6d9d38ba4452d0c9ae-runc.QNWV4a.mount: Deactivated successfully. Mar 7 00:58:06.160512 systemd[1]: Started sshd@22-172.31.19.200:22-20.161.92.111:53642.service - OpenSSH per-connection server daemon (20.161.92.111:53642). 
Mar 7 00:58:06.685741 sshd[6844]: Accepted publickey for core from 20.161.92.111 port 53642 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:58:06.688461 sshd[6844]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:58:06.697842 systemd-logind[2003]: New session 23 of user core. Mar 7 00:58:06.707491 systemd[1]: Started session-23.scope - Session 23 of User core. Mar 7 00:58:07.167589 sshd[6844]: pam_unix(sshd:session): session closed for user core Mar 7 00:58:07.173859 systemd-logind[2003]: Session 23 logged out. Waiting for processes to exit. Mar 7 00:58:07.176027 systemd[1]: sshd@22-172.31.19.200:22-20.161.92.111:53642.service: Deactivated successfully. Mar 7 00:58:07.179745 systemd[1]: session-23.scope: Deactivated successfully. Mar 7 00:58:07.182333 systemd-logind[2003]: Removed session 23. Mar 7 00:58:12.261711 systemd[1]: Started sshd@23-172.31.19.200:22-20.161.92.111:49922.service - OpenSSH per-connection server daemon (20.161.92.111:49922). Mar 7 00:58:12.776888 sshd[6859]: Accepted publickey for core from 20.161.92.111 port 49922 ssh2: RSA SHA256:CACtkjS64SwL0ouDnrWRH1vlyxIcwr6xT7re/CsaoWw Mar 7 00:58:12.780227 sshd[6859]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:58:12.788129 systemd-logind[2003]: New session 24 of user core. Mar 7 00:58:12.800505 systemd[1]: Started session-24.scope - Session 24 of User core. Mar 7 00:58:13.261276 sshd[6859]: pam_unix(sshd:session): session closed for user core Mar 7 00:58:13.268048 systemd[1]: sshd@23-172.31.19.200:22-20.161.92.111:49922.service: Deactivated successfully. Mar 7 00:58:13.272418 systemd[1]: session-24.scope: Deactivated successfully. Mar 7 00:58:13.274558 systemd-logind[2003]: Session 24 logged out. Waiting for processes to exit. Mar 7 00:58:13.277733 systemd-logind[2003]: Removed session 24. 
Mar 7 00:58:28.566445 systemd[1]: cri-containerd-183bdac68c5b3022aab87767c5bab814d4f7a8138fbfd3b571668c5beba1e1e5.scope: Deactivated successfully. Mar 7 00:58:28.568504 systemd[1]: cri-containerd-183bdac68c5b3022aab87767c5bab814d4f7a8138fbfd3b571668c5beba1e1e5.scope: Consumed 22.103s CPU time. Mar 7 00:58:28.610661 containerd[2014]: time="2026-03-07T00:58:28.610559953Z" level=info msg="shim disconnected" id=183bdac68c5b3022aab87767c5bab814d4f7a8138fbfd3b571668c5beba1e1e5 namespace=k8s.io Mar 7 00:58:28.610661 containerd[2014]: time="2026-03-07T00:58:28.610644673Z" level=warning msg="cleaning up after shim disconnected" id=183bdac68c5b3022aab87767c5bab814d4f7a8138fbfd3b571668c5beba1e1e5 namespace=k8s.io Mar 7 00:58:28.611372 containerd[2014]: time="2026-03-07T00:58:28.610666537Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 00:58:28.613657 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-183bdac68c5b3022aab87767c5bab814d4f7a8138fbfd3b571668c5beba1e1e5-rootfs.mount: Deactivated successfully. Mar 7 00:58:28.652077 systemd[1]: cri-containerd-01f965f5e800ed760cd7c084f88ffec9d105e49ca7ac33df9d281c78bc135e09.scope: Deactivated successfully. Mar 7 00:58:28.652582 systemd[1]: cri-containerd-01f965f5e800ed760cd7c084f88ffec9d105e49ca7ac33df9d281c78bc135e09.scope: Consumed 7.434s CPU time, 18.4M memory peak, 0B memory swap peak. 
Mar 7 00:58:28.710230 containerd[2014]: time="2026-03-07T00:58:28.708033253Z" level=info msg="shim disconnected" id=01f965f5e800ed760cd7c084f88ffec9d105e49ca7ac33df9d281c78bc135e09 namespace=k8s.io
Mar 7 00:58:28.710230 containerd[2014]: time="2026-03-07T00:58:28.708108217Z" level=warning msg="cleaning up after shim disconnected" id=01f965f5e800ed760cd7c084f88ffec9d105e49ca7ac33df9d281c78bc135e09 namespace=k8s.io
Mar 7 00:58:28.710230 containerd[2014]: time="2026-03-07T00:58:28.708128737Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 7 00:58:28.712886 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-01f965f5e800ed760cd7c084f88ffec9d105e49ca7ac33df9d281c78bc135e09-rootfs.mount: Deactivated successfully.
Mar 7 00:58:28.903812 kubelet[3248]: I0307 00:58:28.902771 3248 scope.go:117] "RemoveContainer" containerID="183bdac68c5b3022aab87767c5bab814d4f7a8138fbfd3b571668c5beba1e1e5"
Mar 7 00:58:28.907418 containerd[2014]: time="2026-03-07T00:58:28.907346354Z" level=info msg="CreateContainer within sandbox \"8e41c291c8a9288492431138c39cb56934e87b6919a4bcfe03b426cd5d165155\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Mar 7 00:58:28.909676 kubelet[3248]: I0307 00:58:28.909618 3248 scope.go:117] "RemoveContainer" containerID="01f965f5e800ed760cd7c084f88ffec9d105e49ca7ac33df9d281c78bc135e09"
Mar 7 00:58:28.915091 containerd[2014]: time="2026-03-07T00:58:28.914898578Z" level=info msg="CreateContainer within sandbox \"be43b923b4d2ae8c624944b7e6249c44ea07effa0f494364739b347631b2ad29\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Mar 7 00:58:28.941235 containerd[2014]: time="2026-03-07T00:58:28.941126426Z" level=info msg="CreateContainer within sandbox \"8e41c291c8a9288492431138c39cb56934e87b6919a4bcfe03b426cd5d165155\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"f5cb4b3dd8fead0ef87b9b85f327579791d3328826c4e52293c34fdd64a3a843\""
Mar 7 00:58:28.944396 containerd[2014]: time="2026-03-07T00:58:28.942750794Z" level=info msg="StartContainer for \"f5cb4b3dd8fead0ef87b9b85f327579791d3328826c4e52293c34fdd64a3a843\""
Mar 7 00:58:28.964962 containerd[2014]: time="2026-03-07T00:58:28.964901306Z" level=info msg="CreateContainer within sandbox \"be43b923b4d2ae8c624944b7e6249c44ea07effa0f494364739b347631b2ad29\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"bb02e57a4fd3f9670efec849cce2f53fa0036e7b70ecd47f9051ae7df087c50f\""
Mar 7 00:58:28.966456 containerd[2014]: time="2026-03-07T00:58:28.966387879Z" level=info msg="StartContainer for \"bb02e57a4fd3f9670efec849cce2f53fa0036e7b70ecd47f9051ae7df087c50f\""
Mar 7 00:58:29.003545 systemd[1]: Started cri-containerd-f5cb4b3dd8fead0ef87b9b85f327579791d3328826c4e52293c34fdd64a3a843.scope - libcontainer container f5cb4b3dd8fead0ef87b9b85f327579791d3328826c4e52293c34fdd64a3a843.
Mar 7 00:58:29.040464 systemd[1]: Started cri-containerd-bb02e57a4fd3f9670efec849cce2f53fa0036e7b70ecd47f9051ae7df087c50f.scope - libcontainer container bb02e57a4fd3f9670efec849cce2f53fa0036e7b70ecd47f9051ae7df087c50f.
Mar 7 00:58:29.109496 containerd[2014]: time="2026-03-07T00:58:29.108957227Z" level=info msg="StartContainer for \"f5cb4b3dd8fead0ef87b9b85f327579791d3328826c4e52293c34fdd64a3a843\" returns successfully"
Mar 7 00:58:29.137644 containerd[2014]: time="2026-03-07T00:58:29.137571395Z" level=info msg="StartContainer for \"bb02e57a4fd3f9670efec849cce2f53fa0036e7b70ecd47f9051ae7df087c50f\" returns successfully"
Mar 7 00:58:29.871687 kubelet[3248]: E0307 00:58:29.871619 3248 controller.go:195] "Failed to update lease" err="Put \"https://172.31.19.200:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-19-200?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 7 00:58:34.957866 systemd[1]: cri-containerd-ccbc9fa623a6545e82e9bbd1569367a8cc364779606a34a348b2e5982de3ea0e.scope: Deactivated successfully.
Mar 7 00:58:34.958366 systemd[1]: cri-containerd-ccbc9fa623a6545e82e9bbd1569367a8cc364779606a34a348b2e5982de3ea0e.scope: Consumed 4.247s CPU time, 14.1M memory peak, 0B memory swap peak.
Mar 7 00:58:35.003277 containerd[2014]: time="2026-03-07T00:58:35.001884064Z" level=info msg="shim disconnected" id=ccbc9fa623a6545e82e9bbd1569367a8cc364779606a34a348b2e5982de3ea0e namespace=k8s.io
Mar 7 00:58:35.003277 containerd[2014]: time="2026-03-07T00:58:35.002005372Z" level=warning msg="cleaning up after shim disconnected" id=ccbc9fa623a6545e82e9bbd1569367a8cc364779606a34a348b2e5982de3ea0e namespace=k8s.io
Mar 7 00:58:35.003277 containerd[2014]: time="2026-03-07T00:58:35.002028892Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 7 00:58:35.009929 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ccbc9fa623a6545e82e9bbd1569367a8cc364779606a34a348b2e5982de3ea0e-rootfs.mount: Deactivated successfully.
Mar 7 00:58:35.530022 systemd[1]: run-containerd-runc-k8s.io-09dc441534efc587a70fa9b2987c2a2dd5bc6bb33e10a40ad1e29281d3359e5b-runc.BPDr5H.mount: Deactivated successfully.
Mar 7 00:58:35.941842 kubelet[3248]: I0307 00:58:35.941521 3248 scope.go:117] "RemoveContainer" containerID="ccbc9fa623a6545e82e9bbd1569367a8cc364779606a34a348b2e5982de3ea0e"
Mar 7 00:58:35.945873 containerd[2014]: time="2026-03-07T00:58:35.945766641Z" level=info msg="CreateContainer within sandbox \"4ed4c945d98a42545f7fd207957d549fec01b452aa30d9e9e6f59c9400cd3e39\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Mar 7 00:58:35.976982 containerd[2014]: time="2026-03-07T00:58:35.976901253Z" level=info msg="CreateContainer within sandbox \"4ed4c945d98a42545f7fd207957d549fec01b452aa30d9e9e6f59c9400cd3e39\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"e7be981307330fe976e26af2e1609f5690d3b32e99274c9d3297d897f74c881d\""
Mar 7 00:58:35.979487 containerd[2014]: time="2026-03-07T00:58:35.979112949Z" level=info msg="StartContainer for \"e7be981307330fe976e26af2e1609f5690d3b32e99274c9d3297d897f74c881d\""
Mar 7 00:58:36.040586 systemd[1]: run-containerd-runc-k8s.io-e7be981307330fe976e26af2e1609f5690d3b32e99274c9d3297d897f74c881d-runc.Vj1Bou.mount: Deactivated successfully.
Mar 7 00:58:36.052532 systemd[1]: Started cri-containerd-e7be981307330fe976e26af2e1609f5690d3b32e99274c9d3297d897f74c881d.scope - libcontainer container e7be981307330fe976e26af2e1609f5690d3b32e99274c9d3297d897f74c881d.
Mar 7 00:58:36.121797 containerd[2014]: time="2026-03-07T00:58:36.121629846Z" level=info msg="StartContainer for \"e7be981307330fe976e26af2e1609f5690d3b32e99274c9d3297d897f74c881d\" returns successfully"
Mar 7 00:58:39.872681 kubelet[3248]: E0307 00:58:39.872605 3248 controller.go:195] "Failed to update lease" err="Put \"https://172.31.19.200:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-19-200?timeout=10s\": context deadline exceeded"
Mar 7 00:58:40.631013 systemd[1]: cri-containerd-f5cb4b3dd8fead0ef87b9b85f327579791d3328826c4e52293c34fdd64a3a843.scope: Deactivated successfully.
Mar 7 00:58:40.672632 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f5cb4b3dd8fead0ef87b9b85f327579791d3328826c4e52293c34fdd64a3a843-rootfs.mount: Deactivated successfully.
Mar 7 00:58:40.682593 containerd[2014]: time="2026-03-07T00:58:40.682512709Z" level=info msg="shim disconnected" id=f5cb4b3dd8fead0ef87b9b85f327579791d3328826c4e52293c34fdd64a3a843 namespace=k8s.io
Mar 7 00:58:40.682593 containerd[2014]: time="2026-03-07T00:58:40.682589173Z" level=warning msg="cleaning up after shim disconnected" id=f5cb4b3dd8fead0ef87b9b85f327579791d3328826c4e52293c34fdd64a3a843 namespace=k8s.io
Mar 7 00:58:40.683600 containerd[2014]: time="2026-03-07T00:58:40.682610941Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 7 00:58:40.961697 kubelet[3248]: I0307 00:58:40.961641 3248 scope.go:117] "RemoveContainer" containerID="f5cb4b3dd8fead0ef87b9b85f327579791d3328826c4e52293c34fdd64a3a843"
Mar 7 00:58:40.963299 kubelet[3248]: E0307 00:58:40.963246 3248 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-6bf85f8dd-7tlcl_tigera-operator(d616e452-89c8-4902-a4f6-09b65a643998)\"" pod="tigera-operator/tigera-operator-6bf85f8dd-7tlcl" podUID="d616e452-89c8-4902-a4f6-09b65a643998"
Mar 7 00:58:40.963623 kubelet[3248]: I0307 00:58:40.963523 3248 scope.go:117] "RemoveContainer" containerID="183bdac68c5b3022aab87767c5bab814d4f7a8138fbfd3b571668c5beba1e1e5"
Mar 7 00:58:40.966640 containerd[2014]: time="2026-03-07T00:58:40.966586466Z" level=info msg="RemoveContainer for \"183bdac68c5b3022aab87767c5bab814d4f7a8138fbfd3b571668c5beba1e1e5\""
Mar 7 00:58:40.973390 containerd[2014]: time="2026-03-07T00:58:40.973331378Z" level=info msg="RemoveContainer for \"183bdac68c5b3022aab87767c5bab814d4f7a8138fbfd3b571668c5beba1e1e5\" returns successfully"
Mar 7 00:58:49.874216 kubelet[3248]: E0307 00:58:49.873376 3248 controller.go:195] "Failed to update lease" err="Put \"https://172.31.19.200:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-19-200?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"