Mar 17 17:52:22.145790 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd083] Mar 17 17:52:22.145834 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.43 p3) 2.43.1) #1 SMP PREEMPT Mon Mar 17 16:11:40 -00 2025 Mar 17 17:52:22.145859 kernel: KASLR disabled due to lack of seed Mar 17 17:52:22.145875 kernel: efi: EFI v2.7 by EDK II Mar 17 17:52:22.145890 kernel: efi: SMBIOS=0x7bed0000 SMBIOS 3.0=0x7beb0000 ACPI=0x786e0000 ACPI 2.0=0x786e0014 MEMATTR=0x7a736a98 MEMRESERVE=0x78557598 Mar 17 17:52:22.145905 kernel: secureboot: Secure boot disabled Mar 17 17:52:22.145922 kernel: ACPI: Early table checksum verification disabled Mar 17 17:52:22.145936 kernel: ACPI: RSDP 0x00000000786E0014 000024 (v02 AMAZON) Mar 17 17:52:22.145951 kernel: ACPI: XSDT 0x00000000786D00E8 000064 (v01 AMAZON AMZNFACP 00000001 01000013) Mar 17 17:52:22.145966 kernel: ACPI: FACP 0x00000000786B0000 000114 (v06 AMAZON AMZNFACP 00000001 AMZN 00000001) Mar 17 17:52:22.145986 kernel: ACPI: DSDT 0x0000000078640000 00159D (v02 AMAZON AMZNDSDT 00000001 INTL 20160527) Mar 17 17:52:22.146001 kernel: ACPI: APIC 0x00000000786C0000 000108 (v04 AMAZON AMZNAPIC 00000001 AMZN 00000001) Mar 17 17:52:22.146016 kernel: ACPI: SPCR 0x00000000786A0000 000050 (v02 AMAZON AMZNSPCR 00000001 AMZN 00000001) Mar 17 17:52:22.146031 kernel: ACPI: GTDT 0x0000000078690000 000060 (v02 AMAZON AMZNGTDT 00000001 AMZN 00000001) Mar 17 17:52:22.146049 kernel: ACPI: MCFG 0x0000000078680000 00003C (v02 AMAZON AMZNMCFG 00000001 AMZN 00000001) Mar 17 17:52:22.146069 kernel: ACPI: SLIT 0x0000000078670000 00002D (v01 AMAZON AMZNSLIT 00000001 AMZN 00000001) Mar 17 17:52:22.146086 kernel: ACPI: IORT 0x0000000078660000 000078 (v01 AMAZON AMZNIORT 00000001 AMZN 00000001) Mar 17 17:52:22.146101 kernel: ACPI: PPTT 0x0000000078650000 0000EC (v01 AMAZON AMZNPPTT 00000001 AMZN 00000001) Mar 17 17:52:22.146117 kernel: ACPI: SPCR: console: uart,mmio,0x90a0000,115200 Mar 17 17:52:22.146133 kernel: earlycon: uart0 at MMIO 0x00000000090a0000 (options '115200') Mar 17 17:52:22.146148 kernel: printk: bootconsole [uart0] enabled Mar 17 17:52:22.148233 kernel: NUMA: Failed to initialise from firmware Mar 17 17:52:22.148257 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000004b5ffffff] Mar 17 17:52:22.148274 kernel: NUMA: NODE_DATA [mem 0x4b583f800-0x4b5844fff] Mar 17 17:52:22.148290 kernel: Zone ranges: Mar 17 17:52:22.148306 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] Mar 17 17:52:22.148333 kernel: DMA32 empty Mar 17 17:52:22.148349 kernel: Normal [mem 0x0000000100000000-0x00000004b5ffffff] Mar 17 17:52:22.148365 kernel: Movable zone start for each node Mar 17 17:52:22.148380 kernel: Early memory node ranges Mar 17 17:52:22.148396 kernel: node 0: [mem 0x0000000040000000-0x000000007862ffff] Mar 17 17:52:22.148412 kernel: node 0: [mem 0x0000000078630000-0x000000007863ffff] Mar 17 17:52:22.148427 kernel: node 0: [mem 0x0000000078640000-0x00000000786effff] Mar 17 17:52:22.148443 kernel: node 0: [mem 0x00000000786f0000-0x000000007872ffff] Mar 17 17:52:22.148458 kernel: node 0: [mem 0x0000000078730000-0x000000007bbfffff] Mar 17 17:52:22.148474 kernel: node 0: [mem 0x000000007bc00000-0x000000007bfdffff] Mar 17 17:52:22.148489 kernel: node 0: [mem 0x000000007bfe0000-0x000000007fffffff] Mar 17 17:52:22.148504 kernel: node 0: [mem 0x0000000400000000-0x00000004b5ffffff] Mar 17 17:52:22.148525 kernel: Initmem setup node 0 [mem 
0x0000000040000000-0x00000004b5ffffff] Mar 17 17:52:22.148541 kernel: On node 0, zone Normal: 8192 pages in unavailable ranges Mar 17 17:52:22.148564 kernel: psci: probing for conduit method from ACPI. Mar 17 17:52:22.148581 kernel: psci: PSCIv1.0 detected in firmware. Mar 17 17:52:22.148597 kernel: psci: Using standard PSCI v0.2 function IDs Mar 17 17:52:22.148618 kernel: psci: Trusted OS migration not required Mar 17 17:52:22.148635 kernel: psci: SMC Calling Convention v1.1 Mar 17 17:52:22.148651 kernel: percpu: Embedded 31 pages/cpu s86696 r8192 d32088 u126976 Mar 17 17:52:22.148667 kernel: pcpu-alloc: s86696 r8192 d32088 u126976 alloc=31*4096 Mar 17 17:52:22.148684 kernel: pcpu-alloc: [0] 0 [0] 1 Mar 17 17:52:22.148701 kernel: Detected PIPT I-cache on CPU0 Mar 17 17:52:22.148717 kernel: CPU features: detected: GIC system register CPU interface Mar 17 17:52:22.148734 kernel: CPU features: detected: Spectre-v2 Mar 17 17:52:22.148750 kernel: CPU features: detected: Spectre-v3a Mar 17 17:52:22.148767 kernel: CPU features: detected: Spectre-BHB Mar 17 17:52:22.148783 kernel: CPU features: detected: ARM erratum 1742098 Mar 17 17:52:22.148799 kernel: CPU features: detected: ARM errata 1165522, 1319367, or 1530923 Mar 17 17:52:22.148821 kernel: alternatives: applying boot alternatives Mar 17 17:52:22.148839 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=f8298a09e890fc732131b7281e24befaf65b596eb5216e969c8eca4cab4a2b3a Mar 17 17:52:22.148857 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Mar 17 17:52:22.148874 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Mar 17 17:52:22.148891 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 17 17:52:22.148907 kernel: Fallback order for Node 0: 0 Mar 17 17:52:22.148924 kernel: Built 1 zonelists, mobility grouping on. Total pages: 991872 Mar 17 17:52:22.148940 kernel: Policy zone: Normal Mar 17 17:52:22.148956 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Mar 17 17:52:22.148972 kernel: software IO TLB: area num 2. Mar 17 17:52:22.148994 kernel: software IO TLB: mapped [mem 0x000000007c000000-0x0000000080000000] (64MB) Mar 17 17:52:22.149011 kernel: Memory: 3821240K/4030464K available (10304K kernel code, 2186K rwdata, 8096K rodata, 38336K init, 897K bss, 209224K reserved, 0K cma-reserved) Mar 17 17:52:22.149028 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Mar 17 17:52:22.149045 kernel: rcu: Preemptible hierarchical RCU implementation. Mar 17 17:52:22.149063 kernel: rcu: RCU event tracing is enabled. Mar 17 17:52:22.149080 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Mar 17 17:52:22.149097 kernel: Trampoline variant of Tasks RCU enabled. Mar 17 17:52:22.149114 kernel: Tracing variant of Tasks RCU enabled. Mar 17 17:52:22.149130 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Mar 17 17:52:22.149147 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Mar 17 17:52:22.149187 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Mar 17 17:52:22.149212 kernel: GICv3: 96 SPIs implemented Mar 17 17:52:22.149229 kernel: GICv3: 0 Extended SPIs implemented Mar 17 17:52:22.149246 kernel: Root IRQ handler: gic_handle_irq Mar 17 17:52:22.149262 kernel: GICv3: GICv3 features: 16 PPIs Mar 17 17:52:22.149279 kernel: GICv3: CPU0: found redistributor 0 region 0:0x0000000010200000 Mar 17 17:52:22.149295 kernel: ITS [mem 0x10080000-0x1009ffff] Mar 17 17:52:22.149312 kernel: ITS@0x0000000010080000: allocated 8192 Devices @4000b0000 (indirect, esz 8, psz 64K, shr 1) Mar 17 17:52:22.149329 kernel: ITS@0x0000000010080000: allocated 8192 Interrupt Collections @4000c0000 (flat, esz 8, psz 64K, shr 1) Mar 17 17:52:22.149346 kernel: GICv3: using LPI property table @0x00000004000d0000 Mar 17 17:52:22.149362 kernel: ITS: Using hypervisor restricted LPI range [128] Mar 17 17:52:22.149379 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000004000e0000 Mar 17 17:52:22.149395 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Mar 17 17:52:22.149417 kernel: arch_timer: cp15 timer(s) running at 83.33MHz (virt). Mar 17 17:52:22.149434 kernel: clocksource: arch_sys_counter: mask: 0x1ffffffffffffff max_cycles: 0x13381ebeec, max_idle_ns: 440795203145 ns Mar 17 17:52:22.149451 kernel: sched_clock: 57 bits at 83MHz, resolution 12ns, wraps every 4398046511100ns Mar 17 17:52:22.149468 kernel: Console: colour dummy device 80x25 Mar 17 17:52:22.149487 kernel: printk: console [tty1] enabled Mar 17 17:52:22.149504 kernel: ACPI: Core revision 20230628 Mar 17 17:52:22.149522 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 166.66 BogoMIPS (lpj=83333) Mar 17 17:52:22.149541 kernel: pid_max: default: 32768 minimum: 301 Mar 17 17:52:22.149558 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Mar 17 17:52:22.149575 kernel: landlock: Up and running. Mar 17 17:52:22.149598 kernel: SELinux: Initializing. Mar 17 17:52:22.149615 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 17 17:52:22.149632 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 17 17:52:22.149649 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Mar 17 17:52:22.149666 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Mar 17 17:52:22.149683 kernel: rcu: Hierarchical SRCU implementation. Mar 17 17:52:22.149701 kernel: rcu: Max phase no-delay instances is 400. Mar 17 17:52:22.149718 kernel: Platform MSI: ITS@0x10080000 domain created Mar 17 17:52:22.149739 kernel: PCI/MSI: ITS@0x10080000 domain created Mar 17 17:52:22.149756 kernel: Remapping and enabling EFI services. Mar 17 17:52:22.149773 kernel: smp: Bringing up secondary CPUs ... Mar 17 17:52:22.149790 kernel: Detected PIPT I-cache on CPU1 Mar 17 17:52:22.149807 kernel: GICv3: CPU1: found redistributor 1 region 0:0x0000000010220000 Mar 17 17:52:22.149824 kernel: GICv3: CPU1: using allocated LPI pending table @0x00000004000f0000 Mar 17 17:52:22.149841 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd083] Mar 17 17:52:22.149858 kernel: smp: Brought up 1 node, 2 CPUs Mar 17 17:52:22.149874 kernel: SMP: Total of 2 processors activated. 
Mar 17 17:52:22.149891 kernel: CPU features: detected: 32-bit EL0 Support Mar 17 17:52:22.149912 kernel: CPU features: detected: 32-bit EL1 Support Mar 17 17:52:22.149929 kernel: CPU features: detected: CRC32 instructions Mar 17 17:52:22.149957 kernel: CPU: All CPU(s) started at EL1 Mar 17 17:52:22.149980 kernel: alternatives: applying system-wide alternatives Mar 17 17:52:22.149997 kernel: devtmpfs: initialized Mar 17 17:52:22.150015 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Mar 17 17:52:22.150033 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Mar 17 17:52:22.150050 kernel: pinctrl core: initialized pinctrl subsystem Mar 17 17:52:22.150068 kernel: SMBIOS 3.0.0 present. Mar 17 17:52:22.150091 kernel: DMI: Amazon EC2 a1.large/, BIOS 1.0 11/1/2018 Mar 17 17:52:22.150108 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Mar 17 17:52:22.150126 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Mar 17 17:52:22.150144 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Mar 17 17:52:22.152929 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Mar 17 17:52:22.152964 kernel: audit: initializing netlink subsys (disabled) Mar 17 17:52:22.152982 kernel: audit: type=2000 audit(0.218:1): state=initialized audit_enabled=0 res=1 Mar 17 17:52:22.153012 kernel: thermal_sys: Registered thermal governor 'step_wise' Mar 17 17:52:22.153030 kernel: cpuidle: using governor menu Mar 17 17:52:22.153048 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. Mar 17 17:52:22.153066 kernel: ASID allocator initialised with 65536 entries Mar 17 17:52:22.153084 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Mar 17 17:52:22.153101 kernel: Serial: AMBA PL011 UART driver Mar 17 17:52:22.153119 kernel: Modules: 17760 pages in range for non-PLT usage Mar 17 17:52:22.153137 kernel: Modules: 509280 pages in range for PLT usage Mar 17 17:52:22.153286 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Mar 17 17:52:22.153535 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Mar 17 17:52:22.153557 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Mar 17 17:52:22.153575 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Mar 17 17:52:22.153594 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Mar 17 17:52:22.153612 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Mar 17 17:52:22.153630 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Mar 17 17:52:22.153648 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Mar 17 17:52:22.153666 kernel: ACPI: Added _OSI(Module Device) Mar 17 17:52:22.153684 kernel: ACPI: Added _OSI(Processor Device) Mar 17 17:52:22.153708 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Mar 17 17:52:22.153727 kernel: ACPI: Added _OSI(Processor Aggregator Device) Mar 17 17:52:22.153745 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Mar 17 17:52:22.153763 kernel: ACPI: Interpreter enabled Mar 17 17:52:22.153780 kernel: ACPI: Using GIC for interrupt routing Mar 17 17:52:22.153798 kernel: ACPI: MCFG table detected, 1 entries Mar 17 17:52:22.153816 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-0f]) Mar 17 17:52:22.154123 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Mar 17 17:52:22.157805 kernel: acpi 
PNP0A08:00: _OSC: platform does not support [LTR] Mar 17 17:52:22.158033 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Mar 17 17:52:22.160119 kernel: acpi PNP0A08:00: ECAM area [mem 0x20000000-0x20ffffff] reserved by PNP0C02:00 Mar 17 17:52:22.160398 kernel: acpi PNP0A08:00: ECAM at [mem 0x20000000-0x20ffffff] for [bus 00-0f] Mar 17 17:52:22.160427 kernel: ACPI: Remapped I/O 0x000000001fff0000 to [io 0x0000-0xffff window] Mar 17 17:52:22.160447 kernel: acpiphp: Slot [1] registered Mar 17 17:52:22.160466 kernel: acpiphp: Slot [2] registered Mar 17 17:52:22.160486 kernel: acpiphp: Slot [3] registered Mar 17 17:52:22.160518 kernel: acpiphp: Slot [4] registered Mar 17 17:52:22.160538 kernel: acpiphp: Slot [5] registered Mar 17 17:52:22.160558 kernel: acpiphp: Slot [6] registered Mar 17 17:52:22.160576 kernel: acpiphp: Slot [7] registered Mar 17 17:52:22.160594 kernel: acpiphp: Slot [8] registered Mar 17 17:52:22.160612 kernel: acpiphp: Slot [9] registered Mar 17 17:52:22.160631 kernel: acpiphp: Slot [10] registered Mar 17 17:52:22.160649 kernel: acpiphp: Slot [11] registered Mar 17 17:52:22.160666 kernel: acpiphp: Slot [12] registered Mar 17 17:52:22.160684 kernel: acpiphp: Slot [13] registered Mar 17 17:52:22.160707 kernel: acpiphp: Slot [14] registered Mar 17 17:52:22.160724 kernel: acpiphp: Slot [15] registered Mar 17 17:52:22.160742 kernel: acpiphp: Slot [16] registered Mar 17 17:52:22.160759 kernel: acpiphp: Slot [17] registered Mar 17 17:52:22.160777 kernel: acpiphp: Slot [18] registered Mar 17 17:52:22.160794 kernel: acpiphp: Slot [19] registered Mar 17 17:52:22.160812 kernel: acpiphp: Slot [20] registered Mar 17 17:52:22.160829 kernel: acpiphp: Slot [21] registered Mar 17 17:52:22.160846 kernel: acpiphp: Slot [22] registered Mar 17 17:52:22.160868 kernel: acpiphp: Slot [23] registered Mar 17 17:52:22.160887 kernel: acpiphp: Slot [24] registered Mar 17 17:52:22.160904 kernel: acpiphp: Slot [25] registered Mar 17 17:52:22.160921 kernel: acpiphp: Slot [26] registered Mar 17 17:52:22.160939 kernel: acpiphp: Slot [27] registered Mar 17 17:52:22.160956 kernel: acpiphp: Slot [28] registered Mar 17 17:52:22.160974 kernel: acpiphp: Slot [29] registered Mar 17 17:52:22.160991 kernel: acpiphp: Slot [30] registered Mar 17 17:52:22.161008 kernel: acpiphp: Slot [31] registered Mar 17 17:52:22.161026 kernel: PCI host bridge to bus 0000:00 Mar 17 17:52:22.161286 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xffffffff window] Mar 17 17:52:22.161475 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Mar 17 17:52:22.161656 kernel: pci_bus 0000:00: root bus resource [mem 0x400000000000-0x407fffffffff window] Mar 17 17:52:22.161838 kernel: pci_bus 0000:00: root bus resource [bus 00-0f] Mar 17 17:52:22.162091 kernel: pci 0000:00:00.0: [1d0f:0200] type 00 class 0x060000 Mar 17 17:52:22.164523 kernel: pci 0000:00:01.0: [1d0f:8250] type 00 class 0x070003 Mar 17 17:52:22.164774 kernel: pci 0000:00:01.0: reg 0x10: [mem 0x80118000-0x80118fff] Mar 17 17:52:22.164998 kernel: pci 0000:00:04.0: [1d0f:8061] type 00 class 0x010802 Mar 17 17:52:22.165237 kernel: pci 0000:00:04.0: reg 0x10: [mem 0x80114000-0x80117fff] Mar 17 17:52:22.165453 kernel: pci 0000:00:04.0: PME# supported from D0 D1 D2 D3hot D3cold Mar 17 17:52:22.165681 kernel: pci 0000:00:05.0: [1d0f:ec20] type 00 class 0x020000 Mar 17 17:52:22.165894 kernel: pci 0000:00:05.0: reg 0x10: [mem 0x80110000-0x80113fff] Mar 17 17:52:22.166101 kernel: pci 0000:00:05.0: reg 0x18: [mem 
0x80000000-0x800fffff pref] Mar 17 17:52:22.168444 kernel: pci 0000:00:05.0: reg 0x20: [mem 0x80100000-0x8010ffff] Mar 17 17:52:22.168670 kernel: pci 0000:00:05.0: PME# supported from D0 D1 D2 D3hot D3cold Mar 17 17:52:22.168881 kernel: pci 0000:00:05.0: BAR 2: assigned [mem 0x80000000-0x800fffff pref] Mar 17 17:52:22.169097 kernel: pci 0000:00:05.0: BAR 4: assigned [mem 0x80100000-0x8010ffff] Mar 17 17:52:22.169352 kernel: pci 0000:00:04.0: BAR 0: assigned [mem 0x80110000-0x80113fff] Mar 17 17:52:22.169557 kernel: pci 0000:00:05.0: BAR 0: assigned [mem 0x80114000-0x80117fff] Mar 17 17:52:22.169766 kernel: pci 0000:00:01.0: BAR 0: assigned [mem 0x80118000-0x80118fff] Mar 17 17:52:22.169961 kernel: pci_bus 0000:00: resource 4 [mem 0x80000000-0xffffffff window] Mar 17 17:52:22.170141 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Mar 17 17:52:22.171194 kernel: pci_bus 0000:00: resource 6 [mem 0x400000000000-0x407fffffffff window] Mar 17 17:52:22.171224 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Mar 17 17:52:22.171243 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Mar 17 17:52:22.171262 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Mar 17 17:52:22.171280 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Mar 17 17:52:22.171297 kernel: iommu: Default domain type: Translated Mar 17 17:52:22.171325 kernel: iommu: DMA domain TLB invalidation policy: strict mode Mar 17 17:52:22.171343 kernel: efivars: Registered efivars operations Mar 17 17:52:22.171360 kernel: vgaarb: loaded Mar 17 17:52:22.171378 kernel: clocksource: Switched to clocksource arch_sys_counter Mar 17 17:52:22.171395 kernel: VFS: Disk quotas dquot_6.6.0 Mar 17 17:52:22.171413 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 17 17:52:22.171430 kernel: pnp: PnP ACPI init Mar 17 17:52:22.171645 kernel: system 00:00: [mem 0x20000000-0x2fffffff] could not be reserved Mar 17 17:52:22.171677 kernel: pnp: PnP ACPI: found 1 devices Mar 17 17:52:22.171695 kernel: NET: Registered PF_INET protocol family Mar 17 17:52:22.171714 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Mar 17 17:52:22.171732 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Mar 17 17:52:22.171750 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 17 17:52:22.171768 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 17 17:52:22.171786 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Mar 17 17:52:22.171804 kernel: TCP: Hash tables configured (established 32768 bind 32768) Mar 17 17:52:22.171821 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 17 17:52:22.171844 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 17 17:52:22.171862 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 17 17:52:22.171880 kernel: PCI: CLS 0 bytes, default 64 Mar 17 17:52:22.171898 kernel: kvm [1]: HYP mode not available Mar 17 17:52:22.171916 kernel: Initialise system trusted keyrings Mar 17 17:52:22.171934 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 17 17:52:22.171952 kernel: Key type asymmetric registered Mar 17 17:52:22.171970 kernel: Asymmetric key parser 'x509' registered Mar 17 17:52:22.171987 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Mar 17 17:52:22.172010 kernel: io scheduler mq-deadline registered Mar 17 
17:52:22.172028 kernel: io scheduler kyber registered Mar 17 17:52:22.172045 kernel: io scheduler bfq registered Mar 17 17:52:22.172291 kernel: pl061_gpio ARMH0061:00: PL061 GPIO chip registered Mar 17 17:52:22.172320 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Mar 17 17:52:22.172338 kernel: ACPI: button: Power Button [PWRB] Mar 17 17:52:22.172356 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input1 Mar 17 17:52:22.172378 kernel: ACPI: button: Sleep Button [SLPB] Mar 17 17:52:22.172405 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 17 17:52:22.172424 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Mar 17 17:52:22.172638 kernel: serial 0000:00:01.0: enabling device (0010 -> 0012) Mar 17 17:52:22.172664 kernel: printk: console [ttyS0] disabled Mar 17 17:52:22.172682 kernel: 0000:00:01.0: ttyS0 at MMIO 0x80118000 (irq = 14, base_baud = 115200) is a 16550A Mar 17 17:52:22.172700 kernel: printk: console [ttyS0] enabled Mar 17 17:52:22.172719 kernel: printk: bootconsole [uart0] disabled Mar 17 17:52:22.172771 kernel: thunder_xcv, ver 1.0 Mar 17 17:52:22.172811 kernel: thunder_bgx, ver 1.0 Mar 17 17:52:22.172830 kernel: nicpf, ver 1.0 Mar 17 17:52:22.172855 kernel: nicvf, ver 1.0 Mar 17 17:52:22.174357 kernel: rtc-efi rtc-efi.0: registered as rtc0 Mar 17 17:52:22.174572 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-03-17T17:52:21 UTC (1742233941) Mar 17 17:52:22.174597 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 17 17:52:22.174616 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 3 counters available Mar 17 17:52:22.174634 kernel: watchdog: Delayed init of the lockup detector failed: -19 Mar 17 17:52:22.174652 kernel: watchdog: Hard watchdog permanently disabled Mar 17 17:52:22.174680 kernel: NET: Registered PF_INET6 protocol family Mar 17 17:52:22.174717 kernel: Segment Routing with IPv6 Mar 17 17:52:22.174737 kernel: In-situ OAM (IOAM) with IPv6 Mar 17 17:52:22.174755 kernel: NET: Registered PF_PACKET protocol family Mar 17 17:52:22.174773 kernel: Key type dns_resolver registered Mar 17 17:52:22.174791 kernel: registered taskstats version 1 Mar 17 17:52:22.174808 kernel: Loading compiled-in X.509 certificates Mar 17 17:52:22.174827 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: f4ff2820cf7379ce82b759137d15b536f0a99b51' Mar 17 17:52:22.174845 kernel: Key type .fscrypt registered Mar 17 17:52:22.174862 kernel: Key type fscrypt-provisioning registered Mar 17 17:52:22.174887 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 17 17:52:22.174905 kernel: ima: Allocated hash algorithm: sha1 Mar 17 17:52:22.174924 kernel: ima: No architecture policies found Mar 17 17:52:22.174942 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Mar 17 17:52:22.174960 kernel: clk: Disabling unused clocks Mar 17 17:52:22.174978 kernel: Freeing unused kernel memory: 38336K Mar 17 17:52:22.174996 kernel: Run /init as init process Mar 17 17:52:22.175014 kernel: with arguments: Mar 17 17:52:22.175032 kernel: /init Mar 17 17:52:22.175054 kernel: with environment: Mar 17 17:52:22.175072 kernel: HOME=/ Mar 17 17:52:22.175090 kernel: TERM=linux Mar 17 17:52:22.175107 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 17 17:52:22.175127 systemd[1]: Successfully made /usr/ read-only. 
Mar 17 17:52:22.175152 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 17 17:52:22.175203 systemd[1]: Detected virtualization amazon. Mar 17 17:52:22.175230 systemd[1]: Detected architecture arm64. Mar 17 17:52:22.175250 systemd[1]: Running in initrd. Mar 17 17:52:22.175271 systemd[1]: No hostname configured, using default hostname. Mar 17 17:52:22.175293 systemd[1]: Hostname set to . Mar 17 17:52:22.175312 systemd[1]: Initializing machine ID from VM UUID. Mar 17 17:52:22.175332 systemd[1]: Queued start job for default target initrd.target. Mar 17 17:52:22.175352 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 17 17:52:22.175371 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 17 17:52:22.175392 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 17 17:52:22.175418 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 17 17:52:22.175437 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 17 17:52:22.175459 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 17 17:52:22.175482 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 17 17:52:22.175503 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 17 17:52:22.175523 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 17 17:52:22.175548 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 17 17:52:22.175568 systemd[1]: Reached target paths.target - Path Units. Mar 17 17:52:22.175588 systemd[1]: Reached target slices.target - Slice Units. Mar 17 17:52:22.175607 systemd[1]: Reached target swap.target - Swaps. Mar 17 17:52:22.175627 systemd[1]: Reached target timers.target - Timer Units. Mar 17 17:52:22.175647 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 17 17:52:22.175667 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 17 17:52:22.175687 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 17 17:52:22.175707 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Mar 17 17:52:22.175732 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 17 17:52:22.175752 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 17 17:52:22.175772 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 17 17:52:22.175792 systemd[1]: Reached target sockets.target - Socket Units. Mar 17 17:52:22.175811 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 17 17:52:22.175831 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 17 17:52:22.175850 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 17 17:52:22.175870 systemd[1]: Starting systemd-fsck-usr.service... 
Mar 17 17:52:22.175894 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 17 17:52:22.175914 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 17 17:52:22.175934 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 17 17:52:22.175954 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 17 17:52:22.176024 systemd-journald[251]: Collecting audit messages is disabled. Mar 17 17:52:22.176075 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 17 17:52:22.176097 systemd[1]: Finished systemd-fsck-usr.service. Mar 17 17:52:22.176116 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 17 17:52:22.176136 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 17 17:52:22.176183 kernel: Bridge firewalling registered Mar 17 17:52:22.176206 systemd-journald[251]: Journal started Mar 17 17:52:22.176244 systemd-journald[251]: Runtime Journal (/run/log/journal/ec26bd0039ed3c383c98c74e154879c6) is 8M, max 75.3M, 67.3M free. Mar 17 17:52:22.131109 systemd-modules-load[252]: Inserted module 'overlay' Mar 17 17:52:22.181961 systemd[1]: Started systemd-journald.service - Journal Service. Mar 17 17:52:22.175222 systemd-modules-load[252]: Inserted module 'br_netfilter' Mar 17 17:52:22.189791 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 17 17:52:22.192716 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:52:22.204431 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 17 17:52:22.210340 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 17 17:52:22.225400 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 17 17:52:22.243895 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 17 17:52:22.259512 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 17 17:52:22.272064 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 17 17:52:22.280748 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 17 17:52:22.297526 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 17 17:52:22.302049 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 17 17:52:22.311113 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 17 17:52:22.327690 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... 
Mar 17 17:52:22.352214 dracut-cmdline[292]: dracut-dracut-053 Mar 17 17:52:22.358862 dracut-cmdline[292]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=f8298a09e890fc732131b7281e24befaf65b596eb5216e969c8eca4cab4a2b3a Mar 17 17:52:22.402032 systemd-resolved[289]: Positive Trust Anchors: Mar 17 17:52:22.402072 systemd-resolved[289]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 17 17:52:22.402134 systemd-resolved[289]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 17 17:52:22.541190 kernel: SCSI subsystem initialized Mar 17 17:52:22.548187 kernel: Loading iSCSI transport class v2.0-870. Mar 17 17:52:22.561201 kernel: iscsi: registered transport (tcp) Mar 17 17:52:22.582641 kernel: iscsi: registered transport (qla4xxx) Mar 17 17:52:22.582728 kernel: QLogic iSCSI HBA Driver Mar 17 17:52:22.643197 kernel: random: crng init done Mar 17 17:52:22.643478 systemd-resolved[289]: Defaulting to hostname 'linux'. Mar 17 17:52:22.646112 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 17 17:52:22.650382 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 17 17:52:22.674631 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 17 17:52:22.681450 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 17 17:52:22.718769 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 17 17:52:22.718845 kernel: device-mapper: uevent: version 1.0.3 Mar 17 17:52:22.718871 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 17 17:52:22.801180 kernel: raid6: neonx8 gen() 6590 MB/s Mar 17 17:52:22.802208 kernel: raid6: neonx4 gen() 6569 MB/s Mar 17 17:52:22.819188 kernel: raid6: neonx2 gen() 5444 MB/s Mar 17 17:52:22.836188 kernel: raid6: neonx1 gen() 3949 MB/s Mar 17 17:52:22.853187 kernel: raid6: int64x8 gen() 3637 MB/s Mar 17 17:52:22.870188 kernel: raid6: int64x4 gen() 3713 MB/s Mar 17 17:52:22.887187 kernel: raid6: int64x2 gen() 3609 MB/s Mar 17 17:52:22.904963 kernel: raid6: int64x1 gen() 2758 MB/s Mar 17 17:52:22.904995 kernel: raid6: using algorithm neonx8 gen() 6590 MB/s Mar 17 17:52:22.922981 kernel: raid6: .... 
xor() 4674 MB/s, rmw enabled Mar 17 17:52:22.923018 kernel: raid6: using neon recovery algorithm Mar 17 17:52:22.930192 kernel: xor: measuring software checksum speed Mar 17 17:52:22.930250 kernel: 8regs : 11896 MB/sec Mar 17 17:52:22.932187 kernel: 32regs : 12014 MB/sec Mar 17 17:52:22.934220 kernel: arm64_neon : 8987 MB/sec Mar 17 17:52:22.934264 kernel: xor: using function: 32regs (12014 MB/sec) Mar 17 17:52:23.016218 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 17 17:52:23.034844 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 17 17:52:23.047503 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 17 17:52:23.091907 systemd-udevd[473]: Using default interface naming scheme 'v255'. Mar 17 17:52:23.102938 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 17 17:52:23.124520 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Mar 17 17:52:23.153251 dracut-pre-trigger[477]: rd.md=0: removing MD RAID activation Mar 17 17:52:23.208101 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 17 17:52:23.217430 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 17 17:52:23.338748 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 17 17:52:23.350783 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 17 17:52:23.400236 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 17 17:52:23.405758 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 17 17:52:23.418404 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 17 17:52:23.420693 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 17 17:52:23.442385 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 17 17:52:23.480670 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 17 17:52:23.515201 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Mar 17 17:52:23.515271 kernel: ena 0000:00:05.0: enabling device (0010 -> 0012) Mar 17 17:52:23.546358 kernel: ena 0000:00:05.0: ENA device version: 0.10 Mar 17 17:52:23.546610 kernel: ena 0000:00:05.0: ENA controller version: 0.0.1 implementation version 1 Mar 17 17:52:23.546870 kernel: ena 0000:00:05.0: Elastic Network Adapter (ENA) found at mem 80114000, mac addr 06:d4:73:1f:a3:6b Mar 17 17:52:23.571177 (udev-worker)[534]: Network interface NamePolicy= disabled on kernel command line. Mar 17 17:52:23.575924 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 17 17:52:23.578230 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 17 17:52:23.582494 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 17 17:52:23.587855 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 17 17:52:23.588174 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:52:23.594456 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 17 17:52:23.609663 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Mar 17 17:52:23.616198 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Mar 17 17:52:23.616236 kernel: nvme nvme0: pci function 0000:00:04.0 Mar 17 17:52:23.612562 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 17 17:52:23.629202 kernel: nvme nvme0: 2/0/0 default/read/poll queues Mar 17 17:52:23.637105 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 17 17:52:23.637197 kernel: GPT:9289727 != 16777215 Mar 17 17:52:23.637226 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 17 17:52:23.638567 kernel: GPT:9289727 != 16777215 Mar 17 17:52:23.638603 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 17 17:52:23.638628 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Mar 17 17:52:23.649921 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:52:23.665419 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 17 17:52:23.691257 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 17 17:52:23.815197 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/nvme0n1p6 scanned by (udev-worker) (533) Mar 17 17:52:23.841240 kernel: BTRFS: device fsid 5ecee764-de70-4de1-8711-3798360e0d13 devid 1 transid 39 /dev/nvme0n1p3 scanned by (udev-worker) (520) Mar 17 17:52:23.868863 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Amazon Elastic Block Store EFI-SYSTEM. Mar 17 17:52:23.926331 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Amazon Elastic Block Store ROOT. Mar 17 17:52:23.953151 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. Mar 17 17:52:23.991643 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Amazon Elastic Block Store USR-A. Mar 17 17:52:23.994662 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Amazon Elastic Block Store USR-A. Mar 17 17:52:24.008417 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 17 17:52:24.023747 disk-uuid[663]: Primary Header is updated. Mar 17 17:52:24.023747 disk-uuid[663]: Secondary Entries is updated. Mar 17 17:52:24.023747 disk-uuid[663]: Secondary Header is updated. Mar 17 17:52:24.035201 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Mar 17 17:52:25.051191 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Mar 17 17:52:25.054448 disk-uuid[664]: The operation has completed successfully. Mar 17 17:52:25.244329 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 17 17:52:25.244547 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 17 17:52:25.337417 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 17 17:52:25.345343 sh[925]: Success Mar 17 17:52:25.369632 kernel: device-mapper: verity: sha256 using implementation "sha256-ce" Mar 17 17:52:25.546037 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 17 17:52:25.562373 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 17 17:52:25.567220 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Mar 17 17:52:25.594510 kernel: BTRFS info (device dm-0): first mount of filesystem 5ecee764-de70-4de1-8711-3798360e0d13 Mar 17 17:52:25.594576 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Mar 17 17:52:25.596326 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Mar 17 17:52:25.596366 kernel: BTRFS info (device dm-0): disabling log replay at mount time Mar 17 17:52:25.597557 kernel: BTRFS info (device dm-0): using free space tree Mar 17 17:52:25.705211 kernel: BTRFS info (device dm-0): enabling ssd optimizations Mar 17 17:52:25.758854 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 17 17:52:25.762301 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 17 17:52:25.782475 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 17 17:52:25.787414 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Mar 17 17:52:25.829124 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 8369c249-c0a6-415d-8511-1f18dbf3bf45 Mar 17 17:52:25.830255 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm Mar 17 17:52:25.830307 kernel: BTRFS info (device nvme0n1p6): using free space tree Mar 17 17:52:25.837199 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Mar 17 17:52:25.857119 systemd[1]: mnt-oem.mount: Deactivated successfully. Mar 17 17:52:25.859474 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 8369c249-c0a6-415d-8511-1f18dbf3bf45 Mar 17 17:52:25.869209 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 17 17:52:25.881567 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 17 17:52:25.957041 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 17 17:52:25.968464 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 17 17:52:26.028967 systemd-networkd[1118]: lo: Link UP Mar 17 17:52:26.028987 systemd-networkd[1118]: lo: Gained carrier Mar 17 17:52:26.032906 systemd-networkd[1118]: Enumeration completed Mar 17 17:52:26.034423 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 17 17:52:26.035958 systemd-networkd[1118]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 17 17:52:26.035966 systemd-networkd[1118]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 17 17:52:26.046732 systemd[1]: Reached target network.target - Network. Mar 17 17:52:26.050946 systemd-networkd[1118]: eth0: Link UP Mar 17 17:52:26.050958 systemd-networkd[1118]: eth0: Gained carrier Mar 17 17:52:26.050975 systemd-networkd[1118]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Mar 17 17:52:26.080389 systemd-networkd[1118]: eth0: DHCPv4 address 172.31.27.21/20, gateway 172.31.16.1 acquired from 172.31.16.1 Mar 17 17:52:26.351904 ignition[1050]: Ignition 2.20.0 Mar 17 17:52:26.351933 ignition[1050]: Stage: fetch-offline Mar 17 17:52:26.352393 ignition[1050]: no configs at "/usr/lib/ignition/base.d" Mar 17 17:52:26.352417 ignition[1050]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 17 17:52:26.355078 ignition[1050]: Ignition finished successfully Mar 17 17:52:26.362029 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 17 17:52:26.370414 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Mar 17 17:52:26.405917 ignition[1129]: Ignition 2.20.0 Mar 17 17:52:26.405946 ignition[1129]: Stage: fetch Mar 17 17:52:26.407546 ignition[1129]: no configs at "/usr/lib/ignition/base.d" Mar 17 17:52:26.407572 ignition[1129]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 17 17:52:26.408655 ignition[1129]: PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 17 17:52:26.430946 ignition[1129]: PUT result: OK Mar 17 17:52:26.438843 ignition[1129]: parsed url from cmdline: "" Mar 17 17:52:26.438865 ignition[1129]: no config URL provided Mar 17 17:52:26.438880 ignition[1129]: reading system config file "/usr/lib/ignition/user.ign" Mar 17 17:52:26.438906 ignition[1129]: no config at "/usr/lib/ignition/user.ign" Mar 17 17:52:26.438940 ignition[1129]: PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 17 17:52:26.440610 ignition[1129]: PUT result: OK Mar 17 17:52:26.442596 ignition[1129]: GET http://169.254.169.254/2019-10-01/user-data: attempt #1 Mar 17 17:52:26.446646 ignition[1129]: GET result: OK Mar 17 17:52:26.453798 unknown[1129]: fetched base config from "system" Mar 17 17:52:26.447527 ignition[1129]: parsing config with SHA512: 8c4680920b327315c99fdaec05f8e763700618fbb6ee02d764b46da77490897f6dbc2ab7fedd154b5415736127885e9e5a79ff1b2eb6466cc006853f48e54c5b Mar 17 17:52:26.453815 unknown[1129]: fetched base config from "system" Mar 17 17:52:26.454258 ignition[1129]: fetch: fetch complete Mar 17 17:52:26.453828 unknown[1129]: fetched user config from "aws" Mar 17 17:52:26.454270 ignition[1129]: fetch: fetch passed Mar 17 17:52:26.454351 ignition[1129]: Ignition finished successfully Mar 17 17:52:26.467187 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Mar 17 17:52:26.476473 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 17 17:52:26.510240 ignition[1135]: Ignition 2.20.0 Mar 17 17:52:26.510269 ignition[1135]: Stage: kargs Mar 17 17:52:26.511870 ignition[1135]: no configs at "/usr/lib/ignition/base.d" Mar 17 17:52:26.511925 ignition[1135]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 17 17:52:26.512979 ignition[1135]: PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 17 17:52:26.519134 ignition[1135]: PUT result: OK Mar 17 17:52:26.523995 ignition[1135]: kargs: kargs passed Mar 17 17:52:26.524094 ignition[1135]: Ignition finished successfully Mar 17 17:52:26.529578 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 17 17:52:26.537513 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
Mar 17 17:52:26.567964 ignition[1141]: Ignition 2.20.0 Mar 17 17:52:26.567998 ignition[1141]: Stage: disks Mar 17 17:52:26.569613 ignition[1141]: no configs at "/usr/lib/ignition/base.d" Mar 17 17:52:26.569639 ignition[1141]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 17 17:52:26.570576 ignition[1141]: PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 17 17:52:26.574697 ignition[1141]: PUT result: OK Mar 17 17:52:26.580602 ignition[1141]: disks: disks passed Mar 17 17:52:26.580695 ignition[1141]: Ignition finished successfully Mar 17 17:52:26.586243 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 17 17:52:26.591098 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 17 17:52:26.595268 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 17 17:52:26.597574 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 17 17:52:26.601474 systemd[1]: Reached target sysinit.target - System Initialization. Mar 17 17:52:26.605082 systemd[1]: Reached target basic.target - Basic System. Mar 17 17:52:26.620464 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Mar 17 17:52:26.660989 systemd-fsck[1149]: ROOT: clean, 14/553520 files, 52654/553472 blocks Mar 17 17:52:26.668681 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 17 17:52:26.685475 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 17 17:52:26.766458 kernel: EXT4-fs (nvme0n1p9): mounted filesystem 3914ef65-c5cd-468c-8ee7-964383d8e9e2 r/w with ordered data mode. Quota mode: none. Mar 17 17:52:26.767860 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 17 17:52:26.772348 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 17 17:52:26.811312 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 17 17:52:26.817375 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 17 17:52:26.822641 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Mar 17 17:52:26.826204 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 17 17:52:26.826275 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 17 17:52:26.846749 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/nvme0n1p6 scanned by mount (1168) Mar 17 17:52:26.850436 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 17 17:52:26.857593 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 8369c249-c0a6-415d-8511-1f18dbf3bf45 Mar 17 17:52:26.857639 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm Mar 17 17:52:26.857665 kernel: BTRFS info (device nvme0n1p6): using free space tree Mar 17 17:52:26.866478 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Mar 17 17:52:26.885189 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Mar 17 17:52:26.888758 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 17 17:52:27.322259 initrd-setup-root[1192]: cut: /sysroot/etc/passwd: No such file or directory Mar 17 17:52:27.331211 initrd-setup-root[1199]: cut: /sysroot/etc/group: No such file or directory Mar 17 17:52:27.340078 initrd-setup-root[1206]: cut: /sysroot/etc/shadow: No such file or directory Mar 17 17:52:27.349135 initrd-setup-root[1213]: cut: /sysroot/etc/gshadow: No such file or directory Mar 17 17:52:27.724278 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 17 17:52:27.734344 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 17 17:52:27.745523 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 17 17:52:27.762199 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 17 17:52:27.766190 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 8369c249-c0a6-415d-8511-1f18dbf3bf45 Mar 17 17:52:27.796317 systemd-networkd[1118]: eth0: Gained IPv6LL Mar 17 17:52:27.796466 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Mar 17 17:52:27.810117 ignition[1281]: INFO : Ignition 2.20.0 Mar 17 17:52:27.812795 ignition[1281]: INFO : Stage: mount Mar 17 17:52:27.812795 ignition[1281]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 17 17:52:27.812795 ignition[1281]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 17 17:52:27.812795 ignition[1281]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 17 17:52:27.820827 ignition[1281]: INFO : PUT result: OK Mar 17 17:52:27.824845 ignition[1281]: INFO : mount: mount passed Mar 17 17:52:27.827329 ignition[1281]: INFO : Ignition finished successfully Mar 17 17:52:27.831218 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 17 17:52:27.842448 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 17 17:52:27.858562 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 17 17:52:27.883470 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/nvme0n1p6 scanned by mount (1292) Mar 17 17:52:27.883532 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 8369c249-c0a6-415d-8511-1f18dbf3bf45 Mar 17 17:52:27.886400 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm Mar 17 17:52:27.886435 kernel: BTRFS info (device nvme0n1p6): using free space tree Mar 17 17:52:27.892189 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Mar 17 17:52:27.896334 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 17 17:52:27.930693 ignition[1309]: INFO : Ignition 2.20.0 Mar 17 17:52:27.930693 ignition[1309]: INFO : Stage: files Mar 17 17:52:27.934025 ignition[1309]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 17 17:52:27.934025 ignition[1309]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 17 17:52:27.934025 ignition[1309]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 17 17:52:27.940882 ignition[1309]: INFO : PUT result: OK Mar 17 17:52:27.948989 ignition[1309]: DEBUG : files: compiled without relabeling support, skipping Mar 17 17:52:27.971122 ignition[1309]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 17 17:52:27.971122 ignition[1309]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 17 17:52:28.020908 ignition[1309]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 17 17:52:28.023713 ignition[1309]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 17 17:52:28.026711 unknown[1309]: wrote ssh authorized keys file for user: core Mar 17 17:52:28.028811 ignition[1309]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 17 17:52:28.040720 ignition[1309]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/home/core/install.sh" Mar 17 17:52:28.046722 ignition[1309]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/home/core/install.sh" Mar 17 17:52:28.046722 ignition[1309]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 17 17:52:28.046722 ignition[1309]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 17 17:52:28.046722 ignition[1309]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-arm64.raw" Mar 17 17:52:28.046722 ignition[1309]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-arm64.raw" Mar 17 17:52:28.046722 ignition[1309]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-arm64.raw" Mar 17 17:52:28.046722 ignition[1309]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.32.0-arm64.raw: attempt #1 Mar 17 17:52:28.408414 ignition[1309]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET result: OK Mar 17 17:52:28.796816 ignition[1309]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-arm64.raw" Mar 17 17:52:28.800779 ignition[1309]: INFO : files: createResultFile: createFiles: op(7): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 17 17:52:28.800779 ignition[1309]: INFO : files: createResultFile: createFiles: op(7): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 17 17:52:28.800779 ignition[1309]: INFO : files: files passed Mar 17 17:52:28.800779 ignition[1309]: INFO : Ignition finished successfully Mar 17 17:52:28.812417 systemd[1]: Finished ignition-files.service - Ignition (files). 
Mar 17 17:52:28.831519 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 17 17:52:28.839431 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Mar 17 17:52:28.852646 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 17 17:52:28.852830 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 17 17:52:28.875933 initrd-setup-root-after-ignition[1337]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 17 17:52:28.875933 initrd-setup-root-after-ignition[1337]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 17 17:52:28.884212 initrd-setup-root-after-ignition[1341]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 17 17:52:28.891214 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 17 17:52:28.895389 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 17 17:52:28.913527 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 17 17:52:28.970493 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 17 17:52:28.972324 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 17 17:52:28.975400 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 17 17:52:28.979653 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 17 17:52:28.981659 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 17 17:52:29.007529 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 17 17:52:29.034535 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 17 17:52:29.048907 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 17 17:52:29.071408 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 17 17:52:29.075739 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 17 17:52:29.078133 systemd[1]: Stopped target timers.target - Timer Units. Mar 17 17:52:29.080002 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 17 17:52:29.080272 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 17 17:52:29.082955 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 17 17:52:29.085050 systemd[1]: Stopped target basic.target - Basic System. Mar 17 17:52:29.086913 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 17 17:52:29.089117 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 17 17:52:29.091471 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 17 17:52:29.093756 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 17 17:52:29.095884 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 17 17:52:29.098357 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 17 17:52:29.100511 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 17 17:52:29.102547 systemd[1]: Stopped target swap.target - Swaps. Mar 17 17:52:29.104233 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 17 17:52:29.104458 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. 
Mar 17 17:52:29.106870 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 17 17:52:29.108997 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 17 17:52:29.111312 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 17 17:52:29.111544 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 17 17:52:29.114123 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 17 17:52:29.114369 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 17 17:52:29.117138 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 17 17:52:29.117399 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 17 17:52:29.128472 systemd[1]: ignition-files.service: Deactivated successfully. Mar 17 17:52:29.128681 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 17 17:52:29.151677 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 17 17:52:29.168940 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 17 17:52:29.181936 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 17 17:52:29.182537 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 17 17:52:29.189595 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 17 17:52:29.189830 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 17 17:52:29.210584 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 17 17:52:29.213232 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 17 17:52:29.234478 ignition[1361]: INFO : Ignition 2.20.0 Mar 17 17:52:29.234478 ignition[1361]: INFO : Stage: umount Mar 17 17:52:29.241481 ignition[1361]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 17 17:52:29.241481 ignition[1361]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 17 17:52:29.241481 ignition[1361]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 17 17:52:29.241481 ignition[1361]: INFO : PUT result: OK Mar 17 17:52:29.239948 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 17 17:52:29.263294 ignition[1361]: INFO : umount: umount passed Mar 17 17:52:29.263294 ignition[1361]: INFO : Ignition finished successfully Mar 17 17:52:29.252588 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 17 17:52:29.254326 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 17 17:52:29.270338 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 17 17:52:29.270567 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 17 17:52:29.277038 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 17 17:52:29.277135 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 17 17:52:29.279188 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 17 17:52:29.279313 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 17 17:52:29.282851 systemd[1]: ignition-fetch.service: Deactivated successfully. Mar 17 17:52:29.282955 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Mar 17 17:52:29.284867 systemd[1]: Stopped target network.target - Network. Mar 17 17:52:29.286458 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 17 17:52:29.286541 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). 
Mar 17 17:52:29.288755 systemd[1]: Stopped target paths.target - Path Units. Mar 17 17:52:29.290417 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 17 17:52:29.295776 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 17 17:52:29.295886 systemd[1]: Stopped target slices.target - Slice Units. Mar 17 17:52:29.301423 systemd[1]: Stopped target sockets.target - Socket Units. Mar 17 17:52:29.303302 systemd[1]: iscsid.socket: Deactivated successfully. Mar 17 17:52:29.303382 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 17 17:52:29.305263 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 17 17:52:29.305328 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 17 17:52:29.307290 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 17 17:52:29.307379 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 17 17:52:29.309296 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 17 17:52:29.309372 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 17 17:52:29.311940 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 17 17:52:29.312018 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 17 17:52:29.348063 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 17 17:52:29.350205 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 17 17:52:29.359342 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 17 17:52:29.359561 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 17 17:52:29.370457 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Mar 17 17:52:29.374547 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 17 17:52:29.374784 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 17 17:52:29.382984 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Mar 17 17:52:29.383427 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 17 17:52:29.383498 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 17 17:52:29.399105 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 17 17:52:29.399592 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 17 17:52:29.400925 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 17 17:52:29.401094 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 17 17:52:29.403240 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 17 17:52:29.405308 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 17 17:52:29.405408 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 17 17:52:29.405544 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 17 17:52:29.405618 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 17 17:52:29.428369 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 17 17:52:29.437294 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 17 17:52:29.437556 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. 
Mar 17 17:52:29.457102 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 17 17:52:29.457571 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 17 17:52:29.463558 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 17 17:52:29.464711 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 17 17:52:29.473124 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 17 17:52:29.473301 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 17 17:52:29.477265 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 17 17:52:29.481276 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 17 17:52:29.483303 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 17 17:52:29.483395 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 17 17:52:29.485524 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 17 17:52:29.485603 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 17 17:52:29.487797 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 17 17:52:29.487880 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 17 17:52:29.507438 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 17 17:52:29.511950 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 17 17:52:29.512072 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 17 17:52:29.515293 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Mar 17 17:52:29.515384 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 17 17:52:29.524946 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 17 17:52:29.525047 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 17 17:52:29.534408 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 17 17:52:29.534613 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:52:29.543109 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Mar 17 17:52:29.543276 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 17 17:52:29.557942 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 17 17:52:29.558338 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 17 17:52:29.565275 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 17 17:52:29.580482 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 17 17:52:29.597221 systemd[1]: Switching root. Mar 17 17:52:29.634686 systemd-journald[251]: Journal stopped Mar 17 17:52:32.342252 systemd-journald[251]: Received SIGTERM from PID 1 (systemd). 
Mar 17 17:52:32.342381 kernel: SELinux: policy capability network_peer_controls=1 Mar 17 17:52:32.342431 kernel: SELinux: policy capability open_perms=1 Mar 17 17:52:32.342471 kernel: SELinux: policy capability extended_socket_class=1 Mar 17 17:52:32.342500 kernel: SELinux: policy capability always_check_network=0 Mar 17 17:52:32.342529 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 17 17:52:32.342559 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 17 17:52:32.342594 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 17 17:52:32.342623 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 17 17:52:32.342671 kernel: audit: type=1403 audit(1742233950.168:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 17 17:52:32.342705 systemd[1]: Successfully loaded SELinux policy in 89.296ms. Mar 17 17:52:32.342743 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 24.118ms. Mar 17 17:52:32.342775 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 17 17:52:32.342807 systemd[1]: Detected virtualization amazon. Mar 17 17:52:32.342838 systemd[1]: Detected architecture arm64. Mar 17 17:52:32.342885 systemd[1]: Detected first boot. Mar 17 17:52:32.342925 systemd[1]: Initializing machine ID from VM UUID. Mar 17 17:52:32.342956 zram_generator::config[1406]: No configuration found. Mar 17 17:52:32.342990 kernel: NET: Registered PF_VSOCK protocol family Mar 17 17:52:32.343019 systemd[1]: Populated /etc with preset unit settings. Mar 17 17:52:32.343051 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Mar 17 17:52:32.343081 systemd[1]: initrd-switch-root.service: Deactivated successfully. Mar 17 17:52:32.343111 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Mar 17 17:52:32.343144 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 17 17:52:32.343212 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 17 17:52:32.343248 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 17 17:52:32.343279 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 17 17:52:32.343307 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 17 17:52:32.343338 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 17 17:52:32.343368 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Mar 17 17:52:32.343397 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 17 17:52:32.343427 systemd[1]: Created slice user.slice - User and Session Slice. Mar 17 17:52:32.343458 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 17 17:52:32.343492 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 17 17:52:32.343521 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Mar 17 17:52:32.343564 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. 
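systemd detects virtualization "amazon", the arm64 architecture, and a first boot, seeding the machine ID from the VM UUID. The same facts can be read back on the running system; a minimal sketch:

# Report the detected hypervisor ("amazon" on EC2 instances).
systemd-detect-virt
# Machine ID initialized from the VM UUID on first boot.
cat /etc/machine-id
# Architecture, virtualization and hostname in one view.
hostnamectl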
Mar 17 17:52:32.343597 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Mar 17 17:52:32.343628 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 17 17:52:32.343658 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Mar 17 17:52:32.343689 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 17 17:52:32.343719 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Mar 17 17:52:32.343754 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 17 17:52:32.343783 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 17 17:52:32.343814 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 17 17:52:32.343843 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 17 17:52:32.343875 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 17 17:52:32.343904 systemd[1]: Reached target slices.target - Slice Units. Mar 17 17:52:32.343934 systemd[1]: Reached target swap.target - Swaps. Mar 17 17:52:32.343963 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 17 17:52:32.343994 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 17 17:52:32.344029 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Mar 17 17:52:32.344058 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 17 17:52:32.344089 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 17 17:52:32.344117 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 17 17:52:32.344146 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 17 17:52:32.344235 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 17 17:52:32.344270 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 17 17:52:32.344299 systemd[1]: Mounting media.mount - External Media Directory... Mar 17 17:52:32.344328 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 17 17:52:32.344363 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 17 17:52:32.344394 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Mar 17 17:52:32.344427 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 17 17:52:32.344456 systemd[1]: Reached target machines.target - Containers. Mar 17 17:52:32.344486 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Mar 17 17:52:32.344517 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 17 17:52:32.344545 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 17 17:52:32.344574 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 17 17:52:32.344609 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 17 17:52:32.344638 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 17 17:52:32.344667 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... 
Mar 17 17:52:32.344702 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Mar 17 17:52:32.344733 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 17 17:52:32.344763 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 17 17:52:32.344794 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Mar 17 17:52:32.344823 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Mar 17 17:52:32.344856 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Mar 17 17:52:32.344886 systemd[1]: Stopped systemd-fsck-usr.service. Mar 17 17:52:32.344917 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 17 17:52:32.344946 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 17 17:52:32.344974 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 17 17:52:32.345003 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 17 17:52:32.345031 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Mar 17 17:52:32.345058 kernel: loop: module loaded Mar 17 17:52:32.345087 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Mar 17 17:52:32.345123 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 17 17:52:32.345184 systemd[1]: verity-setup.service: Deactivated successfully. Mar 17 17:52:32.345220 systemd[1]: Stopped verity-setup.service. Mar 17 17:52:32.345249 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 17 17:52:32.345278 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 17 17:52:32.345312 systemd[1]: Mounted media.mount - External Media Directory. Mar 17 17:52:32.345342 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Mar 17 17:52:32.345371 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 17 17:52:32.345404 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Mar 17 17:52:32.345433 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 17 17:52:32.345468 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 17 17:52:32.345497 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 17 17:52:32.345526 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 17:52:32.345555 kernel: fuse: init (API version 7.39) Mar 17 17:52:32.345583 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 17 17:52:32.345614 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 17:52:32.345642 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 17 17:52:32.345673 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 17 17:52:32.345702 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 17 17:52:32.345735 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 17:52:32.345768 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. 
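The modprobe@configfs/dm_mod/drm/efi_pstore/fuse/loop services above are instances of systemd's modprobe@.service template, which loads the kernel module named by the instance suffix. A small sketch of driving the same template by hand:

# Show the template the instances above are generated from.
systemctl cat modprobe@.service
# Start one of the instances above by hand (roughly equivalent to `modprobe fuse`).
sudo systemctl start modprobe@fuse.service
# Confirm the module is loaded.
lsmod | grep -w fuse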
Mar 17 17:52:32.345795 kernel: ACPI: bus type drm_connector registered Mar 17 17:52:32.345823 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 17 17:52:32.345851 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 17 17:52:32.345883 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 17 17:52:32.345923 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Mar 17 17:52:32.345953 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Mar 17 17:52:32.345981 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Mar 17 17:52:32.346015 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 17 17:52:32.346046 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 17 17:52:32.346076 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Mar 17 17:52:32.346105 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 17 17:52:32.346134 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Mar 17 17:52:32.346194 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 17 17:52:32.346269 systemd-journald[1485]: Collecting audit messages is disabled. Mar 17 17:52:32.346322 systemd-journald[1485]: Journal started Mar 17 17:52:32.346371 systemd-journald[1485]: Runtime Journal (/run/log/journal/ec26bd0039ed3c383c98c74e154879c6) is 8M, max 75.3M, 67.3M free. Mar 17 17:52:31.671580 systemd[1]: Queued start job for default target multi-user.target. Mar 17 17:52:31.682453 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6. Mar 17 17:52:31.683336 systemd[1]: systemd-journald.service: Deactivated successfully. Mar 17 17:52:32.357319 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Mar 17 17:52:32.362214 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 17 17:52:32.373319 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Mar 17 17:52:32.373408 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 17 17:52:32.394186 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 17 17:52:32.400342 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 17 17:52:32.411370 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 17 17:52:32.417095 systemd[1]: Started systemd-journald.service - Journal Service. Mar 17 17:52:32.422390 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 17 17:52:32.426632 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 17 17:52:32.429805 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Mar 17 17:52:32.435170 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 17 17:52:32.437672 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. 
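systemd-journald starts with an 8M runtime journal under /run/log/journal; once it is flushed to persistent storage (below), usage can be inspected and capped, and the initrd messages above remain queryable. For example:

# Space used by runtime and persistent journals.
journalctl --disk-usage
# The Ignition messages from the initrd, now readable from the journal of this boot.
journalctl -b -t ignition
# Trim the persistent journal to a chosen cap (example value).
sudo journalctl --vacuum-size=200M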
Mar 17 17:52:32.442873 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 17 17:52:32.458049 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Mar 17 17:52:32.519862 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Mar 17 17:52:32.522346 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 17 17:52:32.531445 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Mar 17 17:52:32.544196 kernel: loop0: detected capacity change from 0 to 113512 Mar 17 17:52:32.558247 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Mar 17 17:52:32.561917 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 17 17:52:32.597844 systemd-journald[1485]: Time spent on flushing to /var/log/journal/ec26bd0039ed3c383c98c74e154879c6 is 73.456ms for 909 entries. Mar 17 17:52:32.597844 systemd-journald[1485]: System Journal (/var/log/journal/ec26bd0039ed3c383c98c74e154879c6) is 8M, max 195.6M, 187.6M free. Mar 17 17:52:32.686834 systemd-journald[1485]: Received client request to flush runtime journal. Mar 17 17:52:32.607276 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Mar 17 17:52:32.629403 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 17 17:52:32.641586 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Mar 17 17:52:32.653002 systemd-tmpfiles[1523]: ACLs are not supported, ignoring. Mar 17 17:52:32.653026 systemd-tmpfiles[1523]: ACLs are not supported, ignoring. Mar 17 17:52:32.680281 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 17 17:52:32.700500 systemd[1]: Starting systemd-sysusers.service - Create System Users... Mar 17 17:52:32.705948 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 17 17:52:32.707217 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 17 17:52:32.707828 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 17 17:52:32.720549 udevadm[1554]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Mar 17 17:52:32.739579 kernel: loop1: detected capacity change from 0 to 201592 Mar 17 17:52:32.795669 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 17 17:52:32.807741 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 17 17:52:32.857057 systemd-tmpfiles[1563]: ACLs are not supported, ignoring. Mar 17 17:52:32.857589 systemd-tmpfiles[1563]: ACLs are not supported, ignoring. Mar 17 17:52:32.868799 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 17 17:52:33.057206 kernel: loop2: detected capacity change from 0 to 53784 Mar 17 17:52:33.191207 kernel: loop3: detected capacity change from 0 to 123192 Mar 17 17:52:33.325271 kernel: loop4: detected capacity change from 0 to 113512 Mar 17 17:52:33.347276 kernel: loop5: detected capacity change from 0 to 201592 Mar 17 17:52:33.386208 kernel: loop6: detected capacity change from 0 to 53784 Mar 17 17:52:33.400213 kernel: loop7: detected capacity change from 0 to 123192 Mar 17 17:52:33.420573 (sd-merge)[1569]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-ami'. 
Mar 17 17:52:33.422324 (sd-merge)[1569]: Merged extensions into '/usr'. Mar 17 17:52:33.444395 systemd[1]: Reload requested from client PID 1522 ('systemd-sysext') (unit systemd-sysext.service)... Mar 17 17:52:33.444947 systemd[1]: Reloading... Mar 17 17:52:33.645213 zram_generator::config[1597]: No configuration found. Mar 17 17:52:33.902529 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 17:52:34.049236 systemd[1]: Reloading finished in 602 ms. Mar 17 17:52:34.073259 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 17 17:52:34.076608 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 17 17:52:34.092559 systemd[1]: Starting ensure-sysext.service... Mar 17 17:52:34.098499 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 17 17:52:34.108947 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 17 17:52:34.145287 systemd[1]: Reload requested from client PID 1649 ('systemctl') (unit ensure-sysext.service)... Mar 17 17:52:34.145315 systemd[1]: Reloading... Mar 17 17:52:34.164585 systemd-tmpfiles[1650]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 17 17:52:34.165663 systemd-tmpfiles[1650]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 17 17:52:34.167807 systemd-tmpfiles[1650]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 17 17:52:34.168563 systemd-tmpfiles[1650]: ACLs are not supported, ignoring. Mar 17 17:52:34.168805 systemd-tmpfiles[1650]: ACLs are not supported, ignoring. Mar 17 17:52:34.180317 systemd-tmpfiles[1650]: Detected autofs mount point /boot during canonicalization of boot. Mar 17 17:52:34.180673 systemd-tmpfiles[1650]: Skipping /boot Mar 17 17:52:34.206596 systemd-tmpfiles[1650]: Detected autofs mount point /boot during canonicalization of boot. Mar 17 17:52:34.206779 systemd-tmpfiles[1650]: Skipping /boot Mar 17 17:52:34.275054 systemd-udevd[1651]: Using default interface naming scheme 'v255'. Mar 17 17:52:34.340195 ldconfig[1518]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 17 17:52:34.361212 zram_generator::config[1684]: No configuration found. Mar 17 17:52:34.567400 (udev-worker)[1723]: Network interface NamePolicy= disabled on kernel command line. Mar 17 17:52:34.720272 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 17:52:34.837186 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 39 scanned by (udev-worker) (1702) Mar 17 17:52:34.900951 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Mar 17 17:52:34.901960 systemd[1]: Reloading finished in 756 ms. Mar 17 17:52:34.919949 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 17 17:52:34.924733 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 17 17:52:34.929411 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. 
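systemd-sysext merges the containerd-flatcar, docker-flatcar, kubernetes and oem-ami images into /usr, after which the manager reloads. The merge state can be listed and re-applied after dropping in a new image; for example:

# Which extension images are merged and the hierarchies they cover.
systemd-sysext status
# Re-scan /etc/extensions and /var/lib/extensions and apply any changes.
sudo systemd-sysext refresh
# Revert to the pristine /usr (useful when debugging an extension).
sudo systemd-sysext unmerge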
Mar 17 17:52:35.011252 systemd[1]: Finished ensure-sysext.service. Mar 17 17:52:35.041669 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 17 17:52:35.051355 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Mar 17 17:52:35.053825 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 17 17:52:35.057764 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 17 17:52:35.062202 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 17 17:52:35.070457 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 17 17:52:35.077498 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 17 17:52:35.079699 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 17 17:52:35.079781 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 17 17:52:35.084316 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 17 17:52:35.092546 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 17 17:52:35.101396 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 17 17:52:35.103637 systemd[1]: Reached target time-set.target - System Time Set. Mar 17 17:52:35.116553 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Mar 17 17:52:35.124209 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 17 17:52:35.177470 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Mar 17 17:52:35.243964 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 17 17:52:35.247274 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 17 17:52:35.268069 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 17:52:35.269279 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 17 17:52:35.272126 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 17:52:35.280612 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 17 17:52:35.295802 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 17:52:35.297340 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 17 17:52:35.303366 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 17 17:52:35.313695 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 17 17:52:35.316240 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 17 17:52:35.327778 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 17 17:52:35.345121 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. Mar 17 17:52:35.353206 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Mar 17 17:52:35.372558 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... 
Mar 17 17:52:35.388451 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Mar 17 17:52:35.394604 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 17 17:52:35.399442 systemd[1]: Finished systemd-update-done.service - Update is Completed. Mar 17 17:52:35.412565 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Mar 17 17:52:35.421943 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 17 17:52:35.456280 augenrules[1896]: No rules Mar 17 17:52:35.460752 systemd[1]: audit-rules.service: Deactivated successfully. Mar 17 17:52:35.461404 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 17 17:52:35.463794 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 17 17:52:35.486203 lvm[1885]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 17 17:52:35.487459 systemd[1]: Started systemd-userdbd.service - User Database Manager. Mar 17 17:52:35.520904 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:52:35.544826 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Mar 17 17:52:35.548575 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 17 17:52:35.558513 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Mar 17 17:52:35.586198 lvm[1913]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 17 17:52:35.634297 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Mar 17 17:52:35.659073 systemd-networkd[1855]: lo: Link UP Mar 17 17:52:35.659094 systemd-networkd[1855]: lo: Gained carrier Mar 17 17:52:35.662096 systemd-networkd[1855]: Enumeration completed Mar 17 17:52:35.662815 systemd-networkd[1855]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 17 17:52:35.662823 systemd-networkd[1855]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 17 17:52:35.663696 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 17 17:52:35.668927 systemd-networkd[1855]: eth0: Link UP Mar 17 17:52:35.673440 systemd-networkd[1855]: eth0: Gained carrier Mar 17 17:52:35.673492 systemd-networkd[1855]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 17 17:52:35.675464 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Mar 17 17:52:35.684470 systemd-networkd[1855]: eth0: DHCPv4 address 172.31.27.21/20, gateway 172.31.16.1 acquired from 172.31.16.1 Mar 17 17:52:35.688477 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 17 17:52:35.710048 systemd-resolved[1857]: Positive Trust Anchors: Mar 17 17:52:35.710090 systemd-resolved[1857]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 17 17:52:35.710189 systemd-resolved[1857]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 17 17:52:35.721257 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Mar 17 17:52:35.728652 systemd-resolved[1857]: Defaulting to hostname 'linux'. Mar 17 17:52:35.732365 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 17 17:52:35.734666 systemd[1]: Reached target network.target - Network. Mar 17 17:52:35.736396 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 17 17:52:35.738580 systemd[1]: Reached target sysinit.target - System Initialization. Mar 17 17:52:35.740701 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 17 17:52:35.743022 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 17 17:52:35.745605 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 17 17:52:35.747842 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 17 17:52:35.750847 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 17 17:52:35.753186 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 17 17:52:35.753239 systemd[1]: Reached target paths.target - Path Units. Mar 17 17:52:35.754933 systemd[1]: Reached target timers.target - Timer Units. Mar 17 17:52:35.758576 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 17 17:52:35.763661 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 17 17:52:35.770411 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Mar 17 17:52:35.774063 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 17 17:52:35.777423 systemd[1]: Reached target ssh-access.target - SSH Access Available. Mar 17 17:52:35.783361 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 17 17:52:35.786102 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Mar 17 17:52:35.789513 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 17 17:52:35.791757 systemd[1]: Reached target sockets.target - Socket Units. Mar 17 17:52:35.793644 systemd[1]: Reached target basic.target - Basic System. Mar 17 17:52:35.795519 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 17 17:52:35.795579 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 17 17:52:35.806418 systemd[1]: Starting containerd.service - containerd container runtime... 
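eth0 comes up via the stock zz-default.network DHCP fallback, acquiring 172.31.27.21/20 from 172.31.16.1, and systemd-resolved loads the root DNSSEC trust anchor before defaulting the hostname to 'linux'. The resulting state can be inspected, or the fallback overridden by a more specific unit in /etc/systemd/network; a hedged sketch (the file name is illustrative):

# Link, address and DHCP lease state for eth0.
networkctl status eth0
# Per-link DNS servers, search domains and DNSSEC state.
resolvectl status
# Hypothetical override pinning eth0 to DHCP explicitly; its lower-sorting name matches before the shipped zz-default.network.
sudo tee /etc/systemd/network/10-eth0.network >/dev/null <<'EOF'
[Match]
Name=eth0

[Network]
DHCP=yes
EOF
sudo networkctl reload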
Mar 17 17:52:35.811380 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 17 17:52:35.817650 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 17 17:52:35.824625 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 17 17:52:35.839498 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 17 17:52:35.841468 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 17 17:52:35.845242 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 17 17:52:35.852495 systemd[1]: Started ntpd.service - Network Time Service. Mar 17 17:52:35.860395 systemd[1]: Starting setup-oem.service - Setup OEM... Mar 17 17:52:35.863345 jq[1924]: false Mar 17 17:52:35.874427 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 17 17:52:35.881497 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 17 17:52:35.892601 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 17 17:52:35.897618 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 17 17:52:35.898540 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 17 17:52:35.902903 systemd[1]: Starting update-engine.service - Update Engine... Mar 17 17:52:35.910074 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 17 17:52:35.918970 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 17 17:52:35.920542 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 17 17:52:35.945681 dbus-daemon[1923]: [system] SELinux support is enabled Mar 17 17:52:35.945983 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 17 17:52:35.952984 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 17 17:52:35.953051 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 17 17:52:35.955635 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 17 17:52:35.955671 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. 
Mar 17 17:52:35.989186 extend-filesystems[1925]: Found loop4 Mar 17 17:52:35.989186 extend-filesystems[1925]: Found loop5 Mar 17 17:52:35.989186 extend-filesystems[1925]: Found loop6 Mar 17 17:52:35.989186 extend-filesystems[1925]: Found loop7 Mar 17 17:52:35.989186 extend-filesystems[1925]: Found nvme0n1 Mar 17 17:52:35.989186 extend-filesystems[1925]: Found nvme0n1p1 Mar 17 17:52:35.989186 extend-filesystems[1925]: Found nvme0n1p2 Mar 17 17:52:35.989186 extend-filesystems[1925]: Found nvme0n1p3 Mar 17 17:52:35.989186 extend-filesystems[1925]: Found usr Mar 17 17:52:35.989186 extend-filesystems[1925]: Found nvme0n1p4 Mar 17 17:52:35.989186 extend-filesystems[1925]: Found nvme0n1p6 Mar 17 17:52:35.989186 extend-filesystems[1925]: Found nvme0n1p7 Mar 17 17:52:35.989186 extend-filesystems[1925]: Found nvme0n1p9 Mar 17 17:52:35.988660 systemd[1]: motdgen.service: Deactivated successfully. Mar 17 17:52:36.030175 extend-filesystems[1925]: Checking size of /dev/nvme0n1p9 Mar 17 17:52:36.025471 dbus-daemon[1923]: [system] Activating via systemd: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.1' (uid=244 pid=1855 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Mar 17 17:52:35.989092 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 17 17:52:36.036405 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 17 17:52:36.037984 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 17 17:52:36.043478 (ntainerd)[1951]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 17 17:52:36.064902 jq[1935]: true Mar 17 17:52:36.075319 extend-filesystems[1925]: Resized partition /dev/nvme0n1p9 Mar 17 17:52:36.078343 extend-filesystems[1956]: resize2fs 1.47.1 (20-May-2024) Mar 17 17:52:36.080510 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Mar 17 17:52:36.096208 update_engine[1934]: I20250317 17:52:36.096029 1934 main.cc:92] Flatcar Update Engine starting Mar 17 17:52:36.113690 kernel: EXT4-fs (nvme0n1p9): resizing filesystem from 553472 to 1489915 blocks Mar 17 17:52:36.111809 systemd[1]: Started update-engine.service - Update Engine. Mar 17 17:52:36.114291 update_engine[1934]: I20250317 17:52:36.114219 1934 update_check_scheduler.cc:74] Next update check in 9m25s Mar 17 17:52:36.128556 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 17 17:52:36.172917 ntpd[1927]: ntpd 4.2.8p17@1.4004-o Mon Mar 17 15:34:16 UTC 2025 (1): Starting Mar 17 17:52:36.175558 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: ntpd 4.2.8p17@1.4004-o Mon Mar 17 15:34:16 UTC 2025 (1): Starting Mar 17 17:52:36.175558 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Mar 17 17:52:36.175558 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: ---------------------------------------------------- Mar 17 17:52:36.175558 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: ntp-4 is maintained by Network Time Foundation, Mar 17 17:52:36.175558 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Mar 17 17:52:36.175558 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: corporation. 
Support and training for ntp-4 are Mar 17 17:52:36.175558 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: available at https://www.nwtime.org/support Mar 17 17:52:36.175558 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: ---------------------------------------------------- Mar 17 17:52:36.172981 ntpd[1927]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Mar 17 17:52:36.173002 ntpd[1927]: ---------------------------------------------------- Mar 17 17:52:36.173020 ntpd[1927]: ntp-4 is maintained by Network Time Foundation, Mar 17 17:52:36.173038 ntpd[1927]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Mar 17 17:52:36.173055 ntpd[1927]: corporation. Support and training for ntp-4 are Mar 17 17:52:36.173072 ntpd[1927]: available at https://www.nwtime.org/support Mar 17 17:52:36.173089 ntpd[1927]: ---------------------------------------------------- Mar 17 17:52:36.189382 ntpd[1927]: proto: precision = 0.096 usec (-23) Mar 17 17:52:36.192331 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: proto: precision = 0.096 usec (-23) Mar 17 17:52:36.201469 ntpd[1927]: basedate set to 2025-03-05 Mar 17 17:52:36.201516 ntpd[1927]: gps base set to 2025-03-09 (week 2357) Mar 17 17:52:36.201683 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: basedate set to 2025-03-05 Mar 17 17:52:36.201683 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: gps base set to 2025-03-09 (week 2357) Mar 17 17:52:36.216997 ntpd[1927]: Listen and drop on 0 v6wildcard [::]:123 Mar 17 17:52:36.217104 ntpd[1927]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Mar 17 17:52:36.217280 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: Listen and drop on 0 v6wildcard [::]:123 Mar 17 17:52:36.217280 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Mar 17 17:52:36.224704 jq[1959]: true Mar 17 17:52:36.232989 ntpd[1927]: Listen normally on 2 lo 127.0.0.1:123 Mar 17 17:52:36.238326 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: Listen normally on 2 lo 127.0.0.1:123 Mar 17 17:52:36.238326 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: Listen normally on 3 eth0 172.31.27.21:123 Mar 17 17:52:36.238326 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: Listen normally on 4 lo [::1]:123 Mar 17 17:52:36.238326 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: bind(21) AF_INET6 fe80::4d4:73ff:fe1f:a36b%2#123 flags 0x11 failed: Cannot assign requested address Mar 17 17:52:36.238326 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: unable to create socket on eth0 (5) for fe80::4d4:73ff:fe1f:a36b%2#123 Mar 17 17:52:36.238326 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: failed to init interface for address fe80::4d4:73ff:fe1f:a36b%2 Mar 17 17:52:36.238326 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: Listening on routing socket on fd #21 for interface updates Mar 17 17:52:36.233076 ntpd[1927]: Listen normally on 3 eth0 172.31.27.21:123 Mar 17 17:52:36.233140 ntpd[1927]: Listen normally on 4 lo [::1]:123 Mar 17 17:52:36.233264 ntpd[1927]: bind(21) AF_INET6 fe80::4d4:73ff:fe1f:a36b%2#123 flags 0x11 failed: Cannot assign requested address Mar 17 17:52:36.233304 ntpd[1927]: unable to create socket on eth0 (5) for fe80::4d4:73ff:fe1f:a36b%2#123 Mar 17 17:52:36.233331 ntpd[1927]: failed to init interface for address fe80::4d4:73ff:fe1f:a36b%2 Mar 17 17:52:36.233383 ntpd[1927]: Listening on routing socket on fd #21 for interface updates Mar 17 17:52:36.261376 ntpd[1927]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 17 17:52:36.261605 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 17 17:52:36.261781 ntpd[1927]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 17 
17:52:36.261947 ntpd[1927]: 17 Mar 17:52:36 ntpd[1927]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 17 17:52:36.270220 kernel: EXT4-fs (nvme0n1p9): resized filesystem to 1489915 Mar 17 17:52:36.294062 extend-filesystems[1956]: Filesystem at /dev/nvme0n1p9 is mounted on /; on-line resizing required Mar 17 17:52:36.294062 extend-filesystems[1956]: old_desc_blocks = 1, new_desc_blocks = 1 Mar 17 17:52:36.294062 extend-filesystems[1956]: The filesystem on /dev/nvme0n1p9 is now 1489915 (4k) blocks long. Mar 17 17:52:36.295430 systemd[1]: Finished setup-oem.service - Setup OEM. Mar 17 17:52:36.329564 extend-filesystems[1925]: Resized filesystem in /dev/nvme0n1p9 Mar 17 17:52:36.334853 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 17 17:52:36.337253 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 17 17:52:36.369860 coreos-metadata[1922]: Mar 17 17:52:36.368 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Mar 17 17:52:36.372066 coreos-metadata[1922]: Mar 17 17:52:36.371 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-id: Attempt #1 Mar 17 17:52:36.379385 coreos-metadata[1922]: Mar 17 17:52:36.377 INFO Fetch successful Mar 17 17:52:36.379385 coreos-metadata[1922]: Mar 17 17:52:36.377 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-type: Attempt #1 Mar 17 17:52:36.379668 coreos-metadata[1922]: Mar 17 17:52:36.379 INFO Fetch successful Mar 17 17:52:36.379668 coreos-metadata[1922]: Mar 17 17:52:36.379 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/local-ipv4: Attempt #1 Mar 17 17:52:36.380457 coreos-metadata[1922]: Mar 17 17:52:36.380 INFO Fetch successful Mar 17 17:52:36.380565 coreos-metadata[1922]: Mar 17 17:52:36.380 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-ipv4: Attempt #1 Mar 17 17:52:36.381443 coreos-metadata[1922]: Mar 17 17:52:36.381 INFO Fetch successful Mar 17 17:52:36.381443 coreos-metadata[1922]: Mar 17 17:52:36.381 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/ipv6: Attempt #1 Mar 17 17:52:36.392190 coreos-metadata[1922]: Mar 17 17:52:36.390 INFO Fetch failed with 404: resource not found Mar 17 17:52:36.392190 coreos-metadata[1922]: Mar 17 17:52:36.390 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone: Attempt #1 Mar 17 17:52:36.392886 coreos-metadata[1922]: Mar 17 17:52:36.392 INFO Fetch successful Mar 17 17:52:36.392886 coreos-metadata[1922]: Mar 17 17:52:36.392 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone-id: Attempt #1 Mar 17 17:52:36.393682 coreos-metadata[1922]: Mar 17 17:52:36.393 INFO Fetch successful Mar 17 17:52:36.393682 coreos-metadata[1922]: Mar 17 17:52:36.393 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/hostname: Attempt #1 Mar 17 17:52:36.399911 coreos-metadata[1922]: Mar 17 17:52:36.399 INFO Fetch successful Mar 17 17:52:36.399911 coreos-metadata[1922]: Mar 17 17:52:36.399 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-hostname: Attempt #1 Mar 17 17:52:36.400388 coreos-metadata[1922]: Mar 17 17:52:36.400 INFO Fetch successful Mar 17 17:52:36.400461 coreos-metadata[1922]: Mar 17 17:52:36.400 INFO Fetching http://169.254.169.254/2021-01-03/dynamic/instance-identity/document: Attempt #1 Mar 17 17:52:36.401191 coreos-metadata[1922]: Mar 17 17:52:36.401 INFO Fetch successful Mar 17 17:52:36.449341 bash[2000]: Updated "/home/core/.ssh/authorized_keys" Mar 17 17:52:36.454839 systemd[1]: Finished 
update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 17 17:52:36.468559 systemd[1]: Starting sshkeys.service... Mar 17 17:52:36.490203 systemd-logind[1931]: Watching system buttons on /dev/input/event0 (Power Button) Mar 17 17:52:36.490285 systemd-logind[1931]: Watching system buttons on /dev/input/event1 (Sleep Button) Mar 17 17:52:36.491282 systemd-logind[1931]: New seat seat0. Mar 17 17:52:36.493429 systemd[1]: Started systemd-logind.service - User Login Management. Mar 17 17:52:36.515186 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 39 scanned by (udev-worker) (1722) Mar 17 17:52:36.553054 locksmithd[1965]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 17 17:52:36.558462 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Mar 17 17:52:36.571446 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Mar 17 17:52:36.612265 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 17 17:52:36.615920 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 17 17:52:36.712377 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 17 17:52:36.787942 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Mar 17 17:52:36.803396 dbus-daemon[1923]: [system] Successfully activated service 'org.freedesktop.hostname1' Mar 17 17:52:36.807273 dbus-daemon[1923]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.7' (uid=0 pid=1957 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Mar 17 17:52:36.844329 systemd[1]: Starting polkit.service - Authorization Manager... Mar 17 17:52:36.873480 polkitd[2086]: Started polkitd version 121 Mar 17 17:52:36.886571 containerd[1951]: time="2025-03-17T17:52:36.884232744Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23 Mar 17 17:52:36.892150 polkitd[2086]: Loading rules from directory /etc/polkit-1/rules.d Mar 17 17:52:36.892354 polkitd[2086]: Loading rules from directory /usr/share/polkit-1/rules.d Mar 17 17:52:36.893122 polkitd[2086]: Finished loading, compiling and executing 2 rules Mar 17 17:52:36.898375 dbus-daemon[1923]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Mar 17 17:52:36.898671 systemd[1]: Started polkit.service - Authorization Manager. Mar 17 17:52:36.903381 polkitd[2086]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Mar 17 17:52:36.954276 coreos-metadata[2040]: Mar 17 17:52:36.953 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Mar 17 17:52:36.960401 systemd-resolved[1857]: System hostname changed to 'ip-172-31-27-21'. 
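The metadata agents above and below talk to the EC2 instance metadata service with the IMDSv2 flow visible in the log: a PUT to obtain a session token, then GETs presenting that token. Reproduced by hand (paths as logged; the TTL is an example value):

# Obtain an IMDSv2 session token.
TOKEN=$(curl -sS -X PUT "http://169.254.169.254/latest/api/token" \
  -H "X-aws-ec2-metadata-token-ttl-seconds: 21600")
# Fetch the same fields the agents request, presenting the token.
curl -sS -H "X-aws-ec2-metadata-token: $TOKEN" \
  http://169.254.169.254/2021-01-03/meta-data/instance-id
curl -sS -H "X-aws-ec2-metadata-token: $TOKEN" \
  http://169.254.169.254/2021-01-03/meta-data/public-keys/0/openssh-key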
Mar 17 17:52:36.961642 coreos-metadata[2040]: Mar 17 17:52:36.957 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys: Attempt #1 Mar 17 17:52:36.960510 systemd-hostnamed[1957]: Hostname set to (transient) Mar 17 17:52:36.966194 coreos-metadata[2040]: Mar 17 17:52:36.964 INFO Fetch successful Mar 17 17:52:36.966194 coreos-metadata[2040]: Mar 17 17:52:36.964 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys/0/openssh-key: Attempt #1 Mar 17 17:52:36.966896 coreos-metadata[2040]: Mar 17 17:52:36.966 INFO Fetch successful Mar 17 17:52:36.973293 unknown[2040]: wrote ssh authorized keys file for user: core Mar 17 17:52:37.022402 update-ssh-keys[2118]: Updated "/home/core/.ssh/authorized_keys" Mar 17 17:52:37.026238 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Mar 17 17:52:37.032587 systemd[1]: Finished sshkeys.service. Mar 17 17:52:37.081927 containerd[1951]: time="2025-03-17T17:52:37.081790665Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Mar 17 17:52:37.088852 containerd[1951]: time="2025-03-17T17:52:37.088787781Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.83-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Mar 17 17:52:37.089007 containerd[1951]: time="2025-03-17T17:52:37.088979901Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Mar 17 17:52:37.089115 containerd[1951]: time="2025-03-17T17:52:37.089088573Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Mar 17 17:52:37.089545 containerd[1951]: time="2025-03-17T17:52:37.089513505Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Mar 17 17:52:37.089658 containerd[1951]: time="2025-03-17T17:52:37.089631489Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Mar 17 17:52:37.089857 containerd[1951]: time="2025-03-17T17:52:37.089826093Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 17:52:37.089953 containerd[1951]: time="2025-03-17T17:52:37.089927793Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Mar 17 17:52:37.090408 containerd[1951]: time="2025-03-17T17:52:37.090373173Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 17:52:37.090517 containerd[1951]: time="2025-03-17T17:52:37.090490341Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Mar 17 17:52:37.090668 containerd[1951]: time="2025-03-17T17:52:37.090634641Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 17:52:37.090765 containerd[1951]: time="2025-03-17T17:52:37.090739557Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." 
type=io.containerd.snapshotter.v1 Mar 17 17:52:37.091006 containerd[1951]: time="2025-03-17T17:52:37.090977469Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Mar 17 17:52:37.091520 containerd[1951]: time="2025-03-17T17:52:37.091488021Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Mar 17 17:52:37.091848 containerd[1951]: time="2025-03-17T17:52:37.091816689Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 17:52:37.091948 containerd[1951]: time="2025-03-17T17:52:37.091922877Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Mar 17 17:52:37.092222 containerd[1951]: time="2025-03-17T17:52:37.092194797Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Mar 17 17:52:37.092398 containerd[1951]: time="2025-03-17T17:52:37.092372001Z" level=info msg="metadata content store policy set" policy=shared Mar 17 17:52:37.096773 containerd[1951]: time="2025-03-17T17:52:37.096726861Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Mar 17 17:52:37.097001 containerd[1951]: time="2025-03-17T17:52:37.096972561Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Mar 17 17:52:37.097308 containerd[1951]: time="2025-03-17T17:52:37.097269405Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Mar 17 17:52:37.098260 containerd[1951]: time="2025-03-17T17:52:37.098215617Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Mar 17 17:52:37.098316 containerd[1951]: time="2025-03-17T17:52:37.098284305Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Mar 17 17:52:37.100256 containerd[1951]: time="2025-03-17T17:52:37.100208721Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Mar 17 17:52:37.100834 containerd[1951]: time="2025-03-17T17:52:37.100796301Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Mar 17 17:52:37.102283 containerd[1951]: time="2025-03-17T17:52:37.102229341Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Mar 17 17:52:37.102439 containerd[1951]: time="2025-03-17T17:52:37.102364197Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Mar 17 17:52:37.102590 containerd[1951]: time="2025-03-17T17:52:37.102408249Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Mar 17 17:52:37.102590 containerd[1951]: time="2025-03-17T17:52:37.102534237Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Mar 17 17:52:37.102795 containerd[1951]: time="2025-03-17T17:52:37.102567957Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." 
type=io.containerd.service.v1 Mar 17 17:52:37.102795 containerd[1951]: time="2025-03-17T17:52:37.102746541Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Mar 17 17:52:37.103092 containerd[1951]: time="2025-03-17T17:52:37.102924897Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Mar 17 17:52:37.103092 containerd[1951]: time="2025-03-17T17:52:37.102971277Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Mar 17 17:52:37.103092 containerd[1951]: time="2025-03-17T17:52:37.103028001Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Mar 17 17:52:37.103451 containerd[1951]: time="2025-03-17T17:52:37.103289097Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Mar 17 17:52:37.103451 containerd[1951]: time="2025-03-17T17:52:37.103326585Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Mar 17 17:52:37.103451 containerd[1951]: time="2025-03-17T17:52:37.103392945Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Mar 17 17:52:37.103686 containerd[1951]: time="2025-03-17T17:52:37.103427121Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Mar 17 17:52:37.103686 containerd[1951]: time="2025-03-17T17:52:37.103636845Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Mar 17 17:52:37.103975 containerd[1951]: time="2025-03-17T17:52:37.103807605Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Mar 17 17:52:37.103975 containerd[1951]: time="2025-03-17T17:52:37.103847277Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Mar 17 17:52:37.103975 containerd[1951]: time="2025-03-17T17:52:37.103903197Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Mar 17 17:52:37.103975 containerd[1951]: time="2025-03-17T17:52:37.103933641Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Mar 17 17:52:37.104347 containerd[1951]: time="2025-03-17T17:52:37.104188725Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Mar 17 17:52:37.104347 containerd[1951]: time="2025-03-17T17:52:37.104230797Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Mar 17 17:52:37.104347 containerd[1951]: time="2025-03-17T17:52:37.104291025Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Mar 17 17:52:37.104593 containerd[1951]: time="2025-03-17T17:52:37.104327061Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Mar 17 17:52:37.104593 containerd[1951]: time="2025-03-17T17:52:37.104542101Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Mar 17 17:52:37.104795 containerd[1951]: time="2025-03-17T17:52:37.104684757Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." 
type=io.containerd.grpc.v1 Mar 17 17:52:37.104795 containerd[1951]: time="2025-03-17T17:52:37.104723241Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Mar 17 17:52:37.105097 containerd[1951]: time="2025-03-17T17:52:37.104930313Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Mar 17 17:52:37.105097 containerd[1951]: time="2025-03-17T17:52:37.104970189Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Mar 17 17:52:37.105097 containerd[1951]: time="2025-03-17T17:52:37.105027165Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Mar 17 17:52:37.105615 containerd[1951]: time="2025-03-17T17:52:37.105397881Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Mar 17 17:52:37.105615 containerd[1951]: time="2025-03-17T17:52:37.105474045Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Mar 17 17:52:37.105810 containerd[1951]: time="2025-03-17T17:52:37.105502353Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Mar 17 17:52:37.105810 containerd[1951]: time="2025-03-17T17:52:37.105762873Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Mar 17 17:52:37.105999 containerd[1951]: time="2025-03-17T17:52:37.105944637Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Mar 17 17:52:37.106169 containerd[1951]: time="2025-03-17T17:52:37.106085277Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Mar 17 17:52:37.106169 containerd[1951]: time="2025-03-17T17:52:37.106117953Z" level=info msg="NRI interface is disabled by configuration." Mar 17 17:52:37.106443 containerd[1951]: time="2025-03-17T17:52:37.106272957Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Mar 17 17:52:37.107200 containerd[1951]: time="2025-03-17T17:52:37.107073921Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Mar 17 17:52:37.107675 containerd[1951]: time="2025-03-17T17:52:37.107354253Z" level=info msg="Connect containerd service" Mar 17 17:52:37.107675 containerd[1951]: time="2025-03-17T17:52:37.107419881Z" level=info msg="using legacy CRI server" Mar 17 17:52:37.107675 containerd[1951]: time="2025-03-17T17:52:37.107437653Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 17 17:52:37.110183 containerd[1951]: time="2025-03-17T17:52:37.108039753Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Mar 17 17:52:37.110183 containerd[1951]: time="2025-03-17T17:52:37.109081401Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 17 17:52:37.110183 
containerd[1951]: time="2025-03-17T17:52:37.109606065Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 17 17:52:37.110183 containerd[1951]: time="2025-03-17T17:52:37.109692453Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 17 17:52:37.110183 containerd[1951]: time="2025-03-17T17:52:37.109784937Z" level=info msg="Start subscribing containerd event" Mar 17 17:52:37.110183 containerd[1951]: time="2025-03-17T17:52:37.109852857Z" level=info msg="Start recovering state" Mar 17 17:52:37.110183 containerd[1951]: time="2025-03-17T17:52:37.109966065Z" level=info msg="Start event monitor" Mar 17 17:52:37.110183 containerd[1951]: time="2025-03-17T17:52:37.109989285Z" level=info msg="Start snapshots syncer" Mar 17 17:52:37.110183 containerd[1951]: time="2025-03-17T17:52:37.110012793Z" level=info msg="Start cni network conf syncer for default" Mar 17 17:52:37.110183 containerd[1951]: time="2025-03-17T17:52:37.110033001Z" level=info msg="Start streaming server" Mar 17 17:52:37.111329 systemd[1]: Started containerd.service - containerd container runtime. Mar 17 17:52:37.115837 containerd[1951]: time="2025-03-17T17:52:37.115786641Z" level=info msg="containerd successfully booted in 0.231824s" Mar 17 17:52:37.181130 ntpd[1927]: bind(24) AF_INET6 fe80::4d4:73ff:fe1f:a36b%2#123 flags 0x11 failed: Cannot assign requested address Mar 17 17:52:37.181676 ntpd[1927]: 17 Mar 17:52:37 ntpd[1927]: bind(24) AF_INET6 fe80::4d4:73ff:fe1f:a36b%2#123 flags 0x11 failed: Cannot assign requested address Mar 17 17:52:37.181764 ntpd[1927]: unable to create socket on eth0 (6) for fe80::4d4:73ff:fe1f:a36b%2#123 Mar 17 17:52:37.181930 ntpd[1927]: 17 Mar 17:52:37 ntpd[1927]: unable to create socket on eth0 (6) for fe80::4d4:73ff:fe1f:a36b%2#123 Mar 17 17:52:37.182002 ntpd[1927]: failed to init interface for address fe80::4d4:73ff:fe1f:a36b%2 Mar 17 17:52:37.182136 ntpd[1927]: 17 Mar 17:52:37 ntpd[1927]: failed to init interface for address fe80::4d4:73ff:fe1f:a36b%2 Mar 17 17:52:37.587359 systemd-networkd[1855]: eth0: Gained IPv6LL Mar 17 17:52:37.592666 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 17 17:52:37.596854 systemd[1]: Reached target network-online.target - Network is Online. Mar 17 17:52:37.608576 systemd[1]: Started amazon-ssm-agent.service - amazon-ssm-agent. Mar 17 17:52:37.624179 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:52:37.639353 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 17 17:52:37.703394 amazon-ssm-agent[2126]: Initializing new seelog logger Mar 17 17:52:37.705176 amazon-ssm-agent[2126]: New Seelog Logger Creation Complete Mar 17 17:52:37.705176 amazon-ssm-agent[2126]: 2025/03/17 17:52:37 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Mar 17 17:52:37.705176 amazon-ssm-agent[2126]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Mar 17 17:52:37.705176 amazon-ssm-agent[2126]: 2025/03/17 17:52:37 processing appconfig overrides Mar 17 17:52:37.705176 amazon-ssm-agent[2126]: 2025/03/17 17:52:37 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Mar 17 17:52:37.705176 amazon-ssm-agent[2126]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Mar 17 17:52:37.705176 amazon-ssm-agent[2126]: 2025/03/17 17:52:37 processing appconfig overrides Mar 17 17:52:37.705498 amazon-ssm-agent[2126]: 2025/03/17 17:52:37 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. 
Mar 17 17:52:37.705498 amazon-ssm-agent[2126]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Mar 17 17:52:37.705498 amazon-ssm-agent[2126]: 2025/03/17 17:52:37 processing appconfig overrides Mar 17 17:52:37.706414 amazon-ssm-agent[2126]: 2025-03-17 17:52:37 INFO Proxy environment variables: Mar 17 17:52:37.710197 amazon-ssm-agent[2126]: 2025/03/17 17:52:37 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Mar 17 17:52:37.710197 amazon-ssm-agent[2126]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Mar 17 17:52:37.710197 amazon-ssm-agent[2126]: 2025/03/17 17:52:37 processing appconfig overrides Mar 17 17:52:37.745257 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 17 17:52:37.807237 amazon-ssm-agent[2126]: 2025-03-17 17:52:37 INFO no_proxy: Mar 17 17:52:37.905709 amazon-ssm-agent[2126]: 2025-03-17 17:52:37 INFO https_proxy: Mar 17 17:52:38.004278 amazon-ssm-agent[2126]: 2025-03-17 17:52:37 INFO http_proxy: Mar 17 17:52:38.105315 amazon-ssm-agent[2126]: 2025-03-17 17:52:37 INFO Checking if agent identity type OnPrem can be assumed Mar 17 17:52:38.203237 amazon-ssm-agent[2126]: 2025-03-17 17:52:37 INFO Checking if agent identity type EC2 can be assumed Mar 17 17:52:38.304696 amazon-ssm-agent[2126]: 2025-03-17 17:52:37 INFO Agent will take identity from EC2 Mar 17 17:52:38.323390 amazon-ssm-agent[2126]: 2025-03-17 17:52:37 INFO [amazon-ssm-agent] using named pipe channel for IPC Mar 17 17:52:38.323780 amazon-ssm-agent[2126]: 2025-03-17 17:52:37 INFO [amazon-ssm-agent] using named pipe channel for IPC Mar 17 17:52:38.323780 amazon-ssm-agent[2126]: 2025-03-17 17:52:37 INFO [amazon-ssm-agent] using named pipe channel for IPC Mar 17 17:52:38.323780 amazon-ssm-agent[2126]: 2025-03-17 17:52:37 INFO [amazon-ssm-agent] amazon-ssm-agent - v3.2.0.0 Mar 17 17:52:38.323780 amazon-ssm-agent[2126]: 2025-03-17 17:52:37 INFO [amazon-ssm-agent] OS: linux, Arch: arm64 Mar 17 17:52:38.324233 amazon-ssm-agent[2126]: 2025-03-17 17:52:37 INFO [amazon-ssm-agent] Starting Core Agent Mar 17 17:52:38.324233 amazon-ssm-agent[2126]: 2025-03-17 17:52:37 INFO [amazon-ssm-agent] registrar detected. Attempting registration Mar 17 17:52:38.324233 amazon-ssm-agent[2126]: 2025-03-17 17:52:37 INFO [Registrar] Starting registrar module Mar 17 17:52:38.324233 amazon-ssm-agent[2126]: 2025-03-17 17:52:37 INFO [EC2Identity] no registration info found for ec2 instance, attempting registration Mar 17 17:52:38.324233 amazon-ssm-agent[2126]: 2025-03-17 17:52:38 INFO [EC2Identity] EC2 registration was successful. Mar 17 17:52:38.324233 amazon-ssm-agent[2126]: 2025-03-17 17:52:38 INFO [CredentialRefresher] credentialRefresher has started Mar 17 17:52:38.324233 amazon-ssm-agent[2126]: 2025-03-17 17:52:38 INFO [CredentialRefresher] Starting credentials refresher loop Mar 17 17:52:38.324233 amazon-ssm-agent[2126]: 2025-03-17 17:52:38 INFO EC2RoleProvider Successfully connected with instance profile role credentials Mar 17 17:52:38.404715 amazon-ssm-agent[2126]: 2025-03-17 17:52:38 INFO [CredentialRefresher] Next credential rotation will be in 30.133305955033332 minutes Mar 17 17:52:38.632177 sshd_keygen[1967]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 17 17:52:38.671720 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 17 17:52:38.681658 systemd[1]: Starting issuegen.service - Generate /run/issue... 
Mar 17 17:52:38.695181 systemd[1]: Started sshd@0-172.31.27.21:22-147.75.109.163:38972.service - OpenSSH per-connection server daemon (147.75.109.163:38972). Mar 17 17:52:38.712700 systemd[1]: issuegen.service: Deactivated successfully. Mar 17 17:52:38.714315 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 17 17:52:38.729489 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 17 17:52:38.763434 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 17 17:52:38.777925 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 17 17:52:38.790753 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Mar 17 17:52:38.793234 systemd[1]: Reached target getty.target - Login Prompts. Mar 17 17:52:38.990927 sshd[2154]: Accepted publickey for core from 147.75.109.163 port 38972 ssh2: RSA SHA256:jWEtc9YqmtsTw64J9pPbetIEjzOA+BC+HxLjpNhjZC0 Mar 17 17:52:38.993399 sshd-session[2154]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:52:39.006983 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 17 17:52:39.018653 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 17 17:52:39.039540 systemd-logind[1931]: New session 1 of user core. Mar 17 17:52:39.050315 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 17 17:52:39.062732 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 17 17:52:39.084584 (systemd)[2165]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 17 17:52:39.092209 systemd-logind[1931]: New session c1 of user core. Mar 17 17:52:39.366272 amazon-ssm-agent[2126]: 2025-03-17 17:52:39 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker is not running, starting worker process Mar 17 17:52:39.385437 systemd[2165]: Queued start job for default target default.target. Mar 17 17:52:39.394779 systemd[2165]: Created slice app.slice - User Application Slice. Mar 17 17:52:39.394846 systemd[2165]: Reached target paths.target - Paths. Mar 17 17:52:39.394934 systemd[2165]: Reached target timers.target - Timers. Mar 17 17:52:39.398623 systemd[2165]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 17 17:52:39.439133 systemd[2165]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 17 17:52:39.439413 systemd[2165]: Reached target sockets.target - Sockets. Mar 17 17:52:39.439499 systemd[2165]: Reached target basic.target - Basic System. Mar 17 17:52:39.439588 systemd[2165]: Reached target default.target - Main User Target. Mar 17 17:52:39.439648 systemd[2165]: Startup finished in 333ms. Mar 17 17:52:39.440404 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 17 17:52:39.452521 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 17 17:52:39.468308 amazon-ssm-agent[2126]: 2025-03-17 17:52:39 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker (pid:2172) started Mar 17 17:52:39.570379 amazon-ssm-agent[2126]: 2025-03-17 17:52:39 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] Monitor long running worker health every 60 seconds Mar 17 17:52:39.620151 systemd[1]: Started sshd@1-172.31.27.21:22-147.75.109.163:38976.service - OpenSSH per-connection server daemon (147.75.109.163:38976). 
Mar 17 17:52:39.833740 sshd[2186]: Accepted publickey for core from 147.75.109.163 port 38976 ssh2: RSA SHA256:jWEtc9YqmtsTw64J9pPbetIEjzOA+BC+HxLjpNhjZC0 Mar 17 17:52:39.836675 sshd-session[2186]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:52:39.844255 systemd-logind[1931]: New session 2 of user core. Mar 17 17:52:39.851455 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 17 17:52:39.983672 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:52:39.987098 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 17 17:52:39.990628 sshd-session[2186]: pam_unix(sshd:session): session closed for user core Mar 17 17:52:39.992343 sshd[2189]: Connection closed by 147.75.109.163 port 38976 Mar 17 17:52:39.993111 systemd[1]: Startup finished in 1.071s (kernel) + 8.348s (initrd) + 9.912s (userspace) = 19.332s. Mar 17 17:52:40.001857 (kubelet)[2196]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 17 17:52:40.007477 systemd[1]: sshd@1-172.31.27.21:22-147.75.109.163:38976.service: Deactivated successfully. Mar 17 17:52:40.025648 systemd[1]: session-2.scope: Deactivated successfully. Mar 17 17:52:40.030009 systemd-logind[1931]: Session 2 logged out. Waiting for processes to exit. Mar 17 17:52:40.066685 systemd[1]: Started sshd@2-172.31.27.21:22-147.75.109.163:38978.service - OpenSSH per-connection server daemon (147.75.109.163:38978). Mar 17 17:52:40.069018 systemd-logind[1931]: Removed session 2. Mar 17 17:52:40.181084 ntpd[1927]: Listen normally on 7 eth0 [fe80::4d4:73ff:fe1f:a36b%2]:123 Mar 17 17:52:40.181745 ntpd[1927]: 17 Mar 17:52:40 ntpd[1927]: Listen normally on 7 eth0 [fe80::4d4:73ff:fe1f:a36b%2]:123 Mar 17 17:52:40.249305 sshd[2204]: Accepted publickey for core from 147.75.109.163 port 38978 ssh2: RSA SHA256:jWEtc9YqmtsTw64J9pPbetIEjzOA+BC+HxLjpNhjZC0 Mar 17 17:52:40.253109 sshd-session[2204]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:52:40.263213 systemd-logind[1931]: New session 3 of user core. Mar 17 17:52:40.271494 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 17 17:52:40.397192 sshd[2211]: Connection closed by 147.75.109.163 port 38978 Mar 17 17:52:40.398110 sshd-session[2204]: pam_unix(sshd:session): session closed for user core Mar 17 17:52:40.404824 systemd[1]: sshd@2-172.31.27.21:22-147.75.109.163:38978.service: Deactivated successfully. Mar 17 17:52:40.410452 systemd[1]: session-3.scope: Deactivated successfully. Mar 17 17:52:40.413472 systemd-logind[1931]: Session 3 logged out. Waiting for processes to exit. Mar 17 17:52:40.415655 systemd-logind[1931]: Removed session 3. Mar 17 17:52:41.139017 kubelet[2196]: E0317 17:52:41.138920 2196 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 17:52:41.143576 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 17:52:41.143944 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 17:52:41.145360 systemd[1]: kubelet.service: Consumed 1.287s CPU time, 251.5M memory peak. Mar 17 17:52:43.596942 systemd-resolved[1857]: Clock change detected. Flushing caches. 
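The earlier ntpd errors ("Cannot assign requested address" for fe80::4d4:73ff:fe1f:a36b%2) clear up once systemd-networkd reports that eth0 gained its IPv6 link-local address, after which ntpd opens listener 7 above. Binding to a link-local address needs both the address to actually be present on the interface and the scope (zone) index that the %2 suffix encodes. A small Python illustration of the same bind, assuming an interface named eth0 and an ephemeral port so it can run unprivileged:

    import socket

    ifname = "eth0"                        # assumed interface name
    scope = socket.if_nametoindex(ifname)  # numeric zone index (the "%2" part)

    # Replace with the link-local address actually assigned to the interface.
    lladdr = "fe80::4d4:73ff:fe1f:a36b"

    s = socket.socket(socket.AF_INET6, socket.SOCK_DGRAM)
    try:
        # Port 0 = ephemeral; ntpd itself binds port 123, which needs privileges.
        s.bind((lladdr, 0, 0, scope))
        print("bound to", s.getsockname())
    except OSError as e:
        # Before the address is assigned this fails with EADDRNOTAVAIL
        # ("Cannot assign requested address"), as in the log above.
        print("bind failed:", e)
    finally:
        s.close()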
Mar 17 17:52:50.854186 systemd[1]: Started sshd@3-172.31.27.21:22-147.75.109.163:45724.service - OpenSSH per-connection server daemon (147.75.109.163:45724). Mar 17 17:52:51.025310 sshd[2220]: Accepted publickey for core from 147.75.109.163 port 45724 ssh2: RSA SHA256:jWEtc9YqmtsTw64J9pPbetIEjzOA+BC+HxLjpNhjZC0 Mar 17 17:52:51.027708 sshd-session[2220]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:52:51.035781 systemd-logind[1931]: New session 4 of user core. Mar 17 17:52:51.044934 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 17 17:52:51.166113 sshd[2222]: Connection closed by 147.75.109.163 port 45724 Mar 17 17:52:51.167234 sshd-session[2220]: pam_unix(sshd:session): session closed for user core Mar 17 17:52:51.174046 systemd-logind[1931]: Session 4 logged out. Waiting for processes to exit. Mar 17 17:52:51.174547 systemd[1]: sshd@3-172.31.27.21:22-147.75.109.163:45724.service: Deactivated successfully. Mar 17 17:52:51.178927 systemd[1]: session-4.scope: Deactivated successfully. Mar 17 17:52:51.180487 systemd-logind[1931]: Removed session 4. Mar 17 17:52:51.201160 systemd[1]: Started sshd@4-172.31.27.21:22-147.75.109.163:45726.service - OpenSSH per-connection server daemon (147.75.109.163:45726). Mar 17 17:52:51.388515 sshd[2228]: Accepted publickey for core from 147.75.109.163 port 45726 ssh2: RSA SHA256:jWEtc9YqmtsTw64J9pPbetIEjzOA+BC+HxLjpNhjZC0 Mar 17 17:52:51.390922 sshd-session[2228]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:52:51.399562 systemd-logind[1931]: New session 5 of user core. Mar 17 17:52:51.406971 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 17 17:52:51.523215 sshd[2230]: Connection closed by 147.75.109.163 port 45726 Mar 17 17:52:51.523986 sshd-session[2228]: pam_unix(sshd:session): session closed for user core Mar 17 17:52:51.528642 systemd[1]: session-5.scope: Deactivated successfully. Mar 17 17:52:51.529835 systemd[1]: sshd@4-172.31.27.21:22-147.75.109.163:45726.service: Deactivated successfully. Mar 17 17:52:51.534910 systemd-logind[1931]: Session 5 logged out. Waiting for processes to exit. Mar 17 17:52:51.536540 systemd-logind[1931]: Removed session 5. Mar 17 17:52:51.566180 systemd[1]: Started sshd@5-172.31.27.21:22-147.75.109.163:45730.service - OpenSSH per-connection server daemon (147.75.109.163:45730). Mar 17 17:52:51.568103 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 17 17:52:51.573102 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:52:51.745482 sshd[2236]: Accepted publickey for core from 147.75.109.163 port 45730 ssh2: RSA SHA256:jWEtc9YqmtsTw64J9pPbetIEjzOA+BC+HxLjpNhjZC0 Mar 17 17:52:51.748033 sshd-session[2236]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:52:51.759791 systemd-logind[1931]: New session 6 of user core. Mar 17 17:52:51.766009 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 17 17:52:51.894572 sshd[2241]: Connection closed by 147.75.109.163 port 45730 Mar 17 17:52:51.895442 sshd-session[2236]: pam_unix(sshd:session): session closed for user core Mar 17 17:52:51.901415 systemd[1]: session-6.scope: Deactivated successfully. Mar 17 17:52:51.901847 systemd-logind[1931]: Session 6 logged out. Waiting for processes to exit. Mar 17 17:52:51.905453 systemd[1]: sshd@5-172.31.27.21:22-147.75.109.163:45730.service: Deactivated successfully. 
Mar 17 17:52:51.913302 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:52:51.919435 (kubelet)[2248]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 17 17:52:51.932404 systemd-logind[1931]: Removed session 6. Mar 17 17:52:51.938942 systemd[1]: Started sshd@6-172.31.27.21:22-147.75.109.163:45736.service - OpenSSH per-connection server daemon (147.75.109.163:45736). Mar 17 17:52:52.001483 kubelet[2248]: E0317 17:52:52.001352 2248 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 17:52:52.008112 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 17:52:52.008623 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 17:52:52.009296 systemd[1]: kubelet.service: Consumed 299ms CPU time, 102.3M memory peak. Mar 17 17:52:52.130870 sshd[2256]: Accepted publickey for core from 147.75.109.163 port 45736 ssh2: RSA SHA256:jWEtc9YqmtsTw64J9pPbetIEjzOA+BC+HxLjpNhjZC0 Mar 17 17:52:52.133170 sshd-session[2256]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:52:52.141237 systemd-logind[1931]: New session 7 of user core. Mar 17 17:52:52.150947 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 17 17:52:52.289643 sudo[2262]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 17 17:52:52.290339 sudo[2262]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 17 17:52:52.306319 sudo[2262]: pam_unix(sudo:session): session closed for user root Mar 17 17:52:52.329136 sshd[2261]: Connection closed by 147.75.109.163 port 45736 Mar 17 17:52:52.330285 sshd-session[2256]: pam_unix(sshd:session): session closed for user core Mar 17 17:52:52.336164 systemd[1]: sshd@6-172.31.27.21:22-147.75.109.163:45736.service: Deactivated successfully. Mar 17 17:52:52.339538 systemd[1]: session-7.scope: Deactivated successfully. Mar 17 17:52:52.343071 systemd-logind[1931]: Session 7 logged out. Waiting for processes to exit. Mar 17 17:52:52.345316 systemd-logind[1931]: Removed session 7. Mar 17 17:52:52.378453 systemd[1]: Started sshd@7-172.31.27.21:22-147.75.109.163:45738.service - OpenSSH per-connection server daemon (147.75.109.163:45738). Mar 17 17:52:52.557213 sshd[2268]: Accepted publickey for core from 147.75.109.163 port 45738 ssh2: RSA SHA256:jWEtc9YqmtsTw64J9pPbetIEjzOA+BC+HxLjpNhjZC0 Mar 17 17:52:52.559808 sshd-session[2268]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:52:52.570088 systemd-logind[1931]: New session 8 of user core. Mar 17 17:52:52.583955 systemd[1]: Started session-8.scope - Session 8 of User core. 
Mar 17 17:52:52.689186 sudo[2272]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 17 17:52:52.689855 sudo[2272]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 17 17:52:52.695964 sudo[2272]: pam_unix(sudo:session): session closed for user root Mar 17 17:52:52.705962 sudo[2271]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 17 17:52:52.706581 sudo[2271]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 17 17:52:52.729416 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 17 17:52:52.783436 augenrules[2294]: No rules Mar 17 17:52:52.785502 systemd[1]: audit-rules.service: Deactivated successfully. Mar 17 17:52:52.785983 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 17 17:52:52.789010 sudo[2271]: pam_unix(sudo:session): session closed for user root Mar 17 17:52:52.812404 sshd[2270]: Connection closed by 147.75.109.163 port 45738 Mar 17 17:52:52.815360 sshd-session[2268]: pam_unix(sshd:session): session closed for user core Mar 17 17:52:52.819772 systemd[1]: session-8.scope: Deactivated successfully. Mar 17 17:52:52.821214 systemd[1]: sshd@7-172.31.27.21:22-147.75.109.163:45738.service: Deactivated successfully. Mar 17 17:52:52.827104 systemd-logind[1931]: Session 8 logged out. Waiting for processes to exit. Mar 17 17:52:52.828923 systemd-logind[1931]: Removed session 8. Mar 17 17:52:52.855200 systemd[1]: Started sshd@8-172.31.27.21:22-147.75.109.163:45740.service - OpenSSH per-connection server daemon (147.75.109.163:45740). Mar 17 17:52:53.043796 sshd[2303]: Accepted publickey for core from 147.75.109.163 port 45740 ssh2: RSA SHA256:jWEtc9YqmtsTw64J9pPbetIEjzOA+BC+HxLjpNhjZC0 Mar 17 17:52:53.046126 sshd-session[2303]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:52:53.054898 systemd-logind[1931]: New session 9 of user core. Mar 17 17:52:53.062945 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 17 17:52:53.167001 sudo[2306]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 17 17:52:53.167626 sudo[2306]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 17 17:52:54.176273 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:52:54.177397 systemd[1]: kubelet.service: Consumed 299ms CPU time, 102.3M memory peak. Mar 17 17:52:54.186211 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:52:54.247331 systemd[1]: Reload requested from client PID 2339 ('systemctl') (unit session-9.scope)... Mar 17 17:52:54.247556 systemd[1]: Reloading... Mar 17 17:52:54.520756 zram_generator::config[2387]: No configuration found. Mar 17 17:52:54.744056 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 17:52:54.967116 systemd[1]: Reloading finished in 718 ms. Mar 17 17:52:55.061356 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:52:55.078295 (kubelet)[2438]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 17 17:52:55.081594 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... 
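The kubelet start attempts above exit with status 1 because /var/lib/kubelet/config.yaml does not exist yet; on a kubeadm-provisioned node that file is written by kubeadm init/join, so the restart loop is expected until that happens. Purely as an illustration of the file it is looking for, a sketch that drops a minimal KubeletConfiguration in place (field names are from the kubelet.config.k8s.io/v1beta1 API; cgroupDriver: systemd and the containerd socket path match the SystemdCgroup runc option and ContainerdEndpoint visible in the containerd config earlier in this log):

    from pathlib import Path

    # Hypothetical minimal config; a real node should let kubeadm generate this.
    KUBELET_CONFIG = """\
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd
    containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
    """

    path = Path("/var/lib/kubelet/config.yaml")
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(KUBELET_CONFIG)
    print("wrote", path)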
Mar 17 17:52:55.082839 systemd[1]: kubelet.service: Deactivated successfully. Mar 17 17:52:55.083366 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:52:55.083448 systemd[1]: kubelet.service: Consumed 220ms CPU time, 90.1M memory peak. Mar 17 17:52:55.094285 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:52:55.418957 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:52:55.432229 (kubelet)[2450]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 17 17:52:55.507468 kubelet[2450]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 17:52:55.507468 kubelet[2450]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 17 17:52:55.507468 kubelet[2450]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 17:52:55.508055 kubelet[2450]: I0317 17:52:55.507577 2450 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 17 17:52:57.173644 kubelet[2450]: I0317 17:52:57.173593 2450 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" Mar 17 17:52:57.175724 kubelet[2450]: I0317 17:52:57.174244 2450 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 17 17:52:57.175724 kubelet[2450]: I0317 17:52:57.174726 2450 server.go:954] "Client rotation is on, will bootstrap in background" Mar 17 17:52:57.230719 kubelet[2450]: I0317 17:52:57.230650 2450 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 17 17:52:57.242745 kubelet[2450]: E0317 17:52:57.242617 2450 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Mar 17 17:52:57.242745 kubelet[2450]: I0317 17:52:57.242711 2450 server.go:1421] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Mar 17 17:52:57.247945 kubelet[2450]: I0317 17:52:57.247905 2450 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 17 17:52:57.250100 kubelet[2450]: I0317 17:52:57.249976 2450 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 17 17:52:57.250712 kubelet[2450]: I0317 17:52:57.250055 2450 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"172.31.27.21","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 17 17:52:57.250712 kubelet[2450]: I0317 17:52:57.250390 2450 topology_manager.go:138] "Creating topology manager with none policy" Mar 17 17:52:57.250712 kubelet[2450]: I0317 17:52:57.250413 2450 container_manager_linux.go:304] "Creating device plugin manager" Mar 17 17:52:57.251062 kubelet[2450]: I0317 17:52:57.250740 2450 state_mem.go:36] "Initialized new in-memory state store" Mar 17 17:52:57.258613 kubelet[2450]: I0317 17:52:57.258576 2450 kubelet.go:446] "Attempting to sync node with API server" Mar 17 17:52:57.260392 kubelet[2450]: I0317 17:52:57.259891 2450 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 17 17:52:57.260392 kubelet[2450]: I0317 17:52:57.259938 2450 kubelet.go:352] "Adding apiserver pod source" Mar 17 17:52:57.260392 kubelet[2450]: I0317 17:52:57.259962 2450 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 17 17:52:57.262877 kubelet[2450]: E0317 17:52:57.262819 2450 file.go:98] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:52:57.263077 kubelet[2450]: E0317 17:52:57.262912 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:52:57.266747 kubelet[2450]: I0317 17:52:57.265963 2450 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Mar 17 17:52:57.267099 kubelet[2450]: I0317 17:52:57.267055 2450 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 17 17:52:57.267298 kubelet[2450]: W0317 17:52:57.267278 2450 probe.go:272] 
Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 17 17:52:57.268550 kubelet[2450]: I0317 17:52:57.268520 2450 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 17 17:52:57.268794 kubelet[2450]: I0317 17:52:57.268764 2450 server.go:1287] "Started kubelet" Mar 17 17:52:57.269879 kubelet[2450]: I0317 17:52:57.269808 2450 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 17:52:57.272087 kubelet[2450]: I0317 17:52:57.271425 2450 server.go:490] "Adding debug handlers to kubelet server" Mar 17 17:52:57.277324 kubelet[2450]: I0317 17:52:57.277235 2450 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 17:52:57.277916 kubelet[2450]: I0317 17:52:57.277886 2450 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 17 17:52:57.279851 kubelet[2450]: I0317 17:52:57.279805 2450 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 17:52:57.283081 kubelet[2450]: I0317 17:52:57.283047 2450 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 17 17:52:57.288518 kubelet[2450]: E0317 17:52:57.288428 2450 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"172.31.27.21\" not found" Mar 17 17:52:57.288796 kubelet[2450]: I0317 17:52:57.288768 2450 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 17 17:52:57.289206 kubelet[2450]: I0317 17:52:57.289178 2450 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 17 17:52:57.289428 kubelet[2450]: I0317 17:52:57.289407 2450 reconciler.go:26] "Reconciler: start to sync state" Mar 17 17:52:57.299000 kubelet[2450]: E0317 17:52:57.298947 2450 kubelet.go:1561] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 17 17:52:57.300994 kubelet[2450]: I0317 17:52:57.300680 2450 factory.go:221] Registration of the systemd container factory successfully Mar 17 17:52:57.302717 kubelet[2450]: E0317 17:52:57.301084 2450 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{172.31.27.21.182da89252b7f2df default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:172.31.27.21,UID:172.31.27.21,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:172.31.27.21,},FirstTimestamp:2025-03-17 17:52:57.268728543 +0000 UTC m=+1.829354686,LastTimestamp:2025-03-17 17:52:57.268728543 +0000 UTC m=+1.829354686,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:172.31.27.21,}" Mar 17 17:52:57.302717 kubelet[2450]: I0317 17:52:57.301260 2450 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 17 17:52:57.303150 kubelet[2450]: W0317 17:52:57.301654 2450 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "172.31.27.21" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 17 17:52:57.303391 kubelet[2450]: E0317 17:52:57.303359 2450 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"172.31.27.21\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 17 17:52:57.303544 kubelet[2450]: W0317 17:52:57.301954 2450 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 17 17:52:57.303702 kubelet[2450]: E0317 17:52:57.303656 2450 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 17 17:52:57.310570 kubelet[2450]: I0317 17:52:57.310533 2450 factory.go:221] Registration of the containerd container factory successfully Mar 17 17:52:57.337042 kubelet[2450]: I0317 17:52:57.336876 2450 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 17 17:52:57.337042 kubelet[2450]: I0317 17:52:57.336910 2450 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 17 17:52:57.337042 kubelet[2450]: I0317 17:52:57.336946 2450 state_mem.go:36] "Initialized new in-memory state store" Mar 17 17:52:57.339704 kubelet[2450]: E0317 17:52:57.339463 2450 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"172.31.27.21\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Mar 17 17:52:57.339704 kubelet[2450]: W0317 17:52:57.339501 2450 
reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 17 17:52:57.339704 kubelet[2450]: E0317 17:52:57.339549 2450 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 17 17:52:57.339704 kubelet[2450]: E0317 17:52:57.339550 2450 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{172.31.27.21.182da8925481a43c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:172.31.27.21,UID:172.31.27.21,APIVersion:,ResourceVersion:,FieldPath:,},Reason:InvalidDiskCapacity,Message:invalid capacity 0 on image filesystem,Source:EventSource{Component:kubelet,Host:172.31.27.21,},FirstTimestamp:2025-03-17 17:52:57.2987239 +0000 UTC m=+1.859350031,LastTimestamp:2025-03-17 17:52:57.2987239 +0000 UTC m=+1.859350031,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:172.31.27.21,}" Mar 17 17:52:57.341108 kubelet[2450]: I0317 17:52:57.340552 2450 policy_none.go:49] "None policy: Start" Mar 17 17:52:57.341108 kubelet[2450]: I0317 17:52:57.340603 2450 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 17 17:52:57.341108 kubelet[2450]: I0317 17:52:57.340632 2450 state_mem.go:35] "Initializing new in-memory state store" Mar 17 17:52:57.350547 kubelet[2450]: E0317 17:52:57.350403 2450 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{172.31.27.21.182da892569d4fb8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:172.31.27.21,UID:172.31.27.21,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node 172.31.27.21 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:172.31.27.21,},FirstTimestamp:2025-03-17 17:52:57.334091704 +0000 UTC m=+1.894717823,LastTimestamp:2025-03-17 17:52:57.334091704 +0000 UTC m=+1.894717823,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:172.31.27.21,}" Mar 17 17:52:57.358404 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 17 17:52:57.381232 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
Mar 17 17:52:57.384118 kubelet[2450]: E0317 17:52:57.383665 2450 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{172.31.27.21.182da892569d9158 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:172.31.27.21,UID:172.31.27.21,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node 172.31.27.21 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:172.31.27.21,},FirstTimestamp:2025-03-17 17:52:57.334108504 +0000 UTC m=+1.894734623,LastTimestamp:2025-03-17 17:52:57.334108504 +0000 UTC m=+1.894734623,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:172.31.27.21,}" Mar 17 17:52:57.389931 kubelet[2450]: E0317 17:52:57.389870 2450 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"172.31.27.21\" not found" Mar 17 17:52:57.392233 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Mar 17 17:52:57.400428 kubelet[2450]: I0317 17:52:57.400387 2450 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 17 17:52:57.400718 kubelet[2450]: I0317 17:52:57.400668 2450 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 17 17:52:57.400835 kubelet[2450]: I0317 17:52:57.400711 2450 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 17 17:52:57.401973 kubelet[2450]: I0317 17:52:57.401882 2450 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 17 17:52:57.405739 kubelet[2450]: E0317 17:52:57.405269 2450 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 17 17:52:57.405739 kubelet[2450]: E0317 17:52:57.405339 2450 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"172.31.27.21\" not found" Mar 17 17:52:57.430785 kubelet[2450]: I0317 17:52:57.430358 2450 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 17 17:52:57.436157 kubelet[2450]: I0317 17:52:57.435928 2450 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 17 17:52:57.436157 kubelet[2450]: I0317 17:52:57.435976 2450 status_manager.go:227] "Starting to sync pod status with apiserver" Mar 17 17:52:57.436157 kubelet[2450]: I0317 17:52:57.436012 2450 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
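The "system:anonymous cannot list/create ..." errors above are the kubelet talking to the API server before its client-certificate bootstrap has finished; once rotation completes (the "Certificate rotation detected" message further down) the reflectors recover. The failing request is an ordinary list of Node objects, which can be reproduced out-of-band with the Kubernetes Python client to check whether a given kubeconfig is actually authorized; the /etc/kubernetes/kubelet.conf path below is an assumption about where the node's credentials live, not something shown in this log:

    from kubernetes import client, config          # pip install kubernetes
    from kubernetes.client.rest import ApiException

    # Load the same credentials the kubelet would use (path is an assumption).
    config.load_kube_config(config_file="/etc/kubernetes/kubelet.conf")

    v1 = client.CoreV1Api()
    try:
        nodes = v1.list_node()
        print("authorized; nodes:", [n.metadata.name for n in nodes.items])
    except ApiException as e:
        # A 403 here mirrors the "is forbidden" reflector errors in the log.
        print("list nodes failed:", e.status, e.reason)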
Mar 17 17:52:57.436157 kubelet[2450]: I0317 17:52:57.436027 2450 kubelet.go:2388] "Starting kubelet main sync loop" Mar 17 17:52:57.436157 kubelet[2450]: E0317 17:52:57.436153 2450 kubelet.go:2412] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Mar 17 17:52:57.501996 kubelet[2450]: I0317 17:52:57.501925 2450 kubelet_node_status.go:76] "Attempting to register node" node="172.31.27.21" Mar 17 17:52:57.532272 kubelet[2450]: I0317 17:52:57.532206 2450 kubelet_node_status.go:79] "Successfully registered node" node="172.31.27.21" Mar 17 17:52:57.532272 kubelet[2450]: E0317 17:52:57.532257 2450 kubelet_node_status.go:549] "Error updating node status, will retry" err="error getting node \"172.31.27.21\": node \"172.31.27.21\" not found" Mar 17 17:52:57.569961 kubelet[2450]: E0317 17:52:57.569897 2450 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"172.31.27.21\" not found" Mar 17 17:52:57.670199 kubelet[2450]: E0317 17:52:57.670135 2450 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"172.31.27.21\" not found" Mar 17 17:52:57.771225 kubelet[2450]: E0317 17:52:57.771167 2450 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"172.31.27.21\" not found" Mar 17 17:52:57.871874 kubelet[2450]: E0317 17:52:57.871826 2450 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"172.31.27.21\" not found" Mar 17 17:52:57.972607 kubelet[2450]: E0317 17:52:57.972549 2450 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"172.31.27.21\" not found" Mar 17 17:52:57.976957 sudo[2306]: pam_unix(sudo:session): session closed for user root Mar 17 17:52:57.999849 sshd[2305]: Connection closed by 147.75.109.163 port 45740 Mar 17 17:52:58.000734 sshd-session[2303]: pam_unix(sshd:session): session closed for user core Mar 17 17:52:58.006317 systemd[1]: sshd@8-172.31.27.21:22-147.75.109.163:45740.service: Deactivated successfully. Mar 17 17:52:58.010443 systemd[1]: session-9.scope: Deactivated successfully. Mar 17 17:52:58.011033 systemd[1]: session-9.scope: Consumed 694ms CPU time, 73.4M memory peak. Mar 17 17:52:58.014551 systemd-logind[1931]: Session 9 logged out. Waiting for processes to exit. Mar 17 17:52:58.016798 systemd-logind[1931]: Removed session 9. 
Mar 17 17:52:58.073495 kubelet[2450]: E0317 17:52:58.073341 2450 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"172.31.27.21\" not found" Mar 17 17:52:58.174118 kubelet[2450]: E0317 17:52:58.174056 2450 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"172.31.27.21\" not found" Mar 17 17:52:58.180307 kubelet[2450]: I0317 17:52:58.180270 2450 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 17 17:52:58.180507 kubelet[2450]: W0317 17:52:58.180465 2450 reflector.go:492] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Mar 17 17:52:58.263138 kubelet[2450]: E0317 17:52:58.263071 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:52:58.274963 kubelet[2450]: E0317 17:52:58.274902 2450 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"172.31.27.21\" not found" Mar 17 17:52:58.376506 kubelet[2450]: I0317 17:52:58.376353 2450 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.1.0/24" Mar 17 17:52:58.377194 containerd[1951]: time="2025-03-17T17:52:58.376913729Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 17 17:52:58.377782 kubelet[2450]: I0317 17:52:58.377598 2450 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.1.0/24" Mar 17 17:52:59.263775 kubelet[2450]: E0317 17:52:59.263720 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:52:59.263775 kubelet[2450]: I0317 17:52:59.263729 2450 apiserver.go:52] "Watching apiserver" Mar 17 17:52:59.274713 kubelet[2450]: E0317 17:52:59.273891 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s8xfx" podUID="8a712d71-7427-4200-94a3-b1d8ea1fe150" Mar 17 17:52:59.285538 systemd[1]: Created slice kubepods-besteffort-pod8521a20c_1ac1_4529_ae16_9c002f97336c.slice - libcontainer container kubepods-besteffort-pod8521a20c_1ac1_4529_ae16_9c002f97336c.slice. 
Mar 17 17:52:59.289908 kubelet[2450]: I0317 17:52:59.289848 2450 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 17 17:52:59.301542 kubelet[2450]: I0317 17:52:59.300505 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8a712d71-7427-4200-94a3-b1d8ea1fe150-socket-dir\") pod \"csi-node-driver-s8xfx\" (UID: \"8a712d71-7427-4200-94a3-b1d8ea1fe150\") " pod="calico-system/csi-node-driver-s8xfx" Mar 17 17:52:59.301542 kubelet[2450]: I0317 17:52:59.300564 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5-lib-modules\") pod \"calico-node-5sxs8\" (UID: \"4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5\") " pod="calico-system/calico-node-5sxs8" Mar 17 17:52:59.301542 kubelet[2450]: I0317 17:52:59.300605 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5-cni-net-dir\") pod \"calico-node-5sxs8\" (UID: \"4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5\") " pod="calico-system/calico-node-5sxs8" Mar 17 17:52:59.301542 kubelet[2450]: I0317 17:52:59.300644 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5-cni-log-dir\") pod \"calico-node-5sxs8\" (UID: \"4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5\") " pod="calico-system/calico-node-5sxs8" Mar 17 17:52:59.301542 kubelet[2450]: I0317 17:52:59.300717 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5-flexvol-driver-host\") pod \"calico-node-5sxs8\" (UID: \"4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5\") " pod="calico-system/calico-node-5sxs8" Mar 17 17:52:59.301923 kubelet[2450]: I0317 17:52:59.300761 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5-node-certs\") pod \"calico-node-5sxs8\" (UID: \"4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5\") " pod="calico-system/calico-node-5sxs8" Mar 17 17:52:59.301923 kubelet[2450]: I0317 17:52:59.300800 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5-var-lib-calico\") pod \"calico-node-5sxs8\" (UID: \"4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5\") " pod="calico-system/calico-node-5sxs8" Mar 17 17:52:59.301923 kubelet[2450]: I0317 17:52:59.300851 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5-cni-bin-dir\") pod \"calico-node-5sxs8\" (UID: \"4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5\") " pod="calico-system/calico-node-5sxs8" Mar 17 17:52:59.301923 kubelet[2450]: I0317 17:52:59.300896 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn897\" (UniqueName: \"kubernetes.io/projected/4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5-kube-api-access-sn897\") pod \"calico-node-5sxs8\" (UID: 
\"4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5\") " pod="calico-system/calico-node-5sxs8" Mar 17 17:52:59.301923 kubelet[2450]: I0317 17:52:59.300945 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8a712d71-7427-4200-94a3-b1d8ea1fe150-kubelet-dir\") pod \"csi-node-driver-s8xfx\" (UID: \"8a712d71-7427-4200-94a3-b1d8ea1fe150\") " pod="calico-system/csi-node-driver-s8xfx" Mar 17 17:52:59.302191 kubelet[2450]: I0317 17:52:59.300980 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8a712d71-7427-4200-94a3-b1d8ea1fe150-registration-dir\") pod \"csi-node-driver-s8xfx\" (UID: \"8a712d71-7427-4200-94a3-b1d8ea1fe150\") " pod="calico-system/csi-node-driver-s8xfx" Mar 17 17:52:59.302191 kubelet[2450]: I0317 17:52:59.301016 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/8521a20c-1ac1-4529-ae16-9c002f97336c-kube-proxy\") pod \"kube-proxy-kf4df\" (UID: \"8521a20c-1ac1-4529-ae16-9c002f97336c\") " pod="kube-system/kube-proxy-kf4df" Mar 17 17:52:59.302191 kubelet[2450]: I0317 17:52:59.301054 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8521a20c-1ac1-4529-ae16-9c002f97336c-xtables-lock\") pod \"kube-proxy-kf4df\" (UID: \"8521a20c-1ac1-4529-ae16-9c002f97336c\") " pod="kube-system/kube-proxy-kf4df" Mar 17 17:52:59.302191 kubelet[2450]: I0317 17:52:59.301089 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5-policysync\") pod \"calico-node-5sxs8\" (UID: \"4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5\") " pod="calico-system/calico-node-5sxs8" Mar 17 17:52:59.302191 kubelet[2450]: I0317 17:52:59.301123 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5-tigera-ca-bundle\") pod \"calico-node-5sxs8\" (UID: \"4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5\") " pod="calico-system/calico-node-5sxs8" Mar 17 17:52:59.302430 kubelet[2450]: I0317 17:52:59.301159 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5-var-run-calico\") pod \"calico-node-5sxs8\" (UID: \"4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5\") " pod="calico-system/calico-node-5sxs8" Mar 17 17:52:59.302430 kubelet[2450]: I0317 17:52:59.301203 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8521a20c-1ac1-4529-ae16-9c002f97336c-lib-modules\") pod \"kube-proxy-kf4df\" (UID: \"8521a20c-1ac1-4529-ae16-9c002f97336c\") " pod="kube-system/kube-proxy-kf4df" Mar 17 17:52:59.302430 kubelet[2450]: I0317 17:52:59.301240 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chxfr\" (UniqueName: \"kubernetes.io/projected/8521a20c-1ac1-4529-ae16-9c002f97336c-kube-api-access-chxfr\") pod \"kube-proxy-kf4df\" (UID: \"8521a20c-1ac1-4529-ae16-9c002f97336c\") " pod="kube-system/kube-proxy-kf4df" 
Mar 17 17:52:59.302430 kubelet[2450]: I0317 17:52:59.301276 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5-xtables-lock\") pod \"calico-node-5sxs8\" (UID: \"4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5\") " pod="calico-system/calico-node-5sxs8" Mar 17 17:52:59.302430 kubelet[2450]: I0317 17:52:59.301311 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/8a712d71-7427-4200-94a3-b1d8ea1fe150-varrun\") pod \"csi-node-driver-s8xfx\" (UID: \"8a712d71-7427-4200-94a3-b1d8ea1fe150\") " pod="calico-system/csi-node-driver-s8xfx" Mar 17 17:52:59.302651 kubelet[2450]: I0317 17:52:59.301353 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzb24\" (UniqueName: \"kubernetes.io/projected/8a712d71-7427-4200-94a3-b1d8ea1fe150-kube-api-access-nzb24\") pod \"csi-node-driver-s8xfx\" (UID: \"8a712d71-7427-4200-94a3-b1d8ea1fe150\") " pod="calico-system/csi-node-driver-s8xfx" Mar 17 17:52:59.309666 systemd[1]: Created slice kubepods-besteffort-pod4de372a2_0f36_4201_b1c6_c1cc1fbb5bf5.slice - libcontainer container kubepods-besteffort-pod4de372a2_0f36_4201_b1c6_c1cc1fbb5bf5.slice. Mar 17 17:52:59.407206 kubelet[2450]: E0317 17:52:59.407147 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:52:59.407332 kubelet[2450]: W0317 17:52:59.407191 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:52:59.407332 kubelet[2450]: E0317 17:52:59.407250 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:52:59.410595 kubelet[2450]: E0317 17:52:59.410548 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:52:59.410595 kubelet[2450]: W0317 17:52:59.410584 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:52:59.410891 kubelet[2450]: E0317 17:52:59.410618 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:52:59.436164 kubelet[2450]: E0317 17:52:59.436081 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:52:59.436164 kubelet[2450]: W0317 17:52:59.436115 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:52:59.438589 kubelet[2450]: E0317 17:52:59.438561 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:52:59.456992 kubelet[2450]: E0317 17:52:59.456953 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:52:59.457189 kubelet[2450]: W0317 17:52:59.457141 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:52:59.457550 kubelet[2450]: E0317 17:52:59.457529 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:52:59.457741 kubelet[2450]: W0317 17:52:59.457632 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:52:59.457741 kubelet[2450]: E0317 17:52:59.457664 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:52:59.457901 kubelet[2450]: E0317 17:52:59.457862 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:52:59.599480 containerd[1951]: time="2025-03-17T17:52:59.599328391Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-kf4df,Uid:8521a20c-1ac1-4529-ae16-9c002f97336c,Namespace:kube-system,Attempt:0,}" Mar 17 17:52:59.617660 containerd[1951]: time="2025-03-17T17:52:59.617503963Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5sxs8,Uid:4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5,Namespace:calico-system,Attempt:0,}" Mar 17 17:53:00.155778 containerd[1951]: time="2025-03-17T17:53:00.155568342Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:53:00.157487 containerd[1951]: time="2025-03-17T17:53:00.157427454Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:53:00.158707 containerd[1951]: time="2025-03-17T17:53:00.158593146Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269173" Mar 17 17:53:00.159754 containerd[1951]: time="2025-03-17T17:53:00.159627054Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:53:00.160661 containerd[1951]: time="2025-03-17T17:53:00.160586094Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 17 17:53:00.163824 containerd[1951]: time="2025-03-17T17:53:00.163778874Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:53:00.167253 containerd[1951]: time="2025-03-17T17:53:00.167036370Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id 
\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 567.571299ms" Mar 17 17:53:00.172388 containerd[1951]: time="2025-03-17T17:53:00.172324122Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 554.698395ms" Mar 17 17:53:00.264138 kubelet[2450]: E0317 17:53:00.264058 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:00.418314 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1419766044.mount: Deactivated successfully. Mar 17 17:53:00.437265 kubelet[2450]: E0317 17:53:00.437186 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s8xfx" podUID="8a712d71-7427-4200-94a3-b1d8ea1fe150" Mar 17 17:53:00.662526 containerd[1951]: time="2025-03-17T17:53:00.662011916Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:53:00.662526 containerd[1951]: time="2025-03-17T17:53:00.662155316Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:53:00.662526 containerd[1951]: time="2025-03-17T17:53:00.662191280Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:53:00.662526 containerd[1951]: time="2025-03-17T17:53:00.662348552Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:53:00.663523 containerd[1951]: time="2025-03-17T17:53:00.662680352Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:53:00.664190 containerd[1951]: time="2025-03-17T17:53:00.663603716Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:53:00.664504 containerd[1951]: time="2025-03-17T17:53:00.664344680Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:53:00.664952 containerd[1951]: time="2025-03-17T17:53:00.664869932Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:53:00.812081 systemd[1]: run-containerd-runc-k8s.io-b9caae3fe9315ab8ff465f633f9ce72001f9ffa970f5e455853e7bfaf4916d10-runc.kmBNmU.mount: Deactivated successfully. Mar 17 17:53:00.828020 systemd[1]: Started cri-containerd-b9caae3fe9315ab8ff465f633f9ce72001f9ffa970f5e455853e7bfaf4916d10.scope - libcontainer container b9caae3fe9315ab8ff465f633f9ce72001f9ffa970f5e455853e7bfaf4916d10. 
Mar 17 17:53:00.843200 systemd[1]: Started cri-containerd-8c28c10517df7e6ad9c36d4f7f578f8520e98314a9083d71fbbe5cf932395a6f.scope - libcontainer container 8c28c10517df7e6ad9c36d4f7f578f8520e98314a9083d71fbbe5cf932395a6f. Mar 17 17:53:00.896484 containerd[1951]: time="2025-03-17T17:53:00.896171553Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-kf4df,Uid:8521a20c-1ac1-4529-ae16-9c002f97336c,Namespace:kube-system,Attempt:0,} returns sandbox id \"b9caae3fe9315ab8ff465f633f9ce72001f9ffa970f5e455853e7bfaf4916d10\"" Mar 17 17:53:00.902868 containerd[1951]: time="2025-03-17T17:53:00.902550621Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.3\"" Mar 17 17:53:00.906480 containerd[1951]: time="2025-03-17T17:53:00.906407841Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5sxs8,Uid:4de372a2-0f36-4201-b1c6-c1cc1fbb5bf5,Namespace:calico-system,Attempt:0,} returns sandbox id \"8c28c10517df7e6ad9c36d4f7f578f8520e98314a9083d71fbbe5cf932395a6f\"" Mar 17 17:53:01.264650 kubelet[2450]: E0317 17:53:01.264568 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:02.255585 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3945852306.mount: Deactivated successfully. Mar 17 17:53:02.265038 kubelet[2450]: E0317 17:53:02.264994 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:02.437026 kubelet[2450]: E0317 17:53:02.436920 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s8xfx" podUID="8a712d71-7427-4200-94a3-b1d8ea1fe150" Mar 17 17:53:02.817126 containerd[1951]: time="2025-03-17T17:53:02.817058735Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:02.818544 containerd[1951]: time="2025-03-17T17:53:02.818475179Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.3: active requests=0, bytes read=27370095" Mar 17 17:53:02.820470 containerd[1951]: time="2025-03-17T17:53:02.820384631Z" level=info msg="ImageCreate event name:\"sha256:2a637602f3e88e76046aa1a75bccdb37b25b2fcba99a380412e2c27ccd55c547\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:02.824664 containerd[1951]: time="2025-03-17T17:53:02.824584211Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:5015269547a0b7dd2c062758e9a64467b58978ff2502cad4c3f5cdf4aa554ad3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:02.826185 containerd[1951]: time="2025-03-17T17:53:02.825938063Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.3\" with image id \"sha256:2a637602f3e88e76046aa1a75bccdb37b25b2fcba99a380412e2c27ccd55c547\", repo tag \"registry.k8s.io/kube-proxy:v1.32.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:5015269547a0b7dd2c062758e9a64467b58978ff2502cad4c3f5cdf4aa554ad3\", size \"27369114\" in 1.92332827s" Mar 17 17:53:02.826185 containerd[1951]: time="2025-03-17T17:53:02.825986555Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.3\" returns image reference \"sha256:2a637602f3e88e76046aa1a75bccdb37b25b2fcba99a380412e2c27ccd55c547\"" Mar 17 17:53:02.829038 containerd[1951]: 
time="2025-03-17T17:53:02.828982715Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\"" Mar 17 17:53:02.831301 containerd[1951]: time="2025-03-17T17:53:02.831106247Z" level=info msg="CreateContainer within sandbox \"b9caae3fe9315ab8ff465f633f9ce72001f9ffa970f5e455853e7bfaf4916d10\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 17 17:53:02.854280 containerd[1951]: time="2025-03-17T17:53:02.854227655Z" level=info msg="CreateContainer within sandbox \"b9caae3fe9315ab8ff465f633f9ce72001f9ffa970f5e455853e7bfaf4916d10\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"e05ffdf3ba874e6c11cbb9a552f058dbcf1a1aee2cb8a001c770a2c862a62fa7\"" Mar 17 17:53:02.855777 containerd[1951]: time="2025-03-17T17:53:02.855734543Z" level=info msg="StartContainer for \"e05ffdf3ba874e6c11cbb9a552f058dbcf1a1aee2cb8a001c770a2c862a62fa7\"" Mar 17 17:53:02.915015 systemd[1]: Started cri-containerd-e05ffdf3ba874e6c11cbb9a552f058dbcf1a1aee2cb8a001c770a2c862a62fa7.scope - libcontainer container e05ffdf3ba874e6c11cbb9a552f058dbcf1a1aee2cb8a001c770a2c862a62fa7. Mar 17 17:53:02.969958 containerd[1951]: time="2025-03-17T17:53:02.969860196Z" level=info msg="StartContainer for \"e05ffdf3ba874e6c11cbb9a552f058dbcf1a1aee2cb8a001c770a2c862a62fa7\" returns successfully" Mar 17 17:53:03.266562 kubelet[2450]: E0317 17:53:03.266499 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:03.487550 kubelet[2450]: I0317 17:53:03.487436 2450 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-kf4df" podStartSLOduration=4.5610176 podStartE2EDuration="6.487415818s" podCreationTimestamp="2025-03-17 17:52:57 +0000 UTC" firstStartedPulling="2025-03-17 17:53:00.901460997 +0000 UTC m=+5.462087104" lastFinishedPulling="2025-03-17 17:53:02.827859119 +0000 UTC m=+7.388485322" observedRunningTime="2025-03-17 17:53:03.483593794 +0000 UTC m=+8.044219925" watchObservedRunningTime="2025-03-17 17:53:03.487415818 +0000 UTC m=+8.048041949" Mar 17 17:53:03.516067 kubelet[2450]: E0317 17:53:03.516007 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.516350 kubelet[2450]: W0317 17:53:03.516150 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.516350 kubelet[2450]: E0317 17:53:03.516187 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.517541 kubelet[2450]: E0317 17:53:03.517072 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.517541 kubelet[2450]: W0317 17:53:03.517105 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.517541 kubelet[2450]: E0317 17:53:03.517179 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:53:03.518395 kubelet[2450]: E0317 17:53:03.518070 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.518395 kubelet[2450]: W0317 17:53:03.518101 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.518395 kubelet[2450]: E0317 17:53:03.518157 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.519596 kubelet[2450]: E0317 17:53:03.519172 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.519596 kubelet[2450]: W0317 17:53:03.519203 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.519596 kubelet[2450]: E0317 17:53:03.519234 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.520559 kubelet[2450]: E0317 17:53:03.520260 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.520559 kubelet[2450]: W0317 17:53:03.520293 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.520559 kubelet[2450]: E0317 17:53:03.520347 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.521519 kubelet[2450]: E0317 17:53:03.521224 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.521519 kubelet[2450]: W0317 17:53:03.521255 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.521519 kubelet[2450]: E0317 17:53:03.521282 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.522371 kubelet[2450]: E0317 17:53:03.522113 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.522371 kubelet[2450]: W0317 17:53:03.522142 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.522371 kubelet[2450]: E0317 17:53:03.522194 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:53:03.523102 kubelet[2450]: E0317 17:53:03.522846 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.523102 kubelet[2450]: W0317 17:53:03.522875 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.523102 kubelet[2450]: E0317 17:53:03.522905 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.523820 kubelet[2450]: E0317 17:53:03.523564 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.523820 kubelet[2450]: W0317 17:53:03.523590 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.523820 kubelet[2450]: E0317 17:53:03.523616 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.524473 kubelet[2450]: E0317 17:53:03.524266 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.524473 kubelet[2450]: W0317 17:53:03.524293 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.524473 kubelet[2450]: E0317 17:53:03.524318 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.524916 kubelet[2450]: E0317 17:53:03.524890 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.525144 kubelet[2450]: W0317 17:53:03.525026 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.525144 kubelet[2450]: E0317 17:53:03.525061 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.525819 kubelet[2450]: E0317 17:53:03.525607 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.525819 kubelet[2450]: W0317 17:53:03.525632 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.525819 kubelet[2450]: E0317 17:53:03.525657 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:53:03.526338 kubelet[2450]: E0317 17:53:03.526311 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.526587 kubelet[2450]: W0317 17:53:03.526467 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.526587 kubelet[2450]: E0317 17:53:03.526504 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.527213 kubelet[2450]: E0317 17:53:03.527014 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.527213 kubelet[2450]: W0317 17:53:03.527040 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.527213 kubelet[2450]: E0317 17:53:03.527065 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.527896 kubelet[2450]: E0317 17:53:03.527566 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.527896 kubelet[2450]: W0317 17:53:03.527588 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.527896 kubelet[2450]: E0317 17:53:03.527613 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.528665 kubelet[2450]: E0317 17:53:03.528419 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.528665 kubelet[2450]: W0317 17:53:03.528448 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.528665 kubelet[2450]: E0317 17:53:03.528476 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.529313 kubelet[2450]: E0317 17:53:03.529094 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.529313 kubelet[2450]: W0317 17:53:03.529120 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.529313 kubelet[2450]: E0317 17:53:03.529147 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:53:03.529859 kubelet[2450]: E0317 17:53:03.529668 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.529859 kubelet[2450]: W0317 17:53:03.529750 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.529859 kubelet[2450]: E0317 17:53:03.529775 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.530500 kubelet[2450]: E0317 17:53:03.530367 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.530500 kubelet[2450]: W0317 17:53:03.530394 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.530500 kubelet[2450]: E0317 17:53:03.530419 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.531328 kubelet[2450]: E0317 17:53:03.531041 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.531328 kubelet[2450]: W0317 17:53:03.531067 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.531328 kubelet[2450]: E0317 17:53:03.531091 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.531901 kubelet[2450]: E0317 17:53:03.531873 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.532218 kubelet[2450]: W0317 17:53:03.532036 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.532218 kubelet[2450]: E0317 17:53:03.532074 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.532904 kubelet[2450]: E0317 17:53:03.532670 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.532904 kubelet[2450]: W0317 17:53:03.532731 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.532904 kubelet[2450]: E0317 17:53:03.532771 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:53:03.533581 kubelet[2450]: E0317 17:53:03.533387 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.533581 kubelet[2450]: W0317 17:53:03.533414 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.533581 kubelet[2450]: E0317 17:53:03.533439 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.534052 kubelet[2450]: E0317 17:53:03.533975 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.534052 kubelet[2450]: W0317 17:53:03.534001 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.534416 kubelet[2450]: E0317 17:53:03.534271 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.534955 kubelet[2450]: E0317 17:53:03.534722 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.534955 kubelet[2450]: W0317 17:53:03.534749 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.534955 kubelet[2450]: E0317 17:53:03.534782 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.535409 kubelet[2450]: E0317 17:53:03.535311 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.535409 kubelet[2450]: W0317 17:53:03.535337 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.535409 kubelet[2450]: E0317 17:53:03.535374 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.535954 kubelet[2450]: E0317 17:53:03.535918 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.535954 kubelet[2450]: W0317 17:53:03.535952 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.536178 kubelet[2450]: E0317 17:53:03.535994 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:53:03.537325 kubelet[2450]: E0317 17:53:03.536800 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.537325 kubelet[2450]: W0317 17:53:03.536827 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.537325 kubelet[2450]: E0317 17:53:03.536861 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.537946 kubelet[2450]: E0317 17:53:03.537919 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.538097 kubelet[2450]: W0317 17:53:03.538070 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.538333 kubelet[2450]: E0317 17:53:03.538305 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.538994 kubelet[2450]: E0317 17:53:03.538822 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.538994 kubelet[2450]: W0317 17:53:03.538848 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.538994 kubelet[2450]: E0317 17:53:03.538875 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.539954 kubelet[2450]: E0317 17:53:03.539464 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.539954 kubelet[2450]: W0317 17:53:03.539489 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.539954 kubelet[2450]: E0317 17:53:03.539514 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:03.540927 kubelet[2450]: E0317 17:53:03.540813 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:03.540927 kubelet[2450]: W0317 17:53:03.540843 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:03.540927 kubelet[2450]: E0317 17:53:03.540874 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:53:04.266879 kubelet[2450]: E0317 17:53:04.266778 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:04.292401 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3300382517.mount: Deactivated successfully. Mar 17 17:53:04.422347 containerd[1951]: time="2025-03-17T17:53:04.422195843Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:04.423962 containerd[1951]: time="2025-03-17T17:53:04.423879383Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=6490047" Mar 17 17:53:04.425186 containerd[1951]: time="2025-03-17T17:53:04.425112191Z" level=info msg="ImageCreate event name:\"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:04.428785 containerd[1951]: time="2025-03-17T17:53:04.428705303Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:04.430310 containerd[1951]: time="2025-03-17T17:53:04.430248719Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6489869\" in 1.601205104s" Mar 17 17:53:04.431756 containerd[1951]: time="2025-03-17T17:53:04.430306835Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\"" Mar 17 17:53:04.434976 containerd[1951]: time="2025-03-17T17:53:04.434920103Z" level=info msg="CreateContainer within sandbox \"8c28c10517df7e6ad9c36d4f7f578f8520e98314a9083d71fbbe5cf932395a6f\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 17 17:53:04.437106 kubelet[2450]: E0317 17:53:04.436637 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s8xfx" podUID="8a712d71-7427-4200-94a3-b1d8ea1fe150" Mar 17 17:53:04.461253 containerd[1951]: time="2025-03-17T17:53:04.461176631Z" level=info msg="CreateContainer within sandbox \"8c28c10517df7e6ad9c36d4f7f578f8520e98314a9083d71fbbe5cf932395a6f\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"353603abcd0e0fd4abb0049df7d575eb2461b2d8f042a5b1a0841e12a82738a4\"" Mar 17 17:53:04.466413 containerd[1951]: time="2025-03-17T17:53:04.466210787Z" level=info msg="StartContainer for \"353603abcd0e0fd4abb0049df7d575eb2461b2d8f042a5b1a0841e12a82738a4\"" Mar 17 17:53:04.517252 systemd[1]: Started cri-containerd-353603abcd0e0fd4abb0049df7d575eb2461b2d8f042a5b1a0841e12a82738a4.scope - libcontainer container 353603abcd0e0fd4abb0049df7d575eb2461b2d8f042a5b1a0841e12a82738a4. 
Mar 17 17:53:04.538568 kubelet[2450]: E0317 17:53:04.538522 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.538568 kubelet[2450]: W0317 17:53:04.538562 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.538782 kubelet[2450]: E0317 17:53:04.538596 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.539622 kubelet[2450]: E0317 17:53:04.539430 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.539622 kubelet[2450]: W0317 17:53:04.539457 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.539622 kubelet[2450]: E0317 17:53:04.539484 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.540334 kubelet[2450]: E0317 17:53:04.540134 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.540334 kubelet[2450]: W0317 17:53:04.540158 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.540334 kubelet[2450]: E0317 17:53:04.540181 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.541299 kubelet[2450]: E0317 17:53:04.540853 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.541299 kubelet[2450]: W0317 17:53:04.541002 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.541299 kubelet[2450]: E0317 17:53:04.541033 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.542273 kubelet[2450]: E0317 17:53:04.542039 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.542273 kubelet[2450]: W0317 17:53:04.542101 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.542273 kubelet[2450]: E0317 17:53:04.542128 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:53:04.543029 kubelet[2450]: E0317 17:53:04.542899 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.543029 kubelet[2450]: W0317 17:53:04.542959 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.543029 kubelet[2450]: E0317 17:53:04.542984 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.543760 kubelet[2450]: E0317 17:53:04.543706 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.543977 kubelet[2450]: W0317 17:53:04.543865 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.543977 kubelet[2450]: E0317 17:53:04.543894 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.544842 kubelet[2450]: E0317 17:53:04.544600 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.544842 kubelet[2450]: W0317 17:53:04.544625 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.544842 kubelet[2450]: E0317 17:53:04.544648 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.545970 kubelet[2450]: E0317 17:53:04.545568 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.545970 kubelet[2450]: W0317 17:53:04.545619 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.545970 kubelet[2450]: E0317 17:53:04.545645 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.546390 kubelet[2450]: E0317 17:53:04.546366 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.546650 kubelet[2450]: W0317 17:53:04.546467 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.546650 kubelet[2450]: E0317 17:53:04.546499 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:53:04.547343 kubelet[2450]: E0317 17:53:04.547091 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.547343 kubelet[2450]: W0317 17:53:04.547117 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.547343 kubelet[2450]: E0317 17:53:04.547141 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.547772 kubelet[2450]: E0317 17:53:04.547638 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.550672 kubelet[2450]: W0317 17:53:04.550628 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.551474 kubelet[2450]: E0317 17:53:04.551112 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.552872 kubelet[2450]: E0317 17:53:04.552572 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.552872 kubelet[2450]: W0317 17:53:04.552606 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.552872 kubelet[2450]: E0317 17:53:04.552663 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.554504 kubelet[2450]: E0317 17:53:04.553904 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.554504 kubelet[2450]: W0317 17:53:04.553937 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.554504 kubelet[2450]: E0317 17:53:04.553963 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.555036 kubelet[2450]: E0317 17:53:04.554976 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.555382 kubelet[2450]: W0317 17:53:04.555004 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.555382 kubelet[2450]: E0317 17:53:04.555178 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:53:04.556219 kubelet[2450]: E0317 17:53:04.556177 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.556447 kubelet[2450]: W0317 17:53:04.556322 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.556447 kubelet[2450]: E0317 17:53:04.556356 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.558386 kubelet[2450]: E0317 17:53:04.557713 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.558386 kubelet[2450]: W0317 17:53:04.557747 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.558386 kubelet[2450]: E0317 17:53:04.557774 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.559506 kubelet[2450]: E0317 17:53:04.558968 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.559506 kubelet[2450]: W0317 17:53:04.559002 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.559506 kubelet[2450]: E0317 17:53:04.559033 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.561335 kubelet[2450]: E0317 17:53:04.560994 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.561335 kubelet[2450]: W0317 17:53:04.561026 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.561335 kubelet[2450]: E0317 17:53:04.561055 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.563117 kubelet[2450]: E0317 17:53:04.563068 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.563117 kubelet[2450]: W0317 17:53:04.563112 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.563246 kubelet[2450]: E0317 17:53:04.563145 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:53:04.564892 kubelet[2450]: E0317 17:53:04.564836 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.564892 kubelet[2450]: W0317 17:53:04.564885 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.565510 kubelet[2450]: E0317 17:53:04.564917 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.565588 kubelet[2450]: E0317 17:53:04.565551 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.565588 kubelet[2450]: W0317 17:53:04.565574 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.567495 kubelet[2450]: E0317 17:53:04.566749 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.567495 kubelet[2450]: E0317 17:53:04.567227 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.567495 kubelet[2450]: W0317 17:53:04.567249 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.567495 kubelet[2450]: E0317 17:53:04.567271 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.568272 kubelet[2450]: E0317 17:53:04.568227 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.568272 kubelet[2450]: W0317 17:53:04.568268 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.568430 kubelet[2450]: E0317 17:53:04.568299 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.569438 kubelet[2450]: E0317 17:53:04.569401 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.569438 kubelet[2450]: W0317 17:53:04.569435 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.570529 kubelet[2450]: E0317 17:53:04.569473 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:53:04.570633 kubelet[2450]: E0317 17:53:04.570577 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.570729 kubelet[2450]: W0317 17:53:04.570628 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.570862 kubelet[2450]: E0317 17:53:04.570826 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.571723 kubelet[2450]: E0317 17:53:04.571651 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.571847 kubelet[2450]: W0317 17:53:04.571739 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.571847 kubelet[2450]: E0317 17:53:04.571778 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.572255 kubelet[2450]: E0317 17:53:04.572218 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.572255 kubelet[2450]: W0317 17:53:04.572248 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.572389 kubelet[2450]: E0317 17:53:04.572323 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.573293 kubelet[2450]: E0317 17:53:04.572952 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.573293 kubelet[2450]: W0317 17:53:04.572985 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.573293 kubelet[2450]: E0317 17:53:04.573157 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.573522 kubelet[2450]: E0317 17:53:04.573336 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.573522 kubelet[2450]: W0317 17:53:04.573354 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.573522 kubelet[2450]: E0317 17:53:04.573380 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:53:04.574112 kubelet[2450]: E0317 17:53:04.573807 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.574112 kubelet[2450]: W0317 17:53:04.573836 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.574112 kubelet[2450]: E0317 17:53:04.573859 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.574600 kubelet[2450]: E0317 17:53:04.574461 2450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:53:04.574600 kubelet[2450]: W0317 17:53:04.574491 2450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:53:04.574600 kubelet[2450]: E0317 17:53:04.574516 2450 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:53:04.585942 containerd[1951]: time="2025-03-17T17:53:04.585754188Z" level=info msg="StartContainer for \"353603abcd0e0fd4abb0049df7d575eb2461b2d8f042a5b1a0841e12a82738a4\" returns successfully" Mar 17 17:53:04.606669 systemd[1]: cri-containerd-353603abcd0e0fd4abb0049df7d575eb2461b2d8f042a5b1a0841e12a82738a4.scope: Deactivated successfully. Mar 17 17:53:04.824914 containerd[1951]: time="2025-03-17T17:53:04.824511601Z" level=info msg="shim disconnected" id=353603abcd0e0fd4abb0049df7d575eb2461b2d8f042a5b1a0841e12a82738a4 namespace=k8s.io Mar 17 17:53:04.824914 containerd[1951]: time="2025-03-17T17:53:04.824589445Z" level=warning msg="cleaning up after shim disconnected" id=353603abcd0e0fd4abb0049df7d575eb2461b2d8f042a5b1a0841e12a82738a4 namespace=k8s.io Mar 17 17:53:04.824914 containerd[1951]: time="2025-03-17T17:53:04.824610577Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 17:53:05.249277 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-353603abcd0e0fd4abb0049df7d575eb2461b2d8f042a5b1a0841e12a82738a4-rootfs.mount: Deactivated successfully. 
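The repeated driver-call and plugins errors above come from the kubelet's FlexVolume probe: it executes /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the argument init and parses stdout as JSON, but the executable is missing, so the output is empty and unmarshalling fails with "unexpected end of JSON input". Below is a minimal sketch, assuming the standard FlexVolume driver contract (executable invoked with a verb, JSON status object on stdout); it is illustrative only and is not the actual nodeagent~uds driver.

    package main

    import (
        "fmt"
        "os"
    )

    // Minimal FlexVolume driver stub. The kubelet calls the executable with
    // "init" and expects a JSON status object on stdout; an empty response is
    // exactly what produces "unexpected end of JSON input" in the log above.
    func main() {
        if len(os.Args) > 1 && os.Args[1] == "init" {
            fmt.Println(`{"status": "Success", "capabilities": {"attach": false}}`)
            return
        }
        // Verbs this stub does not implement are reported as not supported.
        fmt.Println(`{"status": "Not supported"}`)
        os.Exit(1)
    }
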
Mar 17 17:53:05.267676 kubelet[2450]: E0317 17:53:05.267618 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:05.478819 containerd[1951]: time="2025-03-17T17:53:05.478599132Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 17 17:53:06.268653 kubelet[2450]: E0317 17:53:06.268586 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:06.437358 kubelet[2450]: E0317 17:53:06.437281 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s8xfx" podUID="8a712d71-7427-4200-94a3-b1d8ea1fe150" Mar 17 17:53:07.268898 kubelet[2450]: E0317 17:53:07.268847 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:07.418300 systemd[1]: systemd-hostnamed.service: Deactivated successfully. Mar 17 17:53:08.269858 kubelet[2450]: E0317 17:53:08.269795 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:08.438065 kubelet[2450]: E0317 17:53:08.437277 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-s8xfx" podUID="8a712d71-7427-4200-94a3-b1d8ea1fe150" Mar 17 17:53:08.974167 containerd[1951]: time="2025-03-17T17:53:08.974112762Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:08.975925 containerd[1951]: time="2025-03-17T17:53:08.975833370Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=91227396" Mar 17 17:53:08.976620 containerd[1951]: time="2025-03-17T17:53:08.976179282Z" level=info msg="ImageCreate event name:\"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:08.980423 containerd[1951]: time="2025-03-17T17:53:08.980344410Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:08.982174 containerd[1951]: time="2025-03-17T17:53:08.982005750Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"92597153\" in 3.50332791s" Mar 17 17:53:08.982174 containerd[1951]: time="2025-03-17T17:53:08.982059174Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\"" Mar 17 17:53:08.986278 containerd[1951]: time="2025-03-17T17:53:08.986026938Z" level=info msg="CreateContainer within sandbox \"8c28c10517df7e6ad9c36d4f7f578f8520e98314a9083d71fbbe5cf932395a6f\" for container 
&ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 17 17:53:09.017447 containerd[1951]: time="2025-03-17T17:53:09.017393990Z" level=info msg="CreateContainer within sandbox \"8c28c10517df7e6ad9c36d4f7f578f8520e98314a9083d71fbbe5cf932395a6f\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"e87fca531b3b434fa60a686c83d9b3a00c2359b9b6ee3bafcdc1689777645045\"" Mar 17 17:53:09.018744 containerd[1951]: time="2025-03-17T17:53:09.018597026Z" level=info msg="StartContainer for \"e87fca531b3b434fa60a686c83d9b3a00c2359b9b6ee3bafcdc1689777645045\"" Mar 17 17:53:09.076016 systemd[1]: Started cri-containerd-e87fca531b3b434fa60a686c83d9b3a00c2359b9b6ee3bafcdc1689777645045.scope - libcontainer container e87fca531b3b434fa60a686c83d9b3a00c2359b9b6ee3bafcdc1689777645045. Mar 17 17:53:09.136869 containerd[1951]: time="2025-03-17T17:53:09.136672526Z" level=info msg="StartContainer for \"e87fca531b3b434fa60a686c83d9b3a00c2359b9b6ee3bafcdc1689777645045\" returns successfully" Mar 17 17:53:09.270707 kubelet[2450]: E0317 17:53:09.270606 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:10.038140 containerd[1951]: time="2025-03-17T17:53:10.038065143Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 17 17:53:10.043454 systemd[1]: cri-containerd-e87fca531b3b434fa60a686c83d9b3a00c2359b9b6ee3bafcdc1689777645045.scope: Deactivated successfully. Mar 17 17:53:10.044049 systemd[1]: cri-containerd-e87fca531b3b434fa60a686c83d9b3a00c2359b9b6ee3bafcdc1689777645045.scope: Consumed 864ms CPU time, 172.6M memory peak, 150.3M written to disk. Mar 17 17:53:10.052721 kubelet[2450]: I0317 17:53:10.050740 2450 kubelet_node_status.go:502] "Fast updating node status as it just became ready" Mar 17 17:53:10.083768 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e87fca531b3b434fa60a686c83d9b3a00c2359b9b6ee3bafcdc1689777645045-rootfs.mount: Deactivated successfully. Mar 17 17:53:10.271293 kubelet[2450]: E0317 17:53:10.271231 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:10.446245 systemd[1]: Created slice kubepods-besteffort-pod8a712d71_7427_4200_94a3_b1d8ea1fe150.slice - libcontainer container kubepods-besteffort-pod8a712d71_7427_4200_94a3_b1d8ea1fe150.slice. Mar 17 17:53:10.450774 containerd[1951]: time="2025-03-17T17:53:10.450641129Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s8xfx,Uid:8a712d71-7427-4200-94a3-b1d8ea1fe150,Namespace:calico-system,Attempt:0,}" Mar 17 17:53:10.993454 systemd[1]: Created slice kubepods-besteffort-pod853570a2_8489_4178_8082_afac70eb226b.slice - libcontainer container kubepods-besteffort-pod853570a2_8489_4178_8082_afac70eb226b.slice. 
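The "cni config load failed: no network config found in /etc/cni/net.d" message above is expected at this stage: the install-cni container has only just written /etc/cni/net.d/calico-kubeconfig and no network config list is in place yet. For reference, a generic skeleton of the kind of conflist Calico's installer eventually drops into /etc/cni/net.d is shown below; the file name, cniVersion, and field values vary by installation and are assumptions here.

    {
      "name": "k8s-pod-network",
      "cniVersion": "0.3.1",
      "plugins": [
        {
          "type": "calico",
          "datastore_type": "kubernetes",
          "ipam": { "type": "calico-ipam" },
          "policy": { "type": "k8s" },
          "kubernetes": { "kubeconfig": "/etc/cni/net.d/calico-kubeconfig" }
        },
        {
          "type": "portmap",
          "snat": true,
          "capabilities": { "portMappings": true }
        }
      ]
    }
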
Mar 17 17:53:11.008795 kubelet[2450]: I0317 17:53:11.007908 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlf96\" (UniqueName: \"kubernetes.io/projected/853570a2-8489-4178-8082-afac70eb226b-kube-api-access-rlf96\") pod \"nginx-deployment-7fcdb87857-8qxmr\" (UID: \"853570a2-8489-4178-8082-afac70eb226b\") " pod="default/nginx-deployment-7fcdb87857-8qxmr" Mar 17 17:53:11.013494 containerd[1951]: time="2025-03-17T17:53:11.013294624Z" level=info msg="shim disconnected" id=e87fca531b3b434fa60a686c83d9b3a00c2359b9b6ee3bafcdc1689777645045 namespace=k8s.io Mar 17 17:53:11.013494 containerd[1951]: time="2025-03-17T17:53:11.013366192Z" level=warning msg="cleaning up after shim disconnected" id=e87fca531b3b434fa60a686c83d9b3a00c2359b9b6ee3bafcdc1689777645045 namespace=k8s.io Mar 17 17:53:11.013494 containerd[1951]: time="2025-03-17T17:53:11.013385356Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 17:53:11.111585 containerd[1951]: time="2025-03-17T17:53:11.111399856Z" level=error msg="Failed to destroy network for sandbox \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:11.112932 containerd[1951]: time="2025-03-17T17:53:11.112597408Z" level=error msg="encountered an error cleaning up failed sandbox \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:11.112932 containerd[1951]: time="2025-03-17T17:53:11.112739608Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s8xfx,Uid:8a712d71-7427-4200-94a3-b1d8ea1fe150,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:11.113938 kubelet[2450]: E0317 17:53:11.113871 2450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:11.114085 kubelet[2450]: E0317 17:53:11.113999 2450 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-s8xfx" Mar 17 17:53:11.114085 kubelet[2450]: E0317 17:53:11.114037 2450 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-s8xfx" Mar 17 17:53:11.114202 kubelet[2450]: E0317 17:53:11.114100 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-s8xfx_calico-system(8a712d71-7427-4200-94a3-b1d8ea1fe150)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-s8xfx_calico-system(8a712d71-7427-4200-94a3-b1d8ea1fe150)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-s8xfx" podUID="8a712d71-7427-4200-94a3-b1d8ea1fe150" Mar 17 17:53:11.115125 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae-shm.mount: Deactivated successfully. Mar 17 17:53:11.272067 kubelet[2450]: E0317 17:53:11.272011 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:11.298954 containerd[1951]: time="2025-03-17T17:53:11.298876565Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-8qxmr,Uid:853570a2-8489-4178-8082-afac70eb226b,Namespace:default,Attempt:0,}" Mar 17 17:53:11.402020 containerd[1951]: time="2025-03-17T17:53:11.401929410Z" level=error msg="Failed to destroy network for sandbox \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:11.404756 containerd[1951]: time="2025-03-17T17:53:11.402486330Z" level=error msg="encountered an error cleaning up failed sandbox \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:11.404756 containerd[1951]: time="2025-03-17T17:53:11.402573366Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-8qxmr,Uid:853570a2-8489-4178-8082-afac70eb226b,Namespace:default,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:11.404996 kubelet[2450]: E0317 17:53:11.402915 2450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:11.404996 
kubelet[2450]: E0317 17:53:11.402991 2450 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-8qxmr" Mar 17 17:53:11.404996 kubelet[2450]: E0317 17:53:11.403024 2450 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-8qxmr" Mar 17 17:53:11.405258 kubelet[2450]: E0317 17:53:11.403089 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-7fcdb87857-8qxmr_default(853570a2-8489-4178-8082-afac70eb226b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-7fcdb87857-8qxmr_default(853570a2-8489-4178-8082-afac70eb226b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-7fcdb87857-8qxmr" podUID="853570a2-8489-4178-8082-afac70eb226b" Mar 17 17:53:11.405664 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d-shm.mount: Deactivated successfully. 
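Every failed RunPodSandbox above reports the same root cause: the Calico CNI plugin stats /var/lib/calico/nodename, a file the calico/node container writes when it starts, and that container is not running yet, so sandbox setup for csi-node-driver-s8xfx and nginx-deployment-7fcdb87857-8qxmr fails and the kubelet tears the sandboxes down and retries (the increasing Attempt counts that follow). A minimal Go sketch of the same check is shown below; it is illustrative only and does not reproduce the plugin's code.

    package main

    import (
        "fmt"
        "os"
        "strings"
    )

    // Illustrative check mirroring the stat reported in the log: until
    // calico/node populates /var/lib/calico/nodename, pod networking cannot
    // be set up and sandbox creation keeps failing.
    func main() {
        data, err := os.ReadFile("/var/lib/calico/nodename")
        if err != nil {
            fmt.Fprintln(os.Stderr, "calico/node has not written the nodename file yet:", err)
            os.Exit(1)
        }
        fmt.Println("calico nodename:", strings.TrimSpace(string(data)))
    }
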
Mar 17 17:53:11.500917 kubelet[2450]: I0317 17:53:11.500881 2450 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d" Mar 17 17:53:11.502981 containerd[1951]: time="2025-03-17T17:53:11.502351482Z" level=info msg="StopPodSandbox for \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\"" Mar 17 17:53:11.502981 containerd[1951]: time="2025-03-17T17:53:11.502666614Z" level=info msg="Ensure that sandbox e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d in task-service has been cleanup successfully" Mar 17 17:53:11.503462 containerd[1951]: time="2025-03-17T17:53:11.503269962Z" level=info msg="TearDown network for sandbox \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\" successfully" Mar 17 17:53:11.503462 containerd[1951]: time="2025-03-17T17:53:11.503307210Z" level=info msg="StopPodSandbox for \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\" returns successfully" Mar 17 17:53:11.504742 containerd[1951]: time="2025-03-17T17:53:11.504426738Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-8qxmr,Uid:853570a2-8489-4178-8082-afac70eb226b,Namespace:default,Attempt:1,}" Mar 17 17:53:11.505892 kubelet[2450]: I0317 17:53:11.505853 2450 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae" Mar 17 17:53:11.507033 containerd[1951]: time="2025-03-17T17:53:11.506679150Z" level=info msg="StopPodSandbox for \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\"" Mar 17 17:53:11.507033 containerd[1951]: time="2025-03-17T17:53:11.506998866Z" level=info msg="Ensure that sandbox 0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae in task-service has been cleanup successfully" Mar 17 17:53:11.508372 systemd[1]: run-netns-cni\x2d70c35ad6\x2d74b6\x2ddc3a\x2d0f1d\x2d324394e71dd3.mount: Deactivated successfully. 
Mar 17 17:53:11.509854 containerd[1951]: time="2025-03-17T17:53:11.508774314Z" level=info msg="TearDown network for sandbox \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\" successfully" Mar 17 17:53:11.509854 containerd[1951]: time="2025-03-17T17:53:11.508814082Z" level=info msg="StopPodSandbox for \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\" returns successfully" Mar 17 17:53:11.510992 containerd[1951]: time="2025-03-17T17:53:11.510929754Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s8xfx,Uid:8a712d71-7427-4200-94a3-b1d8ea1fe150,Namespace:calico-system,Attempt:1,}" Mar 17 17:53:11.519356 containerd[1951]: time="2025-03-17T17:53:11.519280638Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 17 17:53:11.648341 containerd[1951]: time="2025-03-17T17:53:11.647936467Z" level=error msg="Failed to destroy network for sandbox \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:11.648898 containerd[1951]: time="2025-03-17T17:53:11.648524947Z" level=error msg="encountered an error cleaning up failed sandbox \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:11.648898 containerd[1951]: time="2025-03-17T17:53:11.648626671Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s8xfx,Uid:8a712d71-7427-4200-94a3-b1d8ea1fe150,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:11.649933 kubelet[2450]: E0317 17:53:11.649127 2450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:11.649933 kubelet[2450]: E0317 17:53:11.649222 2450 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-s8xfx" Mar 17 17:53:11.649933 kubelet[2450]: E0317 17:53:11.649257 2450 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/csi-node-driver-s8xfx" Mar 17 17:53:11.650225 kubelet[2450]: E0317 17:53:11.649319 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-s8xfx_calico-system(8a712d71-7427-4200-94a3-b1d8ea1fe150)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-s8xfx_calico-system(8a712d71-7427-4200-94a3-b1d8ea1fe150)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-s8xfx" podUID="8a712d71-7427-4200-94a3-b1d8ea1fe150" Mar 17 17:53:11.664049 containerd[1951]: time="2025-03-17T17:53:11.663974179Z" level=error msg="Failed to destroy network for sandbox \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:11.664757 containerd[1951]: time="2025-03-17T17:53:11.664708711Z" level=error msg="encountered an error cleaning up failed sandbox \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:11.664889 containerd[1951]: time="2025-03-17T17:53:11.664823599Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-8qxmr,Uid:853570a2-8489-4178-8082-afac70eb226b,Namespace:default,Attempt:1,} failed, error" error="failed to setup network for sandbox \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:11.665246 kubelet[2450]: E0317 17:53:11.665181 2450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:11.665343 kubelet[2450]: E0317 17:53:11.665264 2450 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-8qxmr" Mar 17 17:53:11.665343 kubelet[2450]: E0317 17:53:11.665304 2450 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-8qxmr" Mar 17 17:53:11.665469 kubelet[2450]: E0317 17:53:11.665370 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-7fcdb87857-8qxmr_default(853570a2-8489-4178-8082-afac70eb226b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-7fcdb87857-8qxmr_default(853570a2-8489-4178-8082-afac70eb226b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-7fcdb87857-8qxmr" podUID="853570a2-8489-4178-8082-afac70eb226b" Mar 17 17:53:12.116269 systemd[1]: run-netns-cni\x2d38876df5\x2d4598\x2d222b\x2d28d6\x2dd46a7f91937e.mount: Deactivated successfully. Mar 17 17:53:12.273128 kubelet[2450]: E0317 17:53:12.273064 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:12.520109 kubelet[2450]: I0317 17:53:12.519132 2450 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106" Mar 17 17:53:12.520332 containerd[1951]: time="2025-03-17T17:53:12.520174687Z" level=info msg="StopPodSandbox for \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\"" Mar 17 17:53:12.522357 containerd[1951]: time="2025-03-17T17:53:12.520441183Z" level=info msg="Ensure that sandbox 93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106 in task-service has been cleanup successfully" Mar 17 17:53:12.522357 containerd[1951]: time="2025-03-17T17:53:12.521005027Z" level=info msg="TearDown network for sandbox \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\" successfully" Mar 17 17:53:12.522357 containerd[1951]: time="2025-03-17T17:53:12.521035327Z" level=info msg="StopPodSandbox for \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\" returns successfully" Mar 17 17:53:12.522357 containerd[1951]: time="2025-03-17T17:53:12.522292111Z" level=info msg="StopPodSandbox for \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\"" Mar 17 17:53:12.522570 containerd[1951]: time="2025-03-17T17:53:12.522489235Z" level=info msg="TearDown network for sandbox \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\" successfully" Mar 17 17:53:12.522570 containerd[1951]: time="2025-03-17T17:53:12.522535507Z" level=info msg="StopPodSandbox for \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\" returns successfully" Mar 17 17:53:12.524729 containerd[1951]: time="2025-03-17T17:53:12.523780135Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-8qxmr,Uid:853570a2-8489-4178-8082-afac70eb226b,Namespace:default,Attempt:2,}" Mar 17 17:53:12.525038 kubelet[2450]: I0317 17:53:12.524990 2450 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1" Mar 17 17:53:12.525911 systemd[1]: run-netns-cni\x2d819bf9d8\x2d1559\x2d6613\x2df5d5\x2dbc9f84e4a6ae.mount: Deactivated successfully. 
Mar 17 17:53:12.527795 containerd[1951]: time="2025-03-17T17:53:12.526792075Z" level=info msg="StopPodSandbox for \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\"" Mar 17 17:53:12.527795 containerd[1951]: time="2025-03-17T17:53:12.527103895Z" level=info msg="Ensure that sandbox 669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1 in task-service has been cleanup successfully" Mar 17 17:53:12.530908 systemd[1]: run-netns-cni\x2d51786a9a\x2dc982\x2d763e\x2d45fb\x2db7b95c03f100.mount: Deactivated successfully. Mar 17 17:53:12.532904 containerd[1951]: time="2025-03-17T17:53:12.532744675Z" level=info msg="TearDown network for sandbox \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\" successfully" Mar 17 17:53:12.532904 containerd[1951]: time="2025-03-17T17:53:12.532789807Z" level=info msg="StopPodSandbox for \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\" returns successfully" Mar 17 17:53:12.534208 containerd[1951]: time="2025-03-17T17:53:12.534146335Z" level=info msg="StopPodSandbox for \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\"" Mar 17 17:53:12.534423 containerd[1951]: time="2025-03-17T17:53:12.534321979Z" level=info msg="TearDown network for sandbox \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\" successfully" Mar 17 17:53:12.534423 containerd[1951]: time="2025-03-17T17:53:12.534357127Z" level=info msg="StopPodSandbox for \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\" returns successfully" Mar 17 17:53:12.535222 containerd[1951]: time="2025-03-17T17:53:12.535178563Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s8xfx,Uid:8a712d71-7427-4200-94a3-b1d8ea1fe150,Namespace:calico-system,Attempt:2,}" Mar 17 17:53:12.670028 containerd[1951]: time="2025-03-17T17:53:12.669163760Z" level=error msg="Failed to destroy network for sandbox \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:12.670028 containerd[1951]: time="2025-03-17T17:53:12.669763808Z" level=error msg="encountered an error cleaning up failed sandbox \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:12.670028 containerd[1951]: time="2025-03-17T17:53:12.669870908Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-8qxmr,Uid:853570a2-8489-4178-8082-afac70eb226b,Namespace:default,Attempt:2,} failed, error" error="failed to setup network for sandbox \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:12.670568 kubelet[2450]: E0317 17:53:12.670225 2450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:12.670568 kubelet[2450]: E0317 17:53:12.670302 2450 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-8qxmr" Mar 17 17:53:12.670568 kubelet[2450]: E0317 17:53:12.670335 2450 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-8qxmr" Mar 17 17:53:12.670925 kubelet[2450]: E0317 17:53:12.670410 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-7fcdb87857-8qxmr_default(853570a2-8489-4178-8082-afac70eb226b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-7fcdb87857-8qxmr_default(853570a2-8489-4178-8082-afac70eb226b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-7fcdb87857-8qxmr" podUID="853570a2-8489-4178-8082-afac70eb226b" Mar 17 17:53:12.684486 containerd[1951]: time="2025-03-17T17:53:12.684387584Z" level=error msg="Failed to destroy network for sandbox \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:12.685217 containerd[1951]: time="2025-03-17T17:53:12.685134500Z" level=error msg="encountered an error cleaning up failed sandbox \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:12.685319 containerd[1951]: time="2025-03-17T17:53:12.685278596Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s8xfx,Uid:8a712d71-7427-4200-94a3-b1d8ea1fe150,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:12.685705 kubelet[2450]: E0317 17:53:12.685614 2450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:12.685857 kubelet[2450]: E0317 17:53:12.685742 2450 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-s8xfx" Mar 17 17:53:12.685857 kubelet[2450]: E0317 17:53:12.685775 2450 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-s8xfx" Mar 17 17:53:12.686146 kubelet[2450]: E0317 17:53:12.685847 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-s8xfx_calico-system(8a712d71-7427-4200-94a3-b1d8ea1fe150)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-s8xfx_calico-system(8a712d71-7427-4200-94a3-b1d8ea1fe150)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-s8xfx" podUID="8a712d71-7427-4200-94a3-b1d8ea1fe150" Mar 17 17:53:13.119120 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550-shm.mount: Deactivated successfully. Mar 17 17:53:13.273481 kubelet[2450]: E0317 17:53:13.273431 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:13.531090 kubelet[2450]: I0317 17:53:13.530131 2450 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550" Mar 17 17:53:13.531259 containerd[1951]: time="2025-03-17T17:53:13.531219920Z" level=info msg="StopPodSandbox for \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\"" Mar 17 17:53:13.531818 containerd[1951]: time="2025-03-17T17:53:13.531609200Z" level=info msg="Ensure that sandbox 97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550 in task-service has been cleanup successfully" Mar 17 17:53:13.535310 systemd[1]: run-netns-cni\x2d4195869c\x2d6a78\x2d9539\x2d7268\x2d92bf1a5ac77b.mount: Deactivated successfully. 
Mar 17 17:53:13.536643 containerd[1951]: time="2025-03-17T17:53:13.536579000Z" level=info msg="TearDown network for sandbox \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\" successfully" Mar 17 17:53:13.536643 containerd[1951]: time="2025-03-17T17:53:13.536629304Z" level=info msg="StopPodSandbox for \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\" returns successfully" Mar 17 17:53:13.539485 containerd[1951]: time="2025-03-17T17:53:13.539260520Z" level=info msg="StopPodSandbox for \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\"" Mar 17 17:53:13.540039 containerd[1951]: time="2025-03-17T17:53:13.539846612Z" level=info msg="TearDown network for sandbox \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\" successfully" Mar 17 17:53:13.540039 containerd[1951]: time="2025-03-17T17:53:13.540025196Z" level=info msg="StopPodSandbox for \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\" returns successfully" Mar 17 17:53:13.541419 kubelet[2450]: I0317 17:53:13.540091 2450 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9" Mar 17 17:53:13.542123 containerd[1951]: time="2025-03-17T17:53:13.542052032Z" level=info msg="StopPodSandbox for \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\"" Mar 17 17:53:13.542776 containerd[1951]: time="2025-03-17T17:53:13.542503868Z" level=info msg="StopPodSandbox for \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\"" Mar 17 17:53:13.542776 containerd[1951]: time="2025-03-17T17:53:13.542642744Z" level=info msg="TearDown network for sandbox \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\" successfully" Mar 17 17:53:13.542776 containerd[1951]: time="2025-03-17T17:53:13.542664596Z" level=info msg="StopPodSandbox for \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\" returns successfully" Mar 17 17:53:13.542985 containerd[1951]: time="2025-03-17T17:53:13.542925884Z" level=info msg="Ensure that sandbox c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9 in task-service has been cleanup successfully" Mar 17 17:53:13.544044 containerd[1951]: time="2025-03-17T17:53:13.543989804Z" level=info msg="TearDown network for sandbox \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\" successfully" Mar 17 17:53:13.546299 containerd[1951]: time="2025-03-17T17:53:13.544032500Z" level=info msg="StopPodSandbox for \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\" returns successfully" Mar 17 17:53:13.546299 containerd[1951]: time="2025-03-17T17:53:13.544299008Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-8qxmr,Uid:853570a2-8489-4178-8082-afac70eb226b,Namespace:default,Attempt:3,}" Mar 17 17:53:13.546299 containerd[1951]: time="2025-03-17T17:53:13.545996192Z" level=info msg="StopPodSandbox for \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\"" Mar 17 17:53:13.546299 containerd[1951]: time="2025-03-17T17:53:13.546153008Z" level=info msg="TearDown network for sandbox \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\" successfully" Mar 17 17:53:13.546299 containerd[1951]: time="2025-03-17T17:53:13.546175064Z" level=info msg="StopPodSandbox for \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\" returns successfully" Mar 17 17:53:13.548124 systemd[1]: 
run-netns-cni\x2d80a6a70d\x2d5c00\x2d71d4\x2da853\x2d70e045c7d1f7.mount: Deactivated successfully. Mar 17 17:53:13.550729 containerd[1951]: time="2025-03-17T17:53:13.548607212Z" level=info msg="StopPodSandbox for \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\"" Mar 17 17:53:13.552232 containerd[1951]: time="2025-03-17T17:53:13.551643248Z" level=info msg="TearDown network for sandbox \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\" successfully" Mar 17 17:53:13.552370 containerd[1951]: time="2025-03-17T17:53:13.552228824Z" level=info msg="StopPodSandbox for \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\" returns successfully" Mar 17 17:53:13.553625 containerd[1951]: time="2025-03-17T17:53:13.553568948Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s8xfx,Uid:8a712d71-7427-4200-94a3-b1d8ea1fe150,Namespace:calico-system,Attempt:3,}" Mar 17 17:53:13.792612 containerd[1951]: time="2025-03-17T17:53:13.792318285Z" level=error msg="Failed to destroy network for sandbox \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:13.794491 containerd[1951]: time="2025-03-17T17:53:13.794313825Z" level=error msg="encountered an error cleaning up failed sandbox \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:13.794989 containerd[1951]: time="2025-03-17T17:53:13.794822769Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-8qxmr,Uid:853570a2-8489-4178-8082-afac70eb226b,Namespace:default,Attempt:3,} failed, error" error="failed to setup network for sandbox \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:13.795530 kubelet[2450]: E0317 17:53:13.795133 2450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:13.795530 kubelet[2450]: E0317 17:53:13.795212 2450 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-8qxmr" Mar 17 17:53:13.795530 kubelet[2450]: E0317 17:53:13.795246 2450 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-8qxmr" Mar 17 17:53:13.796142 kubelet[2450]: E0317 17:53:13.795305 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-7fcdb87857-8qxmr_default(853570a2-8489-4178-8082-afac70eb226b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-7fcdb87857-8qxmr_default(853570a2-8489-4178-8082-afac70eb226b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-7fcdb87857-8qxmr" podUID="853570a2-8489-4178-8082-afac70eb226b" Mar 17 17:53:13.832919 containerd[1951]: time="2025-03-17T17:53:13.832730398Z" level=error msg="Failed to destroy network for sandbox \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:13.834407 containerd[1951]: time="2025-03-17T17:53:13.834064930Z" level=error msg="encountered an error cleaning up failed sandbox \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:13.834562 containerd[1951]: time="2025-03-17T17:53:13.834489358Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s8xfx,Uid:8a712d71-7427-4200-94a3-b1d8ea1fe150,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:13.835279 kubelet[2450]: E0317 17:53:13.834850 2450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:13.835279 kubelet[2450]: E0317 17:53:13.834926 2450 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-s8xfx" Mar 17 17:53:13.835279 kubelet[2450]: E0317 17:53:13.834973 2450 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-s8xfx" Mar 17 17:53:13.835959 kubelet[2450]: E0317 17:53:13.835045 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-s8xfx_calico-system(8a712d71-7427-4200-94a3-b1d8ea1fe150)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-s8xfx_calico-system(8a712d71-7427-4200-94a3-b1d8ea1fe150)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-s8xfx" podUID="8a712d71-7427-4200-94a3-b1d8ea1fe150" Mar 17 17:53:14.117735 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c-shm.mount: Deactivated successfully. Mar 17 17:53:14.274364 kubelet[2450]: E0317 17:53:14.273985 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:14.549187 kubelet[2450]: I0317 17:53:14.548258 2450 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c" Mar 17 17:53:14.550810 containerd[1951]: time="2025-03-17T17:53:14.549772425Z" level=info msg="StopPodSandbox for \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\"" Mar 17 17:53:14.551329 containerd[1951]: time="2025-03-17T17:53:14.551123493Z" level=info msg="Ensure that sandbox ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c in task-service has been cleanup successfully" Mar 17 17:53:14.555111 systemd[1]: run-netns-cni\x2d53601782\x2d611d\x2d6400\x2dcf18\x2d738faff4d18b.mount: Deactivated successfully. 
Mar 17 17:53:14.557907 containerd[1951]: time="2025-03-17T17:53:14.557065413Z" level=info msg="TearDown network for sandbox \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\" successfully" Mar 17 17:53:14.557907 containerd[1951]: time="2025-03-17T17:53:14.557133669Z" level=info msg="StopPodSandbox for \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\" returns successfully" Mar 17 17:53:14.570078 containerd[1951]: time="2025-03-17T17:53:14.566809989Z" level=info msg="StopPodSandbox for \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\"" Mar 17 17:53:14.570078 containerd[1951]: time="2025-03-17T17:53:14.567007569Z" level=info msg="TearDown network for sandbox \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\" successfully" Mar 17 17:53:14.570078 containerd[1951]: time="2025-03-17T17:53:14.567033813Z" level=info msg="StopPodSandbox for \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\" returns successfully" Mar 17 17:53:14.572726 containerd[1951]: time="2025-03-17T17:53:14.572611929Z" level=info msg="StopPodSandbox for \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\"" Mar 17 17:53:14.572925 containerd[1951]: time="2025-03-17T17:53:14.572849697Z" level=info msg="TearDown network for sandbox \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\" successfully" Mar 17 17:53:14.574132 containerd[1951]: time="2025-03-17T17:53:14.572902353Z" level=info msg="StopPodSandbox for \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\" returns successfully" Mar 17 17:53:14.576555 containerd[1951]: time="2025-03-17T17:53:14.576073989Z" level=info msg="StopPodSandbox for \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\"" Mar 17 17:53:14.576555 containerd[1951]: time="2025-03-17T17:53:14.576248673Z" level=info msg="TearDown network for sandbox \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\" successfully" Mar 17 17:53:14.576555 containerd[1951]: time="2025-03-17T17:53:14.576273813Z" level=info msg="StopPodSandbox for \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\" returns successfully" Mar 17 17:53:14.579014 containerd[1951]: time="2025-03-17T17:53:14.578951037Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-8qxmr,Uid:853570a2-8489-4178-8082-afac70eb226b,Namespace:default,Attempt:4,}" Mar 17 17:53:14.583385 kubelet[2450]: I0317 17:53:14.583338 2450 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9" Mar 17 17:53:14.585858 containerd[1951]: time="2025-03-17T17:53:14.585211161Z" level=info msg="StopPodSandbox for \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\"" Mar 17 17:53:14.585858 containerd[1951]: time="2025-03-17T17:53:14.585525849Z" level=info msg="Ensure that sandbox 7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9 in task-service has been cleanup successfully" Mar 17 17:53:14.587170 containerd[1951]: time="2025-03-17T17:53:14.586328781Z" level=info msg="TearDown network for sandbox \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\" successfully" Mar 17 17:53:14.587569 containerd[1951]: time="2025-03-17T17:53:14.587526909Z" level=info msg="StopPodSandbox for \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\" returns successfully" Mar 17 17:53:14.590380 containerd[1951]: 
time="2025-03-17T17:53:14.590329761Z" level=info msg="StopPodSandbox for \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\"" Mar 17 17:53:14.590665 containerd[1951]: time="2025-03-17T17:53:14.590636193Z" level=info msg="TearDown network for sandbox \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\" successfully" Mar 17 17:53:14.590844 containerd[1951]: time="2025-03-17T17:53:14.590816889Z" level=info msg="StopPodSandbox for \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\" returns successfully" Mar 17 17:53:14.592109 containerd[1951]: time="2025-03-17T17:53:14.592065657Z" level=info msg="StopPodSandbox for \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\"" Mar 17 17:53:14.592175 systemd[1]: run-netns-cni\x2d1f70841b\x2d3175\x2d143e\x2d01d7\x2d40f3b3337293.mount: Deactivated successfully. Mar 17 17:53:14.592609 containerd[1951]: time="2025-03-17T17:53:14.592483545Z" level=info msg="TearDown network for sandbox \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\" successfully" Mar 17 17:53:14.593328 containerd[1951]: time="2025-03-17T17:53:14.592743813Z" level=info msg="StopPodSandbox for \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\" returns successfully" Mar 17 17:53:14.594634 containerd[1951]: time="2025-03-17T17:53:14.594586773Z" level=info msg="StopPodSandbox for \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\"" Mar 17 17:53:14.595377 containerd[1951]: time="2025-03-17T17:53:14.594957441Z" level=info msg="TearDown network for sandbox \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\" successfully" Mar 17 17:53:14.595377 containerd[1951]: time="2025-03-17T17:53:14.594985797Z" level=info msg="StopPodSandbox for \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\" returns successfully" Mar 17 17:53:14.597616 containerd[1951]: time="2025-03-17T17:53:14.597554313Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s8xfx,Uid:8a712d71-7427-4200-94a3-b1d8ea1fe150,Namespace:calico-system,Attempt:4,}" Mar 17 17:53:14.807912 containerd[1951]: time="2025-03-17T17:53:14.807564419Z" level=error msg="Failed to destroy network for sandbox \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:14.808715 containerd[1951]: time="2025-03-17T17:53:14.808439891Z" level=error msg="encountered an error cleaning up failed sandbox \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:14.808715 containerd[1951]: time="2025-03-17T17:53:14.808537811Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-8qxmr,Uid:853570a2-8489-4178-8082-afac70eb226b,Namespace:default,Attempt:4,} failed, error" error="failed to setup network for sandbox \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:14.808957 kubelet[2450]: E0317 
17:53:14.808850 2450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:14.808957 kubelet[2450]: E0317 17:53:14.808926 2450 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-8qxmr" Mar 17 17:53:14.809084 kubelet[2450]: E0317 17:53:14.808958 2450 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-8qxmr" Mar 17 17:53:14.809084 kubelet[2450]: E0317 17:53:14.809019 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-7fcdb87857-8qxmr_default(853570a2-8489-4178-8082-afac70eb226b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-7fcdb87857-8qxmr_default(853570a2-8489-4178-8082-afac70eb226b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-7fcdb87857-8qxmr" podUID="853570a2-8489-4178-8082-afac70eb226b" Mar 17 17:53:14.819140 containerd[1951]: time="2025-03-17T17:53:14.818597975Z" level=error msg="Failed to destroy network for sandbox \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:14.820101 containerd[1951]: time="2025-03-17T17:53:14.819929255Z" level=error msg="encountered an error cleaning up failed sandbox \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:14.820101 containerd[1951]: time="2025-03-17T17:53:14.820031387Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s8xfx,Uid:8a712d71-7427-4200-94a3-b1d8ea1fe150,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Mar 17 17:53:14.820983 kubelet[2450]: E0317 17:53:14.820924 2450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:14.821195 kubelet[2450]: E0317 17:53:14.821007 2450 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-s8xfx" Mar 17 17:53:14.821195 kubelet[2450]: E0317 17:53:14.821042 2450 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-s8xfx" Mar 17 17:53:14.821195 kubelet[2450]: E0317 17:53:14.821116 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-s8xfx_calico-system(8a712d71-7427-4200-94a3-b1d8ea1fe150)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-s8xfx_calico-system(8a712d71-7427-4200-94a3-b1d8ea1fe150)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-s8xfx" podUID="8a712d71-7427-4200-94a3-b1d8ea1fe150" Mar 17 17:53:15.117197 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019-shm.mount: Deactivated successfully. 
Mar 17 17:53:15.274296 kubelet[2450]: E0317 17:53:15.274133 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:15.595341 kubelet[2450]: I0317 17:53:15.594181 2450 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019" Mar 17 17:53:15.598842 containerd[1951]: time="2025-03-17T17:53:15.595298014Z" level=info msg="StopPodSandbox for \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\"" Mar 17 17:53:15.598842 containerd[1951]: time="2025-03-17T17:53:15.595567078Z" level=info msg="Ensure that sandbox be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019 in task-service has been cleanup successfully" Mar 17 17:53:15.598842 containerd[1951]: time="2025-03-17T17:53:15.596117806Z" level=info msg="TearDown network for sandbox \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\" successfully" Mar 17 17:53:15.598842 containerd[1951]: time="2025-03-17T17:53:15.596146282Z" level=info msg="StopPodSandbox for \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\" returns successfully" Mar 17 17:53:15.599205 systemd[1]: run-netns-cni\x2dff0f9569\x2dbc6d\x2d184f\x2de5b1\x2de291b56c8ccc.mount: Deactivated successfully. Mar 17 17:53:15.602253 containerd[1951]: time="2025-03-17T17:53:15.601763350Z" level=info msg="StopPodSandbox for \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\"" Mar 17 17:53:15.602253 containerd[1951]: time="2025-03-17T17:53:15.601941994Z" level=info msg="TearDown network for sandbox \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\" successfully" Mar 17 17:53:15.602253 containerd[1951]: time="2025-03-17T17:53:15.601983802Z" level=info msg="StopPodSandbox for \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\" returns successfully" Mar 17 17:53:15.604327 containerd[1951]: time="2025-03-17T17:53:15.603783610Z" level=info msg="StopPodSandbox for \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\"" Mar 17 17:53:15.604327 containerd[1951]: time="2025-03-17T17:53:15.603956806Z" level=info msg="TearDown network for sandbox \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\" successfully" Mar 17 17:53:15.604327 containerd[1951]: time="2025-03-17T17:53:15.603981814Z" level=info msg="StopPodSandbox for \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\" returns successfully" Mar 17 17:53:15.605067 containerd[1951]: time="2025-03-17T17:53:15.604552450Z" level=info msg="StopPodSandbox for \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\"" Mar 17 17:53:15.605067 containerd[1951]: time="2025-03-17T17:53:15.604776790Z" level=info msg="TearDown network for sandbox \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\" successfully" Mar 17 17:53:15.605067 containerd[1951]: time="2025-03-17T17:53:15.604802146Z" level=info msg="StopPodSandbox for \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\" returns successfully" Mar 17 17:53:15.605539 containerd[1951]: time="2025-03-17T17:53:15.605297602Z" level=info msg="StopPodSandbox for \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\"" Mar 17 17:53:15.605539 containerd[1951]: time="2025-03-17T17:53:15.605462086Z" level=info msg="TearDown network for sandbox \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\" successfully" Mar 17 17:53:15.605539 
containerd[1951]: time="2025-03-17T17:53:15.605487610Z" level=info msg="StopPodSandbox for \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\" returns successfully" Mar 17 17:53:15.606584 containerd[1951]: time="2025-03-17T17:53:15.606452794Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-8qxmr,Uid:853570a2-8489-4178-8082-afac70eb226b,Namespace:default,Attempt:5,}" Mar 17 17:53:15.609648 kubelet[2450]: I0317 17:53:15.609585 2450 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83" Mar 17 17:53:15.611743 containerd[1951]: time="2025-03-17T17:53:15.611516759Z" level=info msg="StopPodSandbox for \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\"" Mar 17 17:53:15.613110 containerd[1951]: time="2025-03-17T17:53:15.613045355Z" level=info msg="Ensure that sandbox 062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83 in task-service has been cleanup successfully" Mar 17 17:53:15.615851 containerd[1951]: time="2025-03-17T17:53:15.613883279Z" level=info msg="TearDown network for sandbox \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\" successfully" Mar 17 17:53:15.615851 containerd[1951]: time="2025-03-17T17:53:15.613931855Z" level=info msg="StopPodSandbox for \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\" returns successfully" Mar 17 17:53:15.619236 containerd[1951]: time="2025-03-17T17:53:15.617329379Z" level=info msg="StopPodSandbox for \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\"" Mar 17 17:53:15.619236 containerd[1951]: time="2025-03-17T17:53:15.617506319Z" level=info msg="TearDown network for sandbox \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\" successfully" Mar 17 17:53:15.619236 containerd[1951]: time="2025-03-17T17:53:15.617529419Z" level=info msg="StopPodSandbox for \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\" returns successfully" Mar 17 17:53:15.619236 containerd[1951]: time="2025-03-17T17:53:15.618376007Z" level=info msg="StopPodSandbox for \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\"" Mar 17 17:53:15.619236 containerd[1951]: time="2025-03-17T17:53:15.618524075Z" level=info msg="TearDown network for sandbox \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\" successfully" Mar 17 17:53:15.619236 containerd[1951]: time="2025-03-17T17:53:15.618546863Z" level=info msg="StopPodSandbox for \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\" returns successfully" Mar 17 17:53:15.619236 containerd[1951]: time="2025-03-17T17:53:15.619086539Z" level=info msg="StopPodSandbox for \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\"" Mar 17 17:53:15.619236 containerd[1951]: time="2025-03-17T17:53:15.619239203Z" level=info msg="TearDown network for sandbox \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\" successfully" Mar 17 17:53:15.618579 systemd[1]: run-netns-cni\x2dbe9d1db8\x2d919a\x2d98e8\x2da145\x2d61455a9324c8.mount: Deactivated successfully. 
Mar 17 17:53:15.619902 containerd[1951]: time="2025-03-17T17:53:15.619262855Z" level=info msg="StopPodSandbox for \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\" returns successfully" Mar 17 17:53:15.620999 containerd[1951]: time="2025-03-17T17:53:15.620172995Z" level=info msg="StopPodSandbox for \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\"" Mar 17 17:53:15.620999 containerd[1951]: time="2025-03-17T17:53:15.620453687Z" level=info msg="TearDown network for sandbox \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\" successfully" Mar 17 17:53:15.620999 containerd[1951]: time="2025-03-17T17:53:15.620479247Z" level=info msg="StopPodSandbox for \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\" returns successfully" Mar 17 17:53:15.624795 containerd[1951]: time="2025-03-17T17:53:15.624244427Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s8xfx,Uid:8a712d71-7427-4200-94a3-b1d8ea1fe150,Namespace:calico-system,Attempt:5,}" Mar 17 17:53:15.804715 containerd[1951]: time="2025-03-17T17:53:15.804219635Z" level=error msg="Failed to destroy network for sandbox \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:15.804923 containerd[1951]: time="2025-03-17T17:53:15.804866699Z" level=error msg="encountered an error cleaning up failed sandbox \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:15.805021 containerd[1951]: time="2025-03-17T17:53:15.804971279Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-8qxmr,Uid:853570a2-8489-4178-8082-afac70eb226b,Namespace:default,Attempt:5,} failed, error" error="failed to setup network for sandbox \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:15.805389 kubelet[2450]: E0317 17:53:15.805319 2450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:15.805496 kubelet[2450]: E0317 17:53:15.805402 2450 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-8qxmr" Mar 17 17:53:15.805496 kubelet[2450]: E0317 17:53:15.805438 2450 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-8qxmr" Mar 17 17:53:15.805613 kubelet[2450]: E0317 17:53:15.805507 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-7fcdb87857-8qxmr_default(853570a2-8489-4178-8082-afac70eb226b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-7fcdb87857-8qxmr_default(853570a2-8489-4178-8082-afac70eb226b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-7fcdb87857-8qxmr" podUID="853570a2-8489-4178-8082-afac70eb226b" Mar 17 17:53:15.847818 containerd[1951]: time="2025-03-17T17:53:15.847352748Z" level=error msg="Failed to destroy network for sandbox \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:15.848291 containerd[1951]: time="2025-03-17T17:53:15.848241156Z" level=error msg="encountered an error cleaning up failed sandbox \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:15.848548 containerd[1951]: time="2025-03-17T17:53:15.848473956Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s8xfx,Uid:8a712d71-7427-4200-94a3-b1d8ea1fe150,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:15.848929 kubelet[2450]: E0317 17:53:15.848880 2450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:15.849354 kubelet[2450]: E0317 17:53:15.849137 2450 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-s8xfx" Mar 17 17:53:15.849354 kubelet[2450]: E0317 17:53:15.849205 2450 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-s8xfx" Mar 17 17:53:15.849672 kubelet[2450]: E0317 17:53:15.849302 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-s8xfx_calico-system(8a712d71-7427-4200-94a3-b1d8ea1fe150)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-s8xfx_calico-system(8a712d71-7427-4200-94a3-b1d8ea1fe150)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-s8xfx" podUID="8a712d71-7427-4200-94a3-b1d8ea1fe150" Mar 17 17:53:16.117771 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7-shm.mount: Deactivated successfully. Mar 17 17:53:16.274512 kubelet[2450]: E0317 17:53:16.274292 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:16.624122 kubelet[2450]: I0317 17:53:16.624056 2450 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7" Mar 17 17:53:16.626410 containerd[1951]: time="2025-03-17T17:53:16.625082640Z" level=info msg="StopPodSandbox for \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\"" Mar 17 17:53:16.626410 containerd[1951]: time="2025-03-17T17:53:16.625401780Z" level=info msg="Ensure that sandbox bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7 in task-service has been cleanup successfully" Mar 17 17:53:16.630460 systemd[1]: run-netns-cni\x2df6c2625a\x2da89e\x2d94ec\x2d7990\x2dd65c5d1c7114.mount: Deactivated successfully. 
Mar 17 17:53:16.632779 containerd[1951]: time="2025-03-17T17:53:16.631030248Z" level=info msg="TearDown network for sandbox \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\" successfully" Mar 17 17:53:16.632779 containerd[1951]: time="2025-03-17T17:53:16.631094088Z" level=info msg="StopPodSandbox for \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\" returns successfully" Mar 17 17:53:16.633203 containerd[1951]: time="2025-03-17T17:53:16.632899476Z" level=info msg="StopPodSandbox for \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\"" Mar 17 17:53:16.634931 containerd[1951]: time="2025-03-17T17:53:16.634318836Z" level=info msg="TearDown network for sandbox \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\" successfully" Mar 17 17:53:16.634931 containerd[1951]: time="2025-03-17T17:53:16.634427640Z" level=info msg="StopPodSandbox for \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\" returns successfully" Mar 17 17:53:16.636113 containerd[1951]: time="2025-03-17T17:53:16.636056964Z" level=info msg="StopPodSandbox for \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\"" Mar 17 17:53:16.636264 containerd[1951]: time="2025-03-17T17:53:16.636223140Z" level=info msg="TearDown network for sandbox \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\" successfully" Mar 17 17:53:16.636363 containerd[1951]: time="2025-03-17T17:53:16.636255852Z" level=info msg="StopPodSandbox for \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\" returns successfully" Mar 17 17:53:16.637592 containerd[1951]: time="2025-03-17T17:53:16.637532520Z" level=info msg="StopPodSandbox for \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\"" Mar 17 17:53:16.638846 containerd[1951]: time="2025-03-17T17:53:16.638791680Z" level=info msg="TearDown network for sandbox \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\" successfully" Mar 17 17:53:16.638846 containerd[1951]: time="2025-03-17T17:53:16.638839092Z" level=info msg="StopPodSandbox for \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\" returns successfully" Mar 17 17:53:16.639552 containerd[1951]: time="2025-03-17T17:53:16.639417984Z" level=info msg="StopPodSandbox for \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\"" Mar 17 17:53:16.639648 containerd[1951]: time="2025-03-17T17:53:16.639586140Z" level=info msg="TearDown network for sandbox \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\" successfully" Mar 17 17:53:16.639648 containerd[1951]: time="2025-03-17T17:53:16.639608988Z" level=info msg="StopPodSandbox for \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\" returns successfully" Mar 17 17:53:16.640422 containerd[1951]: time="2025-03-17T17:53:16.640295124Z" level=info msg="StopPodSandbox for \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\"" Mar 17 17:53:16.641286 containerd[1951]: time="2025-03-17T17:53:16.641096280Z" level=info msg="TearDown network for sandbox \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\" successfully" Mar 17 17:53:16.641286 containerd[1951]: time="2025-03-17T17:53:16.641173596Z" level=info msg="StopPodSandbox for \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\" returns successfully" Mar 17 17:53:16.641980 kubelet[2450]: I0317 17:53:16.641919 2450 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc" Mar 17 17:53:16.645282 containerd[1951]: time="2025-03-17T17:53:16.645157152Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-8qxmr,Uid:853570a2-8489-4178-8082-afac70eb226b,Namespace:default,Attempt:6,}" Mar 17 17:53:16.646488 containerd[1951]: time="2025-03-17T17:53:16.646005804Z" level=info msg="StopPodSandbox for \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\"" Mar 17 17:53:16.646488 containerd[1951]: time="2025-03-17T17:53:16.646285836Z" level=info msg="Ensure that sandbox 685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc in task-service has been cleanup successfully" Mar 17 17:53:16.649885 containerd[1951]: time="2025-03-17T17:53:16.649833396Z" level=info msg="TearDown network for sandbox \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\" successfully" Mar 17 17:53:16.650123 containerd[1951]: time="2025-03-17T17:53:16.650091204Z" level=info msg="StopPodSandbox for \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\" returns successfully" Mar 17 17:53:16.651485 systemd[1]: run-netns-cni\x2dd465f978\x2d7d3a\x2d8e91\x2da187\x2db85fbb65e018.mount: Deactivated successfully. Mar 17 17:53:16.653085 containerd[1951]: time="2025-03-17T17:53:16.651549636Z" level=info msg="StopPodSandbox for \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\"" Mar 17 17:53:16.653085 containerd[1951]: time="2025-03-17T17:53:16.652010376Z" level=info msg="TearDown network for sandbox \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\" successfully" Mar 17 17:53:16.653085 containerd[1951]: time="2025-03-17T17:53:16.652037556Z" level=info msg="StopPodSandbox for \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\" returns successfully" Mar 17 17:53:16.654643 containerd[1951]: time="2025-03-17T17:53:16.654273492Z" level=info msg="StopPodSandbox for \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\"" Mar 17 17:53:16.654643 containerd[1951]: time="2025-03-17T17:53:16.654442476Z" level=info msg="TearDown network for sandbox \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\" successfully" Mar 17 17:53:16.654643 containerd[1951]: time="2025-03-17T17:53:16.654464784Z" level=info msg="StopPodSandbox for \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\" returns successfully" Mar 17 17:53:16.657147 containerd[1951]: time="2025-03-17T17:53:16.657084600Z" level=info msg="StopPodSandbox for \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\"" Mar 17 17:53:16.657328 containerd[1951]: time="2025-03-17T17:53:16.657294060Z" level=info msg="TearDown network for sandbox \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\" successfully" Mar 17 17:53:16.657436 containerd[1951]: time="2025-03-17T17:53:16.657327324Z" level=info msg="StopPodSandbox for \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\" returns successfully" Mar 17 17:53:16.659137 containerd[1951]: time="2025-03-17T17:53:16.659081376Z" level=info msg="StopPodSandbox for \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\"" Mar 17 17:53:16.660282 containerd[1951]: time="2025-03-17T17:53:16.659486940Z" level=info msg="TearDown network for sandbox \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\" successfully" Mar 17 17:53:16.660282 containerd[1951]: time="2025-03-17T17:53:16.659537436Z" level=info 
msg="StopPodSandbox for \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\" returns successfully" Mar 17 17:53:16.660979 containerd[1951]: time="2025-03-17T17:53:16.660636672Z" level=info msg="StopPodSandbox for \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\"" Mar 17 17:53:16.660979 containerd[1951]: time="2025-03-17T17:53:16.660811920Z" level=info msg="TearDown network for sandbox \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\" successfully" Mar 17 17:53:16.660979 containerd[1951]: time="2025-03-17T17:53:16.660834732Z" level=info msg="StopPodSandbox for \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\" returns successfully" Mar 17 17:53:16.661907 containerd[1951]: time="2025-03-17T17:53:16.661770672Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s8xfx,Uid:8a712d71-7427-4200-94a3-b1d8ea1fe150,Namespace:calico-system,Attempt:6,}" Mar 17 17:53:16.862338 containerd[1951]: time="2025-03-17T17:53:16.862261717Z" level=error msg="Failed to destroy network for sandbox \"acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:16.863172 containerd[1951]: time="2025-03-17T17:53:16.863104813Z" level=error msg="encountered an error cleaning up failed sandbox \"acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:16.863294 containerd[1951]: time="2025-03-17T17:53:16.863212861Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-8qxmr,Uid:853570a2-8489-4178-8082-afac70eb226b,Namespace:default,Attempt:6,} failed, error" error="failed to setup network for sandbox \"acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:16.863822 kubelet[2450]: E0317 17:53:16.863583 2450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:16.863822 kubelet[2450]: E0317 17:53:16.863681 2450 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-8qxmr" Mar 17 17:53:16.863822 kubelet[2450]: E0317 17:53:16.863740 2450 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-8qxmr" Mar 17 17:53:16.864460 kubelet[2450]: E0317 17:53:16.863816 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-7fcdb87857-8qxmr_default(853570a2-8489-4178-8082-afac70eb226b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-7fcdb87857-8qxmr_default(853570a2-8489-4178-8082-afac70eb226b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-7fcdb87857-8qxmr" podUID="853570a2-8489-4178-8082-afac70eb226b" Mar 17 17:53:16.886594 containerd[1951]: time="2025-03-17T17:53:16.883928509Z" level=error msg="Failed to destroy network for sandbox \"52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:16.886729 containerd[1951]: time="2025-03-17T17:53:16.886615945Z" level=error msg="encountered an error cleaning up failed sandbox \"52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:16.886874 containerd[1951]: time="2025-03-17T17:53:16.886802893Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s8xfx,Uid:8a712d71-7427-4200-94a3-b1d8ea1fe150,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:16.887586 kubelet[2450]: E0317 17:53:16.887362 2450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:16.887586 kubelet[2450]: E0317 17:53:16.887445 2450 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-s8xfx" Mar 17 17:53:16.887586 kubelet[2450]: E0317 17:53:16.887477 2450 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-s8xfx" Mar 17 17:53:16.887845 kubelet[2450]: E0317 17:53:16.887604 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-s8xfx_calico-system(8a712d71-7427-4200-94a3-b1d8ea1fe150)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-s8xfx_calico-system(8a712d71-7427-4200-94a3-b1d8ea1fe150)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-s8xfx" podUID="8a712d71-7427-4200-94a3-b1d8ea1fe150" Mar 17 17:53:17.116760 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc-shm.mount: Deactivated successfully. Mar 17 17:53:17.117339 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a-shm.mount: Deactivated successfully. Mar 17 17:53:17.261748 kubelet[2450]: E0317 17:53:17.261606 2450 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:17.276842 kubelet[2450]: E0317 17:53:17.276789 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:17.658373 kubelet[2450]: I0317 17:53:17.658025 2450 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a" Mar 17 17:53:17.662716 containerd[1951]: time="2025-03-17T17:53:17.660180217Z" level=info msg="StopPodSandbox for \"acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a\"" Mar 17 17:53:17.662716 containerd[1951]: time="2025-03-17T17:53:17.660544141Z" level=info msg="Ensure that sandbox acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a in task-service has been cleanup successfully" Mar 17 17:53:17.664827 containerd[1951]: time="2025-03-17T17:53:17.663356077Z" level=info msg="TearDown network for sandbox \"acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a\" successfully" Mar 17 17:53:17.664827 containerd[1951]: time="2025-03-17T17:53:17.663448069Z" level=info msg="StopPodSandbox for \"acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a\" returns successfully" Mar 17 17:53:17.664209 systemd[1]: run-netns-cni\x2db649c9b0\x2dc1bb\x2d2409\x2da807\x2df1096d9058a2.mount: Deactivated successfully. 
Mar 17 17:53:17.667009 containerd[1951]: time="2025-03-17T17:53:17.666575161Z" level=info msg="StopPodSandbox for \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\"" Mar 17 17:53:17.667009 containerd[1951]: time="2025-03-17T17:53:17.666839641Z" level=info msg="TearDown network for sandbox \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\" successfully" Mar 17 17:53:17.667009 containerd[1951]: time="2025-03-17T17:53:17.666885613Z" level=info msg="StopPodSandbox for \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\" returns successfully" Mar 17 17:53:17.672415 containerd[1951]: time="2025-03-17T17:53:17.672277669Z" level=info msg="StopPodSandbox for \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\"" Mar 17 17:53:17.672709 containerd[1951]: time="2025-03-17T17:53:17.672535297Z" level=info msg="TearDown network for sandbox \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\" successfully" Mar 17 17:53:17.672709 containerd[1951]: time="2025-03-17T17:53:17.672589009Z" level=info msg="StopPodSandbox for \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\" returns successfully" Mar 17 17:53:17.674801 containerd[1951]: time="2025-03-17T17:53:17.674645701Z" level=info msg="StopPodSandbox for \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\"" Mar 17 17:53:17.675562 containerd[1951]: time="2025-03-17T17:53:17.675414025Z" level=info msg="TearDown network for sandbox \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\" successfully" Mar 17 17:53:17.675562 containerd[1951]: time="2025-03-17T17:53:17.675496573Z" level=info msg="StopPodSandbox for \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\" returns successfully" Mar 17 17:53:17.676837 containerd[1951]: time="2025-03-17T17:53:17.676789597Z" level=info msg="StopPodSandbox for \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\"" Mar 17 17:53:17.677014 containerd[1951]: time="2025-03-17T17:53:17.676977901Z" level=info msg="TearDown network for sandbox \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\" successfully" Mar 17 17:53:17.677096 containerd[1951]: time="2025-03-17T17:53:17.677013685Z" level=info msg="StopPodSandbox for \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\" returns successfully" Mar 17 17:53:17.679723 containerd[1951]: time="2025-03-17T17:53:17.678296545Z" level=info msg="StopPodSandbox for \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\"" Mar 17 17:53:17.679723 containerd[1951]: time="2025-03-17T17:53:17.678490657Z" level=info msg="TearDown network for sandbox \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\" successfully" Mar 17 17:53:17.679723 containerd[1951]: time="2025-03-17T17:53:17.678516973Z" level=info msg="StopPodSandbox for \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\" returns successfully" Mar 17 17:53:17.680891 containerd[1951]: time="2025-03-17T17:53:17.680822749Z" level=info msg="StopPodSandbox for \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\"" Mar 17 17:53:17.681064 containerd[1951]: time="2025-03-17T17:53:17.681023437Z" level=info msg="TearDown network for sandbox \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\" successfully" Mar 17 17:53:17.681140 containerd[1951]: time="2025-03-17T17:53:17.681061021Z" level=info msg="StopPodSandbox for \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\" 
returns successfully" Mar 17 17:53:17.684165 containerd[1951]: time="2025-03-17T17:53:17.684080041Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-8qxmr,Uid:853570a2-8489-4178-8082-afac70eb226b,Namespace:default,Attempt:7,}" Mar 17 17:53:17.690502 kubelet[2450]: I0317 17:53:17.690451 2450 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc" Mar 17 17:53:17.693593 containerd[1951]: time="2025-03-17T17:53:17.693540385Z" level=info msg="StopPodSandbox for \"52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc\"" Mar 17 17:53:17.694083 containerd[1951]: time="2025-03-17T17:53:17.694045909Z" level=info msg="Ensure that sandbox 52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc in task-service has been cleanup successfully" Mar 17 17:53:17.694770 containerd[1951]: time="2025-03-17T17:53:17.694734025Z" level=info msg="TearDown network for sandbox \"52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc\" successfully" Mar 17 17:53:17.696785 containerd[1951]: time="2025-03-17T17:53:17.696736021Z" level=info msg="StopPodSandbox for \"52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc\" returns successfully" Mar 17 17:53:17.698477 systemd[1]: run-netns-cni\x2df4b5d799\x2d640a\x2d5ef2\x2de57d\x2df9519a9a07a4.mount: Deactivated successfully. Mar 17 17:53:17.699656 containerd[1951]: time="2025-03-17T17:53:17.699588073Z" level=info msg="StopPodSandbox for \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\"" Mar 17 17:53:17.700214 containerd[1951]: time="2025-03-17T17:53:17.699942097Z" level=info msg="TearDown network for sandbox \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\" successfully" Mar 17 17:53:17.700214 containerd[1951]: time="2025-03-17T17:53:17.699979417Z" level=info msg="StopPodSandbox for \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\" returns successfully" Mar 17 17:53:17.701641 containerd[1951]: time="2025-03-17T17:53:17.701558461Z" level=info msg="StopPodSandbox for \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\"" Mar 17 17:53:17.702459 containerd[1951]: time="2025-03-17T17:53:17.702399445Z" level=info msg="TearDown network for sandbox \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\" successfully" Mar 17 17:53:17.702459 containerd[1951]: time="2025-03-17T17:53:17.702443113Z" level=info msg="StopPodSandbox for \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\" returns successfully" Mar 17 17:53:17.705016 containerd[1951]: time="2025-03-17T17:53:17.704908261Z" level=info msg="StopPodSandbox for \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\"" Mar 17 17:53:17.705918 containerd[1951]: time="2025-03-17T17:53:17.705834457Z" level=info msg="TearDown network for sandbox \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\" successfully" Mar 17 17:53:17.706176 containerd[1951]: time="2025-03-17T17:53:17.706091149Z" level=info msg="StopPodSandbox for \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\" returns successfully" Mar 17 17:53:17.707300 containerd[1951]: time="2025-03-17T17:53:17.707240305Z" level=info msg="StopPodSandbox for \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\"" Mar 17 17:53:17.707562 containerd[1951]: time="2025-03-17T17:53:17.707414833Z" level=info msg="TearDown network for sandbox 
\"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\" successfully" Mar 17 17:53:17.707562 containerd[1951]: time="2025-03-17T17:53:17.707446777Z" level=info msg="StopPodSandbox for \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\" returns successfully" Mar 17 17:53:17.708786 containerd[1951]: time="2025-03-17T17:53:17.708627601Z" level=info msg="StopPodSandbox for \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\"" Mar 17 17:53:17.709303 containerd[1951]: time="2025-03-17T17:53:17.709010401Z" level=info msg="TearDown network for sandbox \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\" successfully" Mar 17 17:53:17.709303 containerd[1951]: time="2025-03-17T17:53:17.709211293Z" level=info msg="StopPodSandbox for \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\" returns successfully" Mar 17 17:53:17.710878 containerd[1951]: time="2025-03-17T17:53:17.710581825Z" level=info msg="StopPodSandbox for \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\"" Mar 17 17:53:17.711574 containerd[1951]: time="2025-03-17T17:53:17.711516241Z" level=info msg="TearDown network for sandbox \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\" successfully" Mar 17 17:53:17.711656 containerd[1951]: time="2025-03-17T17:53:17.711598729Z" level=info msg="StopPodSandbox for \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\" returns successfully" Mar 17 17:53:17.712825 containerd[1951]: time="2025-03-17T17:53:17.712762813Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s8xfx,Uid:8a712d71-7427-4200-94a3-b1d8ea1fe150,Namespace:calico-system,Attempt:7,}" Mar 17 17:53:17.898754 containerd[1951]: time="2025-03-17T17:53:17.898526498Z" level=error msg="Failed to destroy network for sandbox \"623f0076b5e2122afe8d5c2cfaa6acbf5dc1f9cd35b82857be8bd1f89eeede76\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:17.901264 containerd[1951]: time="2025-03-17T17:53:17.901074254Z" level=error msg="encountered an error cleaning up failed sandbox \"623f0076b5e2122afe8d5c2cfaa6acbf5dc1f9cd35b82857be8bd1f89eeede76\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:17.901473 containerd[1951]: time="2025-03-17T17:53:17.901423202Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s8xfx,Uid:8a712d71-7427-4200-94a3-b1d8ea1fe150,Namespace:calico-system,Attempt:7,} failed, error" error="failed to setup network for sandbox \"623f0076b5e2122afe8d5c2cfaa6acbf5dc1f9cd35b82857be8bd1f89eeede76\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:17.902984 kubelet[2450]: E0317 17:53:17.902388 2450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"623f0076b5e2122afe8d5c2cfaa6acbf5dc1f9cd35b82857be8bd1f89eeede76\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 
17:53:17.902984 kubelet[2450]: E0317 17:53:17.902462 2450 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"623f0076b5e2122afe8d5c2cfaa6acbf5dc1f9cd35b82857be8bd1f89eeede76\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-s8xfx" Mar 17 17:53:17.902984 kubelet[2450]: E0317 17:53:17.902507 2450 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"623f0076b5e2122afe8d5c2cfaa6acbf5dc1f9cd35b82857be8bd1f89eeede76\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-s8xfx" Mar 17 17:53:17.903244 kubelet[2450]: E0317 17:53:17.902566 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-s8xfx_calico-system(8a712d71-7427-4200-94a3-b1d8ea1fe150)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-s8xfx_calico-system(8a712d71-7427-4200-94a3-b1d8ea1fe150)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"623f0076b5e2122afe8d5c2cfaa6acbf5dc1f9cd35b82857be8bd1f89eeede76\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-s8xfx" podUID="8a712d71-7427-4200-94a3-b1d8ea1fe150" Mar 17 17:53:17.907274 containerd[1951]: time="2025-03-17T17:53:17.907203722Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:17.909022 containerd[1951]: time="2025-03-17T17:53:17.908819270Z" level=error msg="Failed to destroy network for sandbox \"01c233f36e5ff06d3816cd863149db2f8194478a7ff9351d58491297df0fc1ee\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:17.910808 containerd[1951]: time="2025-03-17T17:53:17.910159418Z" level=error msg="encountered an error cleaning up failed sandbox \"01c233f36e5ff06d3816cd863149db2f8194478a7ff9351d58491297df0fc1ee\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:17.910808 containerd[1951]: time="2025-03-17T17:53:17.910258406Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-8qxmr,Uid:853570a2-8489-4178-8082-afac70eb226b,Namespace:default,Attempt:7,} failed, error" error="failed to setup network for sandbox \"01c233f36e5ff06d3816cd863149db2f8194478a7ff9351d58491297df0fc1ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:17.910808 containerd[1951]: time="2025-03-17T17:53:17.910461266Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes 
read=137086024" Mar 17 17:53:17.911742 kubelet[2450]: E0317 17:53:17.911386 2450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01c233f36e5ff06d3816cd863149db2f8194478a7ff9351d58491297df0fc1ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:53:17.911742 kubelet[2450]: E0317 17:53:17.911457 2450 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01c233f36e5ff06d3816cd863149db2f8194478a7ff9351d58491297df0fc1ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-8qxmr" Mar 17 17:53:17.911742 kubelet[2450]: E0317 17:53:17.911503 2450 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01c233f36e5ff06d3816cd863149db2f8194478a7ff9351d58491297df0fc1ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-8qxmr" Mar 17 17:53:17.911974 kubelet[2450]: E0317 17:53:17.911575 2450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-7fcdb87857-8qxmr_default(853570a2-8489-4178-8082-afac70eb226b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-7fcdb87857-8qxmr_default(853570a2-8489-4178-8082-afac70eb226b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"01c233f36e5ff06d3816cd863149db2f8194478a7ff9351d58491297df0fc1ee\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-7fcdb87857-8qxmr" podUID="853570a2-8489-4178-8082-afac70eb226b" Mar 17 17:53:17.912542 containerd[1951]: time="2025-03-17T17:53:17.912297638Z" level=info msg="ImageCreate event name:\"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:17.917737 containerd[1951]: time="2025-03-17T17:53:17.917179262Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:17.918853 containerd[1951]: time="2025-03-17T17:53:17.918782246Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"137085886\" in 6.399369344s" Mar 17 17:53:17.919078 containerd[1951]: time="2025-03-17T17:53:17.918840134Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\"" Mar 17 17:53:17.931939 containerd[1951]: 
time="2025-03-17T17:53:17.931640390Z" level=info msg="CreateContainer within sandbox \"8c28c10517df7e6ad9c36d4f7f578f8520e98314a9083d71fbbe5cf932395a6f\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 17 17:53:17.951426 containerd[1951]: time="2025-03-17T17:53:17.951034526Z" level=info msg="CreateContainer within sandbox \"8c28c10517df7e6ad9c36d4f7f578f8520e98314a9083d71fbbe5cf932395a6f\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"127b55df3050e78d16488e70326162c0ae9343391ef2184d737c57b9f0b41fa6\"" Mar 17 17:53:17.952408 containerd[1951]: time="2025-03-17T17:53:17.952354610Z" level=info msg="StartContainer for \"127b55df3050e78d16488e70326162c0ae9343391ef2184d737c57b9f0b41fa6\"" Mar 17 17:53:17.996987 systemd[1]: Started cri-containerd-127b55df3050e78d16488e70326162c0ae9343391ef2184d737c57b9f0b41fa6.scope - libcontainer container 127b55df3050e78d16488e70326162c0ae9343391ef2184d737c57b9f0b41fa6. Mar 17 17:53:18.053432 containerd[1951]: time="2025-03-17T17:53:18.052505027Z" level=info msg="StartContainer for \"127b55df3050e78d16488e70326162c0ae9343391ef2184d737c57b9f0b41fa6\" returns successfully" Mar 17 17:53:18.123346 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-01c233f36e5ff06d3816cd863149db2f8194478a7ff9351d58491297df0fc1ee-shm.mount: Deactivated successfully. Mar 17 17:53:18.123539 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3020712912.mount: Deactivated successfully. Mar 17 17:53:18.181911 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 17 17:53:18.182210 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Mar 17 17:53:18.277561 kubelet[2450]: E0317 17:53:18.277494 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:18.706207 kubelet[2450]: I0317 17:53:18.705516 2450 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01c233f36e5ff06d3816cd863149db2f8194478a7ff9351d58491297df0fc1ee" Mar 17 17:53:18.708170 containerd[1951]: time="2025-03-17T17:53:18.707049494Z" level=info msg="StopPodSandbox for \"01c233f36e5ff06d3816cd863149db2f8194478a7ff9351d58491297df0fc1ee\"" Mar 17 17:53:18.708170 containerd[1951]: time="2025-03-17T17:53:18.707335298Z" level=info msg="Ensure that sandbox 01c233f36e5ff06d3816cd863149db2f8194478a7ff9351d58491297df0fc1ee in task-service has been cleanup successfully" Mar 17 17:53:18.709188 containerd[1951]: time="2025-03-17T17:53:18.708997034Z" level=info msg="TearDown network for sandbox \"01c233f36e5ff06d3816cd863149db2f8194478a7ff9351d58491297df0fc1ee\" successfully" Mar 17 17:53:18.709188 containerd[1951]: time="2025-03-17T17:53:18.709040366Z" level=info msg="StopPodSandbox for \"01c233f36e5ff06d3816cd863149db2f8194478a7ff9351d58491297df0fc1ee\" returns successfully" Mar 17 17:53:18.711529 systemd[1]: run-netns-cni\x2dbe32a65d\x2d2697\x2d3a94\x2d0ae9\x2d5d37338090a0.mount: Deactivated successfully. 
Mar 17 17:53:18.712851 containerd[1951]: time="2025-03-17T17:53:18.711845090Z" level=info msg="StopPodSandbox for \"acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a\"" Mar 17 17:53:18.712851 containerd[1951]: time="2025-03-17T17:53:18.712003022Z" level=info msg="TearDown network for sandbox \"acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a\" successfully" Mar 17 17:53:18.712851 containerd[1951]: time="2025-03-17T17:53:18.712024994Z" level=info msg="StopPodSandbox for \"acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a\" returns successfully" Mar 17 17:53:18.715772 containerd[1951]: time="2025-03-17T17:53:18.715363106Z" level=info msg="StopPodSandbox for \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\"" Mar 17 17:53:18.715991 containerd[1951]: time="2025-03-17T17:53:18.715838858Z" level=info msg="TearDown network for sandbox \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\" successfully" Mar 17 17:53:18.716070 containerd[1951]: time="2025-03-17T17:53:18.715988222Z" level=info msg="StopPodSandbox for \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\" returns successfully" Mar 17 17:53:18.717159 containerd[1951]: time="2025-03-17T17:53:18.717109682Z" level=info msg="StopPodSandbox for \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\"" Mar 17 17:53:18.717312 containerd[1951]: time="2025-03-17T17:53:18.717278510Z" level=info msg="TearDown network for sandbox \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\" successfully" Mar 17 17:53:18.717372 containerd[1951]: time="2025-03-17T17:53:18.717310142Z" level=info msg="StopPodSandbox for \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\" returns successfully" Mar 17 17:53:18.718321 containerd[1951]: time="2025-03-17T17:53:18.718264094Z" level=info msg="StopPodSandbox for \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\"" Mar 17 17:53:18.718455 containerd[1951]: time="2025-03-17T17:53:18.718420934Z" level=info msg="TearDown network for sandbox \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\" successfully" Mar 17 17:53:18.718530 containerd[1951]: time="2025-03-17T17:53:18.718451906Z" level=info msg="StopPodSandbox for \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\" returns successfully" Mar 17 17:53:18.719777 containerd[1951]: time="2025-03-17T17:53:18.719735366Z" level=info msg="StopPodSandbox for \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\"" Mar 17 17:53:18.719918 containerd[1951]: time="2025-03-17T17:53:18.719883830Z" level=info msg="TearDown network for sandbox \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\" successfully" Mar 17 17:53:18.719992 containerd[1951]: time="2025-03-17T17:53:18.719915786Z" level=info msg="StopPodSandbox for \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\" returns successfully" Mar 17 17:53:18.722021 containerd[1951]: time="2025-03-17T17:53:18.721845410Z" level=info msg="StopPodSandbox for \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\"" Mar 17 17:53:18.723830 containerd[1951]: time="2025-03-17T17:53:18.722219714Z" level=info msg="TearDown network for sandbox \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\" successfully" Mar 17 17:53:18.723830 containerd[1951]: time="2025-03-17T17:53:18.722273582Z" level=info msg="StopPodSandbox for \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\" 
returns successfully" Mar 17 17:53:18.727012 containerd[1951]: time="2025-03-17T17:53:18.726956606Z" level=info msg="StopPodSandbox for \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\"" Mar 17 17:53:18.727336 containerd[1951]: time="2025-03-17T17:53:18.727304006Z" level=info msg="TearDown network for sandbox \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\" successfully" Mar 17 17:53:18.727490 containerd[1951]: time="2025-03-17T17:53:18.727448570Z" level=info msg="StopPodSandbox for \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\" returns successfully" Mar 17 17:53:18.728891 kubelet[2450]: I0317 17:53:18.728847 2450 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="623f0076b5e2122afe8d5c2cfaa6acbf5dc1f9cd35b82857be8bd1f89eeede76" Mar 17 17:53:18.734147 containerd[1951]: time="2025-03-17T17:53:18.734067614Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-8qxmr,Uid:853570a2-8489-4178-8082-afac70eb226b,Namespace:default,Attempt:8,}" Mar 17 17:53:18.734744 containerd[1951]: time="2025-03-17T17:53:18.734665058Z" level=info msg="StopPodSandbox for \"623f0076b5e2122afe8d5c2cfaa6acbf5dc1f9cd35b82857be8bd1f89eeede76\"" Mar 17 17:53:18.735083 containerd[1951]: time="2025-03-17T17:53:18.734982158Z" level=info msg="Ensure that sandbox 623f0076b5e2122afe8d5c2cfaa6acbf5dc1f9cd35b82857be8bd1f89eeede76 in task-service has been cleanup successfully" Mar 17 17:53:18.735754 containerd[1951]: time="2025-03-17T17:53:18.735264878Z" level=info msg="TearDown network for sandbox \"623f0076b5e2122afe8d5c2cfaa6acbf5dc1f9cd35b82857be8bd1f89eeede76\" successfully" Mar 17 17:53:18.735754 containerd[1951]: time="2025-03-17T17:53:18.735301622Z" level=info msg="StopPodSandbox for \"623f0076b5e2122afe8d5c2cfaa6acbf5dc1f9cd35b82857be8bd1f89eeede76\" returns successfully" Mar 17 17:53:18.739012 systemd[1]: run-netns-cni\x2d0fb9d46c\x2de4d0\x2d1b67\x2d6a66\x2dbfc5cffa51f2.mount: Deactivated successfully. 
Mar 17 17:53:18.740340 containerd[1951]: time="2025-03-17T17:53:18.740276750Z" level=info msg="StopPodSandbox for \"52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc\"" Mar 17 17:53:18.741988 containerd[1951]: time="2025-03-17T17:53:18.740445902Z" level=info msg="TearDown network for sandbox \"52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc\" successfully" Mar 17 17:53:18.741988 containerd[1951]: time="2025-03-17T17:53:18.740884334Z" level=info msg="StopPodSandbox for \"52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc\" returns successfully" Mar 17 17:53:18.741988 containerd[1951]: time="2025-03-17T17:53:18.741633194Z" level=info msg="StopPodSandbox for \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\"" Mar 17 17:53:18.741988 containerd[1951]: time="2025-03-17T17:53:18.741813842Z" level=info msg="TearDown network for sandbox \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\" successfully" Mar 17 17:53:18.741988 containerd[1951]: time="2025-03-17T17:53:18.741838010Z" level=info msg="StopPodSandbox for \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\" returns successfully" Mar 17 17:53:18.742804 containerd[1951]: time="2025-03-17T17:53:18.742311998Z" level=info msg="StopPodSandbox for \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\"" Mar 17 17:53:18.742804 containerd[1951]: time="2025-03-17T17:53:18.742476554Z" level=info msg="TearDown network for sandbox \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\" successfully" Mar 17 17:53:18.742804 containerd[1951]: time="2025-03-17T17:53:18.742499666Z" level=info msg="StopPodSandbox for \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\" returns successfully" Mar 17 17:53:18.743650 containerd[1951]: time="2025-03-17T17:53:18.743607350Z" level=info msg="StopPodSandbox for \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\"" Mar 17 17:53:18.743827 containerd[1951]: time="2025-03-17T17:53:18.743790014Z" level=info msg="TearDown network for sandbox \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\" successfully" Mar 17 17:53:18.743888 containerd[1951]: time="2025-03-17T17:53:18.743823866Z" level=info msg="StopPodSandbox for \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\" returns successfully" Mar 17 17:53:18.744481 containerd[1951]: time="2025-03-17T17:53:18.744423662Z" level=info msg="StopPodSandbox for \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\"" Mar 17 17:53:18.744610 containerd[1951]: time="2025-03-17T17:53:18.744578510Z" level=info msg="TearDown network for sandbox \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\" successfully" Mar 17 17:53:18.744727 containerd[1951]: time="2025-03-17T17:53:18.744609374Z" level=info msg="StopPodSandbox for \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\" returns successfully" Mar 17 17:53:18.745282 containerd[1951]: time="2025-03-17T17:53:18.745206674Z" level=info msg="StopPodSandbox for \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\"" Mar 17 17:53:18.745434 containerd[1951]: time="2025-03-17T17:53:18.745401986Z" level=info msg="TearDown network for sandbox \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\" successfully" Mar 17 17:53:18.745434 containerd[1951]: time="2025-03-17T17:53:18.745442222Z" level=info msg="StopPodSandbox for \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\" 
returns successfully" Mar 17 17:53:18.746254 containerd[1951]: time="2025-03-17T17:53:18.746196602Z" level=info msg="StopPodSandbox for \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\"" Mar 17 17:53:18.746394 containerd[1951]: time="2025-03-17T17:53:18.746361134Z" level=info msg="TearDown network for sandbox \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\" successfully" Mar 17 17:53:18.747367 containerd[1951]: time="2025-03-17T17:53:18.746393942Z" level=info msg="StopPodSandbox for \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\" returns successfully" Mar 17 17:53:18.747367 containerd[1951]: time="2025-03-17T17:53:18.747150914Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s8xfx,Uid:8a712d71-7427-4200-94a3-b1d8ea1fe150,Namespace:calico-system,Attempt:8,}" Mar 17 17:53:18.978048 (udev-worker)[3515]: Network interface NamePolicy= disabled on kernel command line. Mar 17 17:53:18.979678 systemd-networkd[1855]: cali74115b111a5: Link UP Mar 17 17:53:18.980566 systemd-networkd[1855]: cali74115b111a5: Gained carrier Mar 17 17:53:18.995273 kubelet[2450]: I0317 17:53:18.995184 2450 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-5sxs8" podStartSLOduration=4.98417941 podStartE2EDuration="21.995162151s" podCreationTimestamp="2025-03-17 17:52:57 +0000 UTC" firstStartedPulling="2025-03-17 17:53:00.909626541 +0000 UTC m=+5.470252660" lastFinishedPulling="2025-03-17 17:53:17.920609294 +0000 UTC m=+22.481235401" observedRunningTime="2025-03-17 17:53:18.725893322 +0000 UTC m=+23.286519465" watchObservedRunningTime="2025-03-17 17:53:18.995162151 +0000 UTC m=+23.555788270" Mar 17 17:53:18.999567 containerd[1951]: 2025-03-17 17:53:18.817 [INFO][3540] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:53:18.999567 containerd[1951]: 2025-03-17 17:53:18.845 [INFO][3540] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {172.31.27.21-k8s-nginx--deployment--7fcdb87857--8qxmr-eth0 nginx-deployment-7fcdb87857- default 853570a2-8489-4178-8082-afac70eb226b 1140 0 2025-03-17 17:53:10 +0000 UTC map[app:nginx pod-template-hash:7fcdb87857 projectcalico.org/namespace:default projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:default] map[] [] [] []} {k8s 172.31.27.21 nginx-deployment-7fcdb87857-8qxmr eth0 default [] [] [kns.default ksa.default.default] cali74115b111a5 [] []}} ContainerID="b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93" Namespace="default" Pod="nginx-deployment-7fcdb87857-8qxmr" WorkloadEndpoint="172.31.27.21-k8s-nginx--deployment--7fcdb87857--8qxmr-" Mar 17 17:53:18.999567 containerd[1951]: 2025-03-17 17:53:18.845 [INFO][3540] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93" Namespace="default" Pod="nginx-deployment-7fcdb87857-8qxmr" WorkloadEndpoint="172.31.27.21-k8s-nginx--deployment--7fcdb87857--8qxmr-eth0" Mar 17 17:53:18.999567 containerd[1951]: 2025-03-17 17:53:18.902 [INFO][3569] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93" HandleID="k8s-pod-network.b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93" Workload="172.31.27.21-k8s-nginx--deployment--7fcdb87857--8qxmr-eth0" Mar 17 17:53:18.999567 containerd[1951]: 2025-03-17 17:53:18.919 [INFO][3569] 
ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93" HandleID="k8s-pod-network.b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93" Workload="172.31.27.21-k8s-nginx--deployment--7fcdb87857--8qxmr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c100), Attrs:map[string]string{"namespace":"default", "node":"172.31.27.21", "pod":"nginx-deployment-7fcdb87857-8qxmr", "timestamp":"2025-03-17 17:53:18.902165079 +0000 UTC"}, Hostname:"172.31.27.21", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:53:18.999567 containerd[1951]: 2025-03-17 17:53:18.919 [INFO][3569] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:53:18.999567 containerd[1951]: 2025-03-17 17:53:18.919 [INFO][3569] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 17:53:18.999567 containerd[1951]: 2025-03-17 17:53:18.919 [INFO][3569] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '172.31.27.21' Mar 17 17:53:18.999567 containerd[1951]: 2025-03-17 17:53:18.923 [INFO][3569] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93" host="172.31.27.21" Mar 17 17:53:18.999567 containerd[1951]: 2025-03-17 17:53:18.932 [INFO][3569] ipam/ipam.go 372: Looking up existing affinities for host host="172.31.27.21" Mar 17 17:53:18.999567 containerd[1951]: 2025-03-17 17:53:18.938 [INFO][3569] ipam/ipam.go 489: Trying affinity for 192.168.110.192/26 host="172.31.27.21" Mar 17 17:53:18.999567 containerd[1951]: 2025-03-17 17:53:18.941 [INFO][3569] ipam/ipam.go 155: Attempting to load block cidr=192.168.110.192/26 host="172.31.27.21" Mar 17 17:53:18.999567 containerd[1951]: 2025-03-17 17:53:18.944 [INFO][3569] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.110.192/26 host="172.31.27.21" Mar 17 17:53:18.999567 containerd[1951]: 2025-03-17 17:53:18.944 [INFO][3569] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.110.192/26 handle="k8s-pod-network.b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93" host="172.31.27.21" Mar 17 17:53:18.999567 containerd[1951]: 2025-03-17 17:53:18.947 [INFO][3569] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93 Mar 17 17:53:18.999567 containerd[1951]: 2025-03-17 17:53:18.953 [INFO][3569] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.110.192/26 handle="k8s-pod-network.b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93" host="172.31.27.21" Mar 17 17:53:18.999567 containerd[1951]: 2025-03-17 17:53:18.964 [INFO][3569] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.110.193/26] block=192.168.110.192/26 handle="k8s-pod-network.b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93" host="172.31.27.21" Mar 17 17:53:18.999567 containerd[1951]: 2025-03-17 17:53:18.964 [INFO][3569] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.110.193/26] handle="k8s-pod-network.b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93" host="172.31.27.21" Mar 17 17:53:18.999567 containerd[1951]: 2025-03-17 17:53:18.964 [INFO][3569] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 17:53:18.999567 containerd[1951]: 2025-03-17 17:53:18.964 [INFO][3569] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.110.193/26] IPv6=[] ContainerID="b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93" HandleID="k8s-pod-network.b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93" Workload="172.31.27.21-k8s-nginx--deployment--7fcdb87857--8qxmr-eth0" Mar 17 17:53:19.000855 containerd[1951]: 2025-03-17 17:53:18.969 [INFO][3540] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93" Namespace="default" Pod="nginx-deployment-7fcdb87857-8qxmr" WorkloadEndpoint="172.31.27.21-k8s-nginx--deployment--7fcdb87857--8qxmr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"172.31.27.21-k8s-nginx--deployment--7fcdb87857--8qxmr-eth0", GenerateName:"nginx-deployment-7fcdb87857-", Namespace:"default", SelfLink:"", UID:"853570a2-8489-4178-8082-afac70eb226b", ResourceVersion:"1140", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 53, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nginx", "pod-template-hash":"7fcdb87857", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"172.31.27.21", ContainerID:"", Pod:"nginx-deployment-7fcdb87857-8qxmr", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.110.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali74115b111a5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:53:19.000855 containerd[1951]: 2025-03-17 17:53:18.969 [INFO][3540] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.110.193/32] ContainerID="b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93" Namespace="default" Pod="nginx-deployment-7fcdb87857-8qxmr" WorkloadEndpoint="172.31.27.21-k8s-nginx--deployment--7fcdb87857--8qxmr-eth0" Mar 17 17:53:19.000855 containerd[1951]: 2025-03-17 17:53:18.970 [INFO][3540] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali74115b111a5 ContainerID="b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93" Namespace="default" Pod="nginx-deployment-7fcdb87857-8qxmr" WorkloadEndpoint="172.31.27.21-k8s-nginx--deployment--7fcdb87857--8qxmr-eth0" Mar 17 17:53:19.000855 containerd[1951]: 2025-03-17 17:53:18.980 [INFO][3540] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93" Namespace="default" Pod="nginx-deployment-7fcdb87857-8qxmr" WorkloadEndpoint="172.31.27.21-k8s-nginx--deployment--7fcdb87857--8qxmr-eth0" Mar 17 17:53:19.000855 containerd[1951]: 2025-03-17 17:53:18.981 [INFO][3540] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93" Namespace="default" Pod="nginx-deployment-7fcdb87857-8qxmr" WorkloadEndpoint="172.31.27.21-k8s-nginx--deployment--7fcdb87857--8qxmr-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"172.31.27.21-k8s-nginx--deployment--7fcdb87857--8qxmr-eth0", GenerateName:"nginx-deployment-7fcdb87857-", Namespace:"default", SelfLink:"", UID:"853570a2-8489-4178-8082-afac70eb226b", ResourceVersion:"1140", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 53, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nginx", "pod-template-hash":"7fcdb87857", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"172.31.27.21", ContainerID:"b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93", Pod:"nginx-deployment-7fcdb87857-8qxmr", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.110.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali74115b111a5", MAC:"7a:95:a5:4e:01:32", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:53:19.000855 containerd[1951]: 2025-03-17 17:53:18.995 [INFO][3540] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93" Namespace="default" Pod="nginx-deployment-7fcdb87857-8qxmr" WorkloadEndpoint="172.31.27.21-k8s-nginx--deployment--7fcdb87857--8qxmr-eth0" Mar 17 17:53:19.035262 containerd[1951]: time="2025-03-17T17:53:19.035035200Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:53:19.035262 containerd[1951]: time="2025-03-17T17:53:19.035127576Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:53:19.035262 containerd[1951]: time="2025-03-17T17:53:19.035153808Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:53:19.035729 containerd[1951]: time="2025-03-17T17:53:19.035283288Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:53:19.066395 systemd[1]: Started cri-containerd-b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93.scope - libcontainer container b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93. 
Mar 17 17:53:19.086019 systemd-networkd[1855]: cali4d1338a94f3: Link UP Mar 17 17:53:19.087799 systemd-networkd[1855]: cali4d1338a94f3: Gained carrier Mar 17 17:53:19.110012 containerd[1951]: 2025-03-17 17:53:18.815 [INFO][3549] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:53:19.110012 containerd[1951]: 2025-03-17 17:53:18.839 [INFO][3549] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {172.31.27.21-k8s-csi--node--driver--s8xfx-eth0 csi-node-driver- calico-system 8a712d71-7427-4200-94a3-b1d8ea1fe150 1036 0 2025-03-17 17:52:57 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:54877d75d5 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s 172.31.27.21 csi-node-driver-s8xfx eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali4d1338a94f3 [] []}} ContainerID="63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e" Namespace="calico-system" Pod="csi-node-driver-s8xfx" WorkloadEndpoint="172.31.27.21-k8s-csi--node--driver--s8xfx-" Mar 17 17:53:19.110012 containerd[1951]: 2025-03-17 17:53:18.839 [INFO][3549] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e" Namespace="calico-system" Pod="csi-node-driver-s8xfx" WorkloadEndpoint="172.31.27.21-k8s-csi--node--driver--s8xfx-eth0" Mar 17 17:53:19.110012 containerd[1951]: 2025-03-17 17:53:18.903 [INFO][3564] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e" HandleID="k8s-pod-network.63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e" Workload="172.31.27.21-k8s-csi--node--driver--s8xfx-eth0" Mar 17 17:53:19.110012 containerd[1951]: 2025-03-17 17:53:18.926 [INFO][3564] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e" HandleID="k8s-pod-network.63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e" Workload="172.31.27.21-k8s-csi--node--driver--s8xfx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003335c0), Attrs:map[string]string{"namespace":"calico-system", "node":"172.31.27.21", "pod":"csi-node-driver-s8xfx", "timestamp":"2025-03-17 17:53:18.903749907 +0000 UTC"}, Hostname:"172.31.27.21", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:53:19.110012 containerd[1951]: 2025-03-17 17:53:18.926 [INFO][3564] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:53:19.110012 containerd[1951]: 2025-03-17 17:53:18.964 [INFO][3564] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:53:19.110012 containerd[1951]: 2025-03-17 17:53:18.964 [INFO][3564] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '172.31.27.21' Mar 17 17:53:19.110012 containerd[1951]: 2025-03-17 17:53:19.024 [INFO][3564] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e" host="172.31.27.21" Mar 17 17:53:19.110012 containerd[1951]: 2025-03-17 17:53:19.034 [INFO][3564] ipam/ipam.go 372: Looking up existing affinities for host host="172.31.27.21" Mar 17 17:53:19.110012 containerd[1951]: 2025-03-17 17:53:19.043 [INFO][3564] ipam/ipam.go 489: Trying affinity for 192.168.110.192/26 host="172.31.27.21" Mar 17 17:53:19.110012 containerd[1951]: 2025-03-17 17:53:19.046 [INFO][3564] ipam/ipam.go 155: Attempting to load block cidr=192.168.110.192/26 host="172.31.27.21" Mar 17 17:53:19.110012 containerd[1951]: 2025-03-17 17:53:19.049 [INFO][3564] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.110.192/26 host="172.31.27.21" Mar 17 17:53:19.110012 containerd[1951]: 2025-03-17 17:53:19.050 [INFO][3564] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.110.192/26 handle="k8s-pod-network.63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e" host="172.31.27.21" Mar 17 17:53:19.110012 containerd[1951]: 2025-03-17 17:53:19.053 [INFO][3564] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e Mar 17 17:53:19.110012 containerd[1951]: 2025-03-17 17:53:19.062 [INFO][3564] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.110.192/26 handle="k8s-pod-network.63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e" host="172.31.27.21" Mar 17 17:53:19.110012 containerd[1951]: 2025-03-17 17:53:19.076 [INFO][3564] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.110.194/26] block=192.168.110.192/26 handle="k8s-pod-network.63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e" host="172.31.27.21" Mar 17 17:53:19.110012 containerd[1951]: 2025-03-17 17:53:19.076 [INFO][3564] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.110.194/26] handle="k8s-pod-network.63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e" host="172.31.27.21" Mar 17 17:53:19.110012 containerd[1951]: 2025-03-17 17:53:19.076 [INFO][3564] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 17:53:19.110012 containerd[1951]: 2025-03-17 17:53:19.076 [INFO][3564] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.110.194/26] IPv6=[] ContainerID="63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e" HandleID="k8s-pod-network.63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e" Workload="172.31.27.21-k8s-csi--node--driver--s8xfx-eth0" Mar 17 17:53:19.111180 containerd[1951]: 2025-03-17 17:53:19.080 [INFO][3549] cni-plugin/k8s.go 386: Populated endpoint ContainerID="63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e" Namespace="calico-system" Pod="csi-node-driver-s8xfx" WorkloadEndpoint="172.31.27.21-k8s-csi--node--driver--s8xfx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"172.31.27.21-k8s-csi--node--driver--s8xfx-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8a712d71-7427-4200-94a3-b1d8ea1fe150", ResourceVersion:"1036", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 52, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"54877d75d5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"172.31.27.21", ContainerID:"", Pod:"csi-node-driver-s8xfx", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.110.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali4d1338a94f3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:53:19.111180 containerd[1951]: 2025-03-17 17:53:19.080 [INFO][3549] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.110.194/32] ContainerID="63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e" Namespace="calico-system" Pod="csi-node-driver-s8xfx" WorkloadEndpoint="172.31.27.21-k8s-csi--node--driver--s8xfx-eth0" Mar 17 17:53:19.111180 containerd[1951]: 2025-03-17 17:53:19.080 [INFO][3549] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4d1338a94f3 ContainerID="63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e" Namespace="calico-system" Pod="csi-node-driver-s8xfx" WorkloadEndpoint="172.31.27.21-k8s-csi--node--driver--s8xfx-eth0" Mar 17 17:53:19.111180 containerd[1951]: 2025-03-17 17:53:19.089 [INFO][3549] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e" Namespace="calico-system" Pod="csi-node-driver-s8xfx" WorkloadEndpoint="172.31.27.21-k8s-csi--node--driver--s8xfx-eth0" Mar 17 17:53:19.111180 containerd[1951]: 2025-03-17 17:53:19.089 [INFO][3549] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e" Namespace="calico-system" Pod="csi-node-driver-s8xfx" 
WorkloadEndpoint="172.31.27.21-k8s-csi--node--driver--s8xfx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"172.31.27.21-k8s-csi--node--driver--s8xfx-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8a712d71-7427-4200-94a3-b1d8ea1fe150", ResourceVersion:"1036", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 52, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"54877d75d5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"172.31.27.21", ContainerID:"63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e", Pod:"csi-node-driver-s8xfx", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.110.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali4d1338a94f3", MAC:"4e:3a:94:92:4b:e2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:53:19.111180 containerd[1951]: 2025-03-17 17:53:19.106 [INFO][3549] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e" Namespace="calico-system" Pod="csi-node-driver-s8xfx" WorkloadEndpoint="172.31.27.21-k8s-csi--node--driver--s8xfx-eth0" Mar 17 17:53:19.158428 containerd[1951]: time="2025-03-17T17:53:19.158133816Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-8qxmr,Uid:853570a2-8489-4178-8082-afac70eb226b,Namespace:default,Attempt:8,} returns sandbox id \"b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93\"" Mar 17 17:53:19.160661 containerd[1951]: time="2025-03-17T17:53:19.160339296Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\"" Mar 17 17:53:19.170226 containerd[1951]: time="2025-03-17T17:53:19.169665480Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:53:19.170226 containerd[1951]: time="2025-03-17T17:53:19.169810500Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:53:19.170226 containerd[1951]: time="2025-03-17T17:53:19.169847496Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:53:19.170226 containerd[1951]: time="2025-03-17T17:53:19.170008704Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:53:19.213994 systemd[1]: Started cri-containerd-63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e.scope - libcontainer container 63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e. 
Mar 17 17:53:19.253440 containerd[1951]: time="2025-03-17T17:53:19.253296769Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-s8xfx,Uid:8a712d71-7427-4200-94a3-b1d8ea1fe150,Namespace:calico-system,Attempt:8,} returns sandbox id \"63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e\"" Mar 17 17:53:19.277699 kubelet[2450]: E0317 17:53:19.277631 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:20.049831 kernel: bpftool[3802]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 17 17:53:20.279297 kubelet[2450]: E0317 17:53:20.278727 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:20.416552 (udev-worker)[3514]: Network interface NamePolicy= disabled on kernel command line. Mar 17 17:53:20.428815 systemd-networkd[1855]: vxlan.calico: Link UP Mar 17 17:53:20.428830 systemd-networkd[1855]: vxlan.calico: Gained carrier Mar 17 17:53:20.628903 systemd-networkd[1855]: cali74115b111a5: Gained IPv6LL Mar 17 17:53:20.947866 systemd-networkd[1855]: cali4d1338a94f3: Gained IPv6LL Mar 17 17:53:21.279097 kubelet[2450]: E0317 17:53:21.279020 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:22.034977 systemd-networkd[1855]: vxlan.calico: Gained IPv6LL Mar 17 17:53:22.079713 update_engine[1934]: I20250317 17:53:22.078730 1934 update_attempter.cc:509] Updating boot flags... Mar 17 17:53:22.201784 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 39 scanned by (udev-worker) (3514) Mar 17 17:53:22.279902 kubelet[2450]: E0317 17:53:22.279311 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:22.641754 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 39 scanned by (udev-worker) (3891) Mar 17 17:53:23.056739 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 39 scanned by (udev-worker) (3891) Mar 17 17:53:23.202875 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1337632264.mount: Deactivated successfully. 
Mar 17 17:53:23.281200 kubelet[2450]: E0317 17:53:23.280186 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:24.280695 kubelet[2450]: E0317 17:53:24.280622 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:24.596622 ntpd[1927]: Listen normally on 8 vxlan.calico 192.168.110.192:123 Mar 17 17:53:24.597433 ntpd[1927]: 17 Mar 17:53:24 ntpd[1927]: Listen normally on 8 vxlan.calico 192.168.110.192:123 Mar 17 17:53:24.597997 ntpd[1927]: Listen normally on 9 cali74115b111a5 [fe80::ecee:eeff:feee:eeee%3]:123 Mar 17 17:53:24.598419 ntpd[1927]: 17 Mar 17:53:24 ntpd[1927]: Listen normally on 9 cali74115b111a5 [fe80::ecee:eeff:feee:eeee%3]:123 Mar 17 17:53:24.598419 ntpd[1927]: 17 Mar 17:53:24 ntpd[1927]: Listen normally on 10 cali4d1338a94f3 [fe80::ecee:eeff:feee:eeee%4]:123 Mar 17 17:53:24.598419 ntpd[1927]: 17 Mar 17:53:24 ntpd[1927]: Listen normally on 11 vxlan.calico [fe80::64d7:fcff:fefe:f34e%5]:123 Mar 17 17:53:24.598088 ntpd[1927]: Listen normally on 10 cali4d1338a94f3 [fe80::ecee:eeff:feee:eeee%4]:123 Mar 17 17:53:24.598176 ntpd[1927]: Listen normally on 11 vxlan.calico [fe80::64d7:fcff:fefe:f34e%5]:123 Mar 17 17:53:24.809141 containerd[1951]: time="2025-03-17T17:53:24.809055716Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/nginx:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:24.810829 containerd[1951]: time="2025-03-17T17:53:24.810763616Z" level=info msg="stop pulling image ghcr.io/flatcar/nginx:latest: active requests=0, bytes read=69703867" Mar 17 17:53:24.811305 containerd[1951]: time="2025-03-17T17:53:24.811266080Z" level=info msg="ImageCreate event name:\"sha256:f660a383148a8217a75a455efeb8bfd4cbe3afa737712cc0e25f27c03b770dd4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:24.818279 containerd[1951]: time="2025-03-17T17:53:24.818175368Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/nginx@sha256:b927c62cc716b99bce51774b46a63feb63f5414c6f985fb80cacd1933bbd0e06\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:24.820747 containerd[1951]: time="2025-03-17T17:53:24.820149164Z" level=info msg="Pulled image \"ghcr.io/flatcar/nginx:latest\" with image id \"sha256:f660a383148a8217a75a455efeb8bfd4cbe3afa737712cc0e25f27c03b770dd4\", repo tag \"ghcr.io/flatcar/nginx:latest\", repo digest \"ghcr.io/flatcar/nginx@sha256:b927c62cc716b99bce51774b46a63feb63f5414c6f985fb80cacd1933bbd0e06\", size \"69703745\" in 5.659751548s" Mar 17 17:53:24.820747 containerd[1951]: time="2025-03-17T17:53:24.820207544Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\" returns image reference \"sha256:f660a383148a8217a75a455efeb8bfd4cbe3afa737712cc0e25f27c03b770dd4\"" Mar 17 17:53:24.823481 containerd[1951]: time="2025-03-17T17:53:24.823031300Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 17 17:53:24.824388 containerd[1951]: time="2025-03-17T17:53:24.824329508Z" level=info msg="CreateContainer within sandbox \"b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93\" for container &ContainerMetadata{Name:nginx,Attempt:0,}" Mar 17 17:53:24.847627 containerd[1951]: time="2025-03-17T17:53:24.847500140Z" level=info msg="CreateContainer within sandbox \"b947c00a0976d5218f6a1c652db4b95d2752d8646f2f021b42cf72b7d6bfaa93\" for &ContainerMetadata{Name:nginx,Attempt:0,} returns container id 
\"04ccefa4c712f39f6fd830faeef5e56fca8923194e17e5fe093b6a9ac334281e\"" Mar 17 17:53:24.849227 containerd[1951]: time="2025-03-17T17:53:24.849030044Z" level=info msg="StartContainer for \"04ccefa4c712f39f6fd830faeef5e56fca8923194e17e5fe093b6a9ac334281e\"" Mar 17 17:53:24.903005 systemd[1]: Started cri-containerd-04ccefa4c712f39f6fd830faeef5e56fca8923194e17e5fe093b6a9ac334281e.scope - libcontainer container 04ccefa4c712f39f6fd830faeef5e56fca8923194e17e5fe093b6a9ac334281e. Mar 17 17:53:24.946041 containerd[1951]: time="2025-03-17T17:53:24.945937641Z" level=info msg="StartContainer for \"04ccefa4c712f39f6fd830faeef5e56fca8923194e17e5fe093b6a9ac334281e\" returns successfully" Mar 17 17:53:25.281122 kubelet[2450]: E0317 17:53:25.281060 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:25.696915 kubelet[2450]: I0317 17:53:25.696592 2450 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 17:53:25.840395 systemd[1]: run-containerd-runc-k8s.io-127b55df3050e78d16488e70326162c0ae9343391ef2184d737c57b9f0b41fa6-runc.QNY8fR.mount: Deactivated successfully. Mar 17 17:53:25.843776 kubelet[2450]: I0317 17:53:25.843584 2450 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="default/nginx-deployment-7fcdb87857-8qxmr" podStartSLOduration=10.181142521 podStartE2EDuration="15.843563145s" podCreationTimestamp="2025-03-17 17:53:10 +0000 UTC" firstStartedPulling="2025-03-17 17:53:19.15980286 +0000 UTC m=+23.720428979" lastFinishedPulling="2025-03-17 17:53:24.822223472 +0000 UTC m=+29.382849603" observedRunningTime="2025-03-17 17:53:25.809959257 +0000 UTC m=+30.370585412" watchObservedRunningTime="2025-03-17 17:53:25.843563145 +0000 UTC m=+30.404189252" Mar 17 17:53:26.282231 kubelet[2450]: E0317 17:53:26.282143 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:26.340990 containerd[1951]: time="2025-03-17T17:53:26.340616000Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:26.342113 containerd[1951]: time="2025-03-17T17:53:26.342011840Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7473801" Mar 17 17:53:26.343652 containerd[1951]: time="2025-03-17T17:53:26.343175216Z" level=info msg="ImageCreate event name:\"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:26.346746 containerd[1951]: time="2025-03-17T17:53:26.346676540Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:26.348085 containerd[1951]: time="2025-03-17T17:53:26.348045092Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"8843558\" in 1.524961136s" Mar 17 17:53:26.348252 containerd[1951]: time="2025-03-17T17:53:26.348222620Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference 
\"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\"" Mar 17 17:53:26.351534 containerd[1951]: time="2025-03-17T17:53:26.351434384Z" level=info msg="CreateContainer within sandbox \"63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 17 17:53:26.373647 containerd[1951]: time="2025-03-17T17:53:26.373573256Z" level=info msg="CreateContainer within sandbox \"63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"7dcbb78cc9daab5f76c3384a43aab3fb608c79599e8c4da7cf073c13288c0d32\"" Mar 17 17:53:26.376757 containerd[1951]: time="2025-03-17T17:53:26.374756408Z" level=info msg="StartContainer for \"7dcbb78cc9daab5f76c3384a43aab3fb608c79599e8c4da7cf073c13288c0d32\"" Mar 17 17:53:26.432608 systemd[1]: Started cri-containerd-7dcbb78cc9daab5f76c3384a43aab3fb608c79599e8c4da7cf073c13288c0d32.scope - libcontainer container 7dcbb78cc9daab5f76c3384a43aab3fb608c79599e8c4da7cf073c13288c0d32. Mar 17 17:53:26.495803 containerd[1951]: time="2025-03-17T17:53:26.495737937Z" level=info msg="StartContainer for \"7dcbb78cc9daab5f76c3384a43aab3fb608c79599e8c4da7cf073c13288c0d32\" returns successfully" Mar 17 17:53:26.499212 containerd[1951]: time="2025-03-17T17:53:26.499146345Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\"" Mar 17 17:53:27.282937 kubelet[2450]: E0317 17:53:27.282863 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:28.100272 containerd[1951]: time="2025-03-17T17:53:28.100210749Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:28.101773 containerd[1951]: time="2025-03-17T17:53:28.101704533Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13121717" Mar 17 17:53:28.102733 containerd[1951]: time="2025-03-17T17:53:28.102570345Z" level=info msg="ImageCreate event name:\"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:28.106355 containerd[1951]: time="2025-03-17T17:53:28.106293033Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:28.108130 containerd[1951]: time="2025-03-17T17:53:28.107955201Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"14491426\" in 1.608521996s" Mar 17 17:53:28.108130 containerd[1951]: time="2025-03-17T17:53:28.108004101Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\"" Mar 17 17:53:28.112295 containerd[1951]: time="2025-03-17T17:53:28.111958533Z" level=info msg="CreateContainer within sandbox 
\"63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 17 17:53:28.138261 containerd[1951]: time="2025-03-17T17:53:28.138189825Z" level=info msg="CreateContainer within sandbox \"63f28ff608ea95e369750d59048567a143f2984c6f19a509c74550a8b32bd97e\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"618fc5c7fe25ab627e8a87ee97a3e2fb4e4c48ebcd63a30d9f32e28abf4bdb51\"" Mar 17 17:53:28.139387 containerd[1951]: time="2025-03-17T17:53:28.139242201Z" level=info msg="StartContainer for \"618fc5c7fe25ab627e8a87ee97a3e2fb4e4c48ebcd63a30d9f32e28abf4bdb51\"" Mar 17 17:53:28.195006 systemd[1]: Started cri-containerd-618fc5c7fe25ab627e8a87ee97a3e2fb4e4c48ebcd63a30d9f32e28abf4bdb51.scope - libcontainer container 618fc5c7fe25ab627e8a87ee97a3e2fb4e4c48ebcd63a30d9f32e28abf4bdb51. Mar 17 17:53:28.252007 containerd[1951]: time="2025-03-17T17:53:28.251933073Z" level=info msg="StartContainer for \"618fc5c7fe25ab627e8a87ee97a3e2fb4e4c48ebcd63a30d9f32e28abf4bdb51\" returns successfully" Mar 17 17:53:28.283327 kubelet[2450]: E0317 17:53:28.283281 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:28.423258 kubelet[2450]: I0317 17:53:28.423079 2450 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 17 17:53:28.423258 kubelet[2450]: I0317 17:53:28.423167 2450 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 17 17:53:28.839278 kubelet[2450]: I0317 17:53:28.839146 2450 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-s8xfx" podStartSLOduration=22.985014856 podStartE2EDuration="31.839126844s" podCreationTimestamp="2025-03-17 17:52:57 +0000 UTC" firstStartedPulling="2025-03-17 17:53:19.255442621 +0000 UTC m=+23.816068740" lastFinishedPulling="2025-03-17 17:53:28.109554621 +0000 UTC m=+32.670180728" observedRunningTime="2025-03-17 17:53:28.838322292 +0000 UTC m=+33.398948435" watchObservedRunningTime="2025-03-17 17:53:28.839126844 +0000 UTC m=+33.399752975" Mar 17 17:53:29.283793 kubelet[2450]: E0317 17:53:29.283734 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:30.284186 kubelet[2450]: E0317 17:53:30.284126 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:31.009486 systemd[1]: Created slice kubepods-besteffort-podc71d4d2e_d04e_4aa3_9998_753b1d46736c.slice - libcontainer container kubepods-besteffort-podc71d4d2e_d04e_4aa3_9998_753b1d46736c.slice. 
Mar 17 17:53:31.030989 kubelet[2450]: I0317 17:53:31.030839 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxm56\" (UniqueName: \"kubernetes.io/projected/c71d4d2e-d04e-4aa3-9998-753b1d46736c-kube-api-access-wxm56\") pod \"nfs-server-provisioner-0\" (UID: \"c71d4d2e-d04e-4aa3-9998-753b1d46736c\") " pod="default/nfs-server-provisioner-0" Mar 17 17:53:31.030989 kubelet[2450]: I0317 17:53:31.030899 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c71d4d2e-d04e-4aa3-9998-753b1d46736c-data\") pod \"nfs-server-provisioner-0\" (UID: \"c71d4d2e-d04e-4aa3-9998-753b1d46736c\") " pod="default/nfs-server-provisioner-0" Mar 17 17:53:31.284471 kubelet[2450]: E0317 17:53:31.284322 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:31.316333 containerd[1951]: time="2025-03-17T17:53:31.316254865Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nfs-server-provisioner-0,Uid:c71d4d2e-d04e-4aa3-9998-753b1d46736c,Namespace:default,Attempt:0,}" Mar 17 17:53:31.552641 systemd-networkd[1855]: cali60e51b789ff: Link UP Mar 17 17:53:31.554420 systemd-networkd[1855]: cali60e51b789ff: Gained carrier Mar 17 17:53:31.562327 (udev-worker)[4388]: Network interface NamePolicy= disabled on kernel command line. Mar 17 17:53:31.586025 containerd[1951]: 2025-03-17 17:53:31.403 [INFO][4369] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {172.31.27.21-k8s-nfs--server--provisioner--0-eth0 nfs-server-provisioner- default c71d4d2e-d04e-4aa3-9998-753b1d46736c 1286 0 2025-03-17 17:53:30 +0000 UTC map[app:nfs-server-provisioner apps.kubernetes.io/pod-index:0 chart:nfs-server-provisioner-1.8.0 controller-revision-hash:nfs-server-provisioner-d5cbb7f57 heritage:Helm projectcalico.org/namespace:default projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:nfs-server-provisioner release:nfs-server-provisioner statefulset.kubernetes.io/pod-name:nfs-server-provisioner-0] map[] [] [] []} {k8s 172.31.27.21 nfs-server-provisioner-0 eth0 nfs-server-provisioner [] [] [kns.default ksa.default.nfs-server-provisioner] cali60e51b789ff [{nfs TCP 2049 0 } {nfs-udp UDP 2049 0 } {nlockmgr TCP 32803 0 } {nlockmgr-udp UDP 32803 0 } {mountd TCP 20048 0 } {mountd-udp UDP 20048 0 } {rquotad TCP 875 0 } {rquotad-udp UDP 875 0 } {rpcbind TCP 111 0 } {rpcbind-udp UDP 111 0 } {statd TCP 662 0 } {statd-udp UDP 662 0 }] []}} ContainerID="4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="172.31.27.21-k8s-nfs--server--provisioner--0-" Mar 17 17:53:31.586025 containerd[1951]: 2025-03-17 17:53:31.404 [INFO][4369] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="172.31.27.21-k8s-nfs--server--provisioner--0-eth0" Mar 17 17:53:31.586025 containerd[1951]: 2025-03-17 17:53:31.458 [INFO][4380] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319" HandleID="k8s-pod-network.4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319" Workload="172.31.27.21-k8s-nfs--server--provisioner--0-eth0" Mar 17 
17:53:31.586025 containerd[1951]: 2025-03-17 17:53:31.481 [INFO][4380] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319" HandleID="k8s-pod-network.4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319" Workload="172.31.27.21-k8s-nfs--server--provisioner--0-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003195c0), Attrs:map[string]string{"namespace":"default", "node":"172.31.27.21", "pod":"nfs-server-provisioner-0", "timestamp":"2025-03-17 17:53:31.458083321 +0000 UTC"}, Hostname:"172.31.27.21", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:53:31.586025 containerd[1951]: 2025-03-17 17:53:31.481 [INFO][4380] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:53:31.586025 containerd[1951]: 2025-03-17 17:53:31.481 [INFO][4380] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 17:53:31.586025 containerd[1951]: 2025-03-17 17:53:31.481 [INFO][4380] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '172.31.27.21' Mar 17 17:53:31.586025 containerd[1951]: 2025-03-17 17:53:31.485 [INFO][4380] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319" host="172.31.27.21" Mar 17 17:53:31.586025 containerd[1951]: 2025-03-17 17:53:31.490 [INFO][4380] ipam/ipam.go 372: Looking up existing affinities for host host="172.31.27.21" Mar 17 17:53:31.586025 containerd[1951]: 2025-03-17 17:53:31.497 [INFO][4380] ipam/ipam.go 489: Trying affinity for 192.168.110.192/26 host="172.31.27.21" Mar 17 17:53:31.586025 containerd[1951]: 2025-03-17 17:53:31.501 [INFO][4380] ipam/ipam.go 155: Attempting to load block cidr=192.168.110.192/26 host="172.31.27.21" Mar 17 17:53:31.586025 containerd[1951]: 2025-03-17 17:53:31.508 [INFO][4380] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.110.192/26 host="172.31.27.21" Mar 17 17:53:31.586025 containerd[1951]: 2025-03-17 17:53:31.508 [INFO][4380] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.110.192/26 handle="k8s-pod-network.4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319" host="172.31.27.21" Mar 17 17:53:31.586025 containerd[1951]: 2025-03-17 17:53:31.511 [INFO][4380] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319 Mar 17 17:53:31.586025 containerd[1951]: 2025-03-17 17:53:31.522 [INFO][4380] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.110.192/26 handle="k8s-pod-network.4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319" host="172.31.27.21" Mar 17 17:53:31.586025 containerd[1951]: 2025-03-17 17:53:31.543 [INFO][4380] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.110.195/26] block=192.168.110.192/26 handle="k8s-pod-network.4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319" host="172.31.27.21" Mar 17 17:53:31.586025 containerd[1951]: 2025-03-17 17:53:31.543 [INFO][4380] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.110.195/26] handle="k8s-pod-network.4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319" host="172.31.27.21" Mar 17 17:53:31.586025 containerd[1951]: 2025-03-17 17:53:31.543 [INFO][4380] ipam/ipam_plugin.go 374: Released host-wide 
IPAM lock. Mar 17 17:53:31.586025 containerd[1951]: 2025-03-17 17:53:31.543 [INFO][4380] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.110.195/26] IPv6=[] ContainerID="4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319" HandleID="k8s-pod-network.4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319" Workload="172.31.27.21-k8s-nfs--server--provisioner--0-eth0" Mar 17 17:53:31.587174 containerd[1951]: 2025-03-17 17:53:31.545 [INFO][4369] cni-plugin/k8s.go 386: Populated endpoint ContainerID="4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="172.31.27.21-k8s-nfs--server--provisioner--0-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"172.31.27.21-k8s-nfs--server--provisioner--0-eth0", GenerateName:"nfs-server-provisioner-", Namespace:"default", SelfLink:"", UID:"c71d4d2e-d04e-4aa3-9998-753b1d46736c", ResourceVersion:"1286", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 53, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nfs-server-provisioner", "apps.kubernetes.io/pod-index":"0", "chart":"nfs-server-provisioner-1.8.0", "controller-revision-hash":"nfs-server-provisioner-d5cbb7f57", "heritage":"Helm", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"nfs-server-provisioner", "release":"nfs-server-provisioner", "statefulset.kubernetes.io/pod-name":"nfs-server-provisioner-0"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"172.31.27.21", ContainerID:"", Pod:"nfs-server-provisioner-0", Endpoint:"eth0", ServiceAccountName:"nfs-server-provisioner", IPNetworks:[]string{"192.168.110.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.nfs-server-provisioner"}, InterfaceName:"cali60e51b789ff", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"nfs", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nfs-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x6f, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x296, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x296, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:53:31.587174 containerd[1951]: 2025-03-17 17:53:31.546 [INFO][4369] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.110.195/32] ContainerID="4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="172.31.27.21-k8s-nfs--server--provisioner--0-eth0" Mar 17 17:53:31.587174 containerd[1951]: 2025-03-17 17:53:31.546 [INFO][4369] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali60e51b789ff ContainerID="4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="172.31.27.21-k8s-nfs--server--provisioner--0-eth0" Mar 17 17:53:31.587174 containerd[1951]: 2025-03-17 17:53:31.552 [INFO][4369] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="172.31.27.21-k8s-nfs--server--provisioner--0-eth0" Mar 17 17:53:31.587535 containerd[1951]: 2025-03-17 17:53:31.555 [INFO][4369] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="172.31.27.21-k8s-nfs--server--provisioner--0-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"172.31.27.21-k8s-nfs--server--provisioner--0-eth0", GenerateName:"nfs-server-provisioner-", Namespace:"default", SelfLink:"", UID:"c71d4d2e-d04e-4aa3-9998-753b1d46736c", ResourceVersion:"1286", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 53, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nfs-server-provisioner", "apps.kubernetes.io/pod-index":"0", "chart":"nfs-server-provisioner-1.8.0", "controller-revision-hash":"nfs-server-provisioner-d5cbb7f57", "heritage":"Helm", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"nfs-server-provisioner", "release":"nfs-server-provisioner", "statefulset.kubernetes.io/pod-name":"nfs-server-provisioner-0"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"172.31.27.21", ContainerID:"4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319", Pod:"nfs-server-provisioner-0", Endpoint:"eth0", ServiceAccountName:"nfs-server-provisioner", IPNetworks:[]string{"192.168.110.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.nfs-server-provisioner"}, InterfaceName:"cali60e51b789ff", MAC:"7a:7b:8f:a7:3d:f7", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"nfs", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nfs-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x296, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x296, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:53:31.587535 containerd[1951]: 2025-03-17 17:53:31.583 [INFO][4369] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="172.31.27.21-k8s-nfs--server--provisioner--0-eth0" Mar 17 17:53:31.633006 containerd[1951]: time="2025-03-17T17:53:31.632149586Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:53:31.633006 containerd[1951]: time="2025-03-17T17:53:31.632234486Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:53:31.633006 containerd[1951]: time="2025-03-17T17:53:31.632258990Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:53:31.633006 containerd[1951]: time="2025-03-17T17:53:31.632382050Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:53:31.674045 systemd[1]: Started cri-containerd-4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319.scope - libcontainer container 4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319. 
Mar 17 17:53:31.740744 containerd[1951]: time="2025-03-17T17:53:31.740656143Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nfs-server-provisioner-0,Uid:c71d4d2e-d04e-4aa3-9998-753b1d46736c,Namespace:default,Attempt:0,} returns sandbox id \"4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319\"" Mar 17 17:53:31.743739 containerd[1951]: time="2025-03-17T17:53:31.743075763Z" level=info msg="PullImage \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\"" Mar 17 17:53:32.285416 kubelet[2450]: E0317 17:53:32.285333 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:32.979779 systemd-networkd[1855]: cali60e51b789ff: Gained IPv6LL Mar 17 17:53:33.285674 kubelet[2450]: E0317 17:53:33.285522 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:34.286660 kubelet[2450]: E0317 17:53:34.286609 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:34.349211 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2115593491.mount: Deactivated successfully. Mar 17 17:53:35.287234 kubelet[2450]: E0317 17:53:35.287050 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:35.596679 ntpd[1927]: Listen normally on 12 cali60e51b789ff [fe80::ecee:eeff:feee:eeee%8]:123 Mar 17 17:53:35.598793 ntpd[1927]: 17 Mar 17:53:35 ntpd[1927]: Listen normally on 12 cali60e51b789ff [fe80::ecee:eeff:feee:eeee%8]:123 Mar 17 17:53:36.288094 kubelet[2450]: E0317 17:53:36.288018 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:37.204599 containerd[1951]: time="2025-03-17T17:53:37.204542178Z" level=info msg="ImageCreate event name:\"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:37.207241 containerd[1951]: time="2025-03-17T17:53:37.207167598Z" level=info msg="stop pulling image registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8: active requests=0, bytes read=87373623" Mar 17 17:53:37.209398 containerd[1951]: time="2025-03-17T17:53:37.209320134Z" level=info msg="ImageCreate event name:\"sha256:5a42a519e0a8cf95c3c5f18f767c58c8c8b072aaea0a26e5e47a6f206c7df685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:37.215194 containerd[1951]: time="2025-03-17T17:53:37.215113218Z" level=info msg="ImageCreate event name:\"registry.k8s.io/sig-storage/nfs-provisioner@sha256:c825f3d5e28bde099bd7a3daace28772d412c9157ad47fa752a9ad0baafc118d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:53:37.217277 containerd[1951]: time="2025-03-17T17:53:37.217067070Z" level=info msg="Pulled image \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\" with image id \"sha256:5a42a519e0a8cf95c3c5f18f767c58c8c8b072aaea0a26e5e47a6f206c7df685\", repo tag \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\", repo digest \"registry.k8s.io/sig-storage/nfs-provisioner@sha256:c825f3d5e28bde099bd7a3daace28772d412c9157ad47fa752a9ad0baafc118d\", size \"87371201\" in 5.473932735s" Mar 17 17:53:37.217277 containerd[1951]: time="2025-03-17T17:53:37.217127694Z" level=info msg="PullImage \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\" returns image reference 
\"sha256:5a42a519e0a8cf95c3c5f18f767c58c8c8b072aaea0a26e5e47a6f206c7df685\"" Mar 17 17:53:37.222014 containerd[1951]: time="2025-03-17T17:53:37.221958306Z" level=info msg="CreateContainer within sandbox \"4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319\" for container &ContainerMetadata{Name:nfs-server-provisioner,Attempt:0,}" Mar 17 17:53:37.251396 containerd[1951]: time="2025-03-17T17:53:37.251318502Z" level=info msg="CreateContainer within sandbox \"4ed3a8c9c17a22d6ceb1151145c573ee726ce158396df8c5e190488fd46da319\" for &ContainerMetadata{Name:nfs-server-provisioner,Attempt:0,} returns container id \"71df1f10400947c616e75a0d9e4e40b8c3fdda872751a13b674462a341cf7985\"" Mar 17 17:53:37.252474 containerd[1951]: time="2025-03-17T17:53:37.252197058Z" level=info msg="StartContainer for \"71df1f10400947c616e75a0d9e4e40b8c3fdda872751a13b674462a341cf7985\"" Mar 17 17:53:37.260630 kubelet[2450]: E0317 17:53:37.260572 2450 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:37.291045 kubelet[2450]: E0317 17:53:37.290893 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:37.306013 systemd[1]: Started cri-containerd-71df1f10400947c616e75a0d9e4e40b8c3fdda872751a13b674462a341cf7985.scope - libcontainer container 71df1f10400947c616e75a0d9e4e40b8c3fdda872751a13b674462a341cf7985. Mar 17 17:53:37.352009 containerd[1951]: time="2025-03-17T17:53:37.351925435Z" level=info msg="StartContainer for \"71df1f10400947c616e75a0d9e4e40b8c3fdda872751a13b674462a341cf7985\" returns successfully" Mar 17 17:53:37.882306 kubelet[2450]: I0317 17:53:37.882222 2450 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="default/nfs-server-provisioner-0" podStartSLOduration=2.405851066 podStartE2EDuration="7.882203781s" podCreationTimestamp="2025-03-17 17:53:30 +0000 UTC" firstStartedPulling="2025-03-17 17:53:31.742615191 +0000 UTC m=+36.303241310" lastFinishedPulling="2025-03-17 17:53:37.218967906 +0000 UTC m=+41.779594025" observedRunningTime="2025-03-17 17:53:37.879653109 +0000 UTC m=+42.440279240" watchObservedRunningTime="2025-03-17 17:53:37.882203781 +0000 UTC m=+42.442829900" Mar 17 17:53:38.291513 kubelet[2450]: E0317 17:53:38.291447 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:39.291720 kubelet[2450]: E0317 17:53:39.291587 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:40.292185 kubelet[2450]: E0317 17:53:40.292125 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:41.292703 kubelet[2450]: E0317 17:53:41.292634 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:42.293832 kubelet[2450]: E0317 17:53:42.293743 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:43.294147 kubelet[2450]: E0317 17:53:43.294090 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:44.294804 kubelet[2450]: E0317 17:53:44.294730 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 
17:53:45.295717 kubelet[2450]: E0317 17:53:45.295611 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:46.296296 kubelet[2450]: E0317 17:53:46.296236 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:47.296573 kubelet[2450]: E0317 17:53:47.296514 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:48.297099 kubelet[2450]: E0317 17:53:48.297041 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:49.297877 kubelet[2450]: E0317 17:53:49.297812 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:50.298789 kubelet[2450]: E0317 17:53:50.298730 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:51.299816 kubelet[2450]: E0317 17:53:51.299758 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:52.300359 kubelet[2450]: E0317 17:53:52.300294 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:53.300737 kubelet[2450]: E0317 17:53:53.300647 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:54.301301 kubelet[2450]: E0317 17:53:54.301239 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:55.302170 kubelet[2450]: E0317 17:53:55.302105 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:56.302619 kubelet[2450]: E0317 17:53:56.302557 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:57.260437 kubelet[2450]: E0317 17:53:57.260388 2450 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:57.294386 containerd[1951]: time="2025-03-17T17:53:57.294128882Z" level=info msg="StopPodSandbox for \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\"" Mar 17 17:53:57.294386 containerd[1951]: time="2025-03-17T17:53:57.294303950Z" level=info msg="TearDown network for sandbox \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\" successfully" Mar 17 17:53:57.294386 containerd[1951]: time="2025-03-17T17:53:57.294326150Z" level=info msg="StopPodSandbox for \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\" returns successfully" Mar 17 17:53:57.296071 containerd[1951]: time="2025-03-17T17:53:57.295917362Z" level=info msg="RemovePodSandbox for \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\"" Mar 17 17:53:57.296071 containerd[1951]: time="2025-03-17T17:53:57.295973210Z" level=info msg="Forcibly stopping sandbox \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\"" Mar 17 17:53:57.296360 containerd[1951]: time="2025-03-17T17:53:57.296106182Z" level=info msg="TearDown network for sandbox \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\" successfully" Mar 17 17:53:57.301957 
containerd[1951]: time="2025-03-17T17:53:57.301882910Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:53:57.302194 containerd[1951]: time="2025-03-17T17:53:57.301974974Z" level=info msg="RemovePodSandbox \"0107d7185ff899d7dddd4c2bf20eeba07c488da8d41948d7da968b4296274dae\" returns successfully" Mar 17 17:53:57.302888 kubelet[2450]: E0317 17:53:57.302729 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:57.305807 containerd[1951]: time="2025-03-17T17:53:57.303023786Z" level=info msg="StopPodSandbox for \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\"" Mar 17 17:53:57.305807 containerd[1951]: time="2025-03-17T17:53:57.303187502Z" level=info msg="TearDown network for sandbox \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\" successfully" Mar 17 17:53:57.305807 containerd[1951]: time="2025-03-17T17:53:57.303208706Z" level=info msg="StopPodSandbox for \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\" returns successfully" Mar 17 17:53:57.305807 containerd[1951]: time="2025-03-17T17:53:57.304186442Z" level=info msg="RemovePodSandbox for \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\"" Mar 17 17:53:57.305807 containerd[1951]: time="2025-03-17T17:53:57.304242986Z" level=info msg="Forcibly stopping sandbox \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\"" Mar 17 17:53:57.305807 containerd[1951]: time="2025-03-17T17:53:57.304378178Z" level=info msg="TearDown network for sandbox \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\" successfully" Mar 17 17:53:57.310601 containerd[1951]: time="2025-03-17T17:53:57.310539182Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:53:57.310857 containerd[1951]: time="2025-03-17T17:53:57.310825034Z" level=info msg="RemovePodSandbox \"669487ddae0d964596a50ff9632bdc271c46a7d2946fd899940112e84e076ed1\" returns successfully" Mar 17 17:53:57.311617 containerd[1951]: time="2025-03-17T17:53:57.311546834Z" level=info msg="StopPodSandbox for \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\"" Mar 17 17:53:57.311785 containerd[1951]: time="2025-03-17T17:53:57.311756726Z" level=info msg="TearDown network for sandbox \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\" successfully" Mar 17 17:53:57.311850 containerd[1951]: time="2025-03-17T17:53:57.311781170Z" level=info msg="StopPodSandbox for \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\" returns successfully" Mar 17 17:53:57.312568 containerd[1951]: time="2025-03-17T17:53:57.312513422Z" level=info msg="RemovePodSandbox for \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\"" Mar 17 17:53:57.312568 containerd[1951]: time="2025-03-17T17:53:57.312563534Z" level=info msg="Forcibly stopping sandbox \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\"" Mar 17 17:53:57.312759 containerd[1951]: time="2025-03-17T17:53:57.312721850Z" level=info msg="TearDown network for sandbox \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\" successfully" Mar 17 17:53:57.318344 containerd[1951]: time="2025-03-17T17:53:57.318262358Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:53:57.318344 containerd[1951]: time="2025-03-17T17:53:57.318339350Z" level=info msg="RemovePodSandbox \"c1fbf18734908c830c8f602a2056f1883eb5f7dbc6c2e1b54bdf751213bb49f9\" returns successfully" Mar 17 17:53:57.319410 containerd[1951]: time="2025-03-17T17:53:57.319126970Z" level=info msg="StopPodSandbox for \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\"" Mar 17 17:53:57.319410 containerd[1951]: time="2025-03-17T17:53:57.319287050Z" level=info msg="TearDown network for sandbox \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\" successfully" Mar 17 17:53:57.319410 containerd[1951]: time="2025-03-17T17:53:57.319308938Z" level=info msg="StopPodSandbox for \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\" returns successfully" Mar 17 17:53:57.320144 containerd[1951]: time="2025-03-17T17:53:57.319986998Z" level=info msg="RemovePodSandbox for \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\"" Mar 17 17:53:57.320144 containerd[1951]: time="2025-03-17T17:53:57.320029154Z" level=info msg="Forcibly stopping sandbox \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\"" Mar 17 17:53:57.320475 containerd[1951]: time="2025-03-17T17:53:57.320357630Z" level=info msg="TearDown network for sandbox \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\" successfully" Mar 17 17:53:57.327068 containerd[1951]: time="2025-03-17T17:53:57.326832470Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:53:57.327068 containerd[1951]: time="2025-03-17T17:53:57.326915222Z" level=info msg="RemovePodSandbox \"7267bf8ef43b44fef54015022ef3fa56bebd9c99d2224a5bae3c41da81ab46c9\" returns successfully" Mar 17 17:53:57.327919 containerd[1951]: time="2025-03-17T17:53:57.327527738Z" level=info msg="StopPodSandbox for \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\"" Mar 17 17:53:57.327919 containerd[1951]: time="2025-03-17T17:53:57.327783494Z" level=info msg="TearDown network for sandbox \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\" successfully" Mar 17 17:53:57.327919 containerd[1951]: time="2025-03-17T17:53:57.327806906Z" level=info msg="StopPodSandbox for \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\" returns successfully" Mar 17 17:53:57.328978 containerd[1951]: time="2025-03-17T17:53:57.328938842Z" level=info msg="RemovePodSandbox for \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\"" Mar 17 17:53:57.329599 containerd[1951]: time="2025-03-17T17:53:57.329216318Z" level=info msg="Forcibly stopping sandbox \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\"" Mar 17 17:53:57.329599 containerd[1951]: time="2025-03-17T17:53:57.329358050Z" level=info msg="TearDown network for sandbox \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\" successfully" Mar 17 17:53:57.335142 containerd[1951]: time="2025-03-17T17:53:57.335036546Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:53:57.335142 containerd[1951]: time="2025-03-17T17:53:57.335111546Z" level=info msg="RemovePodSandbox \"062de916487b1617b43fdef7c0c9e5193faccf8085290f6f5f9fea62040ffd83\" returns successfully" Mar 17 17:53:57.336197 containerd[1951]: time="2025-03-17T17:53:57.335919422Z" level=info msg="StopPodSandbox for \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\"" Mar 17 17:53:57.336197 containerd[1951]: time="2025-03-17T17:53:57.336075914Z" level=info msg="TearDown network for sandbox \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\" successfully" Mar 17 17:53:57.336197 containerd[1951]: time="2025-03-17T17:53:57.336096590Z" level=info msg="StopPodSandbox for \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\" returns successfully" Mar 17 17:53:57.337499 containerd[1951]: time="2025-03-17T17:53:57.336626798Z" level=info msg="RemovePodSandbox for \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\"" Mar 17 17:53:57.337499 containerd[1951]: time="2025-03-17T17:53:57.336671534Z" level=info msg="Forcibly stopping sandbox \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\"" Mar 17 17:53:57.337499 containerd[1951]: time="2025-03-17T17:53:57.336817454Z" level=info msg="TearDown network for sandbox \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\" successfully" Mar 17 17:53:57.342164 containerd[1951]: time="2025-03-17T17:53:57.342097070Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:53:57.342301 containerd[1951]: time="2025-03-17T17:53:57.342182306Z" level=info msg="RemovePodSandbox \"685b784bf4cdc3beb08f2035e1fe25db22d8dcd8aac87fffc66aae0de10a73fc\" returns successfully" Mar 17 17:53:57.343463 containerd[1951]: time="2025-03-17T17:53:57.343220402Z" level=info msg="StopPodSandbox for \"52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc\"" Mar 17 17:53:57.343463 containerd[1951]: time="2025-03-17T17:53:57.343376990Z" level=info msg="TearDown network for sandbox \"52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc\" successfully" Mar 17 17:53:57.343463 containerd[1951]: time="2025-03-17T17:53:57.343397726Z" level=info msg="StopPodSandbox for \"52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc\" returns successfully" Mar 17 17:53:57.345708 containerd[1951]: time="2025-03-17T17:53:57.344098718Z" level=info msg="RemovePodSandbox for \"52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc\"" Mar 17 17:53:57.345708 containerd[1951]: time="2025-03-17T17:53:57.344143874Z" level=info msg="Forcibly stopping sandbox \"52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc\"" Mar 17 17:53:57.345708 containerd[1951]: time="2025-03-17T17:53:57.344268962Z" level=info msg="TearDown network for sandbox \"52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc\" successfully" Mar 17 17:53:57.349680 containerd[1951]: time="2025-03-17T17:53:57.349577606Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:53:57.349680 containerd[1951]: time="2025-03-17T17:53:57.349656830Z" level=info msg="RemovePodSandbox \"52babd41ef11686417ff38d0328b2d3aa0b8bec093f6370740fad748062a29cc\" returns successfully" Mar 17 17:53:57.350520 containerd[1951]: time="2025-03-17T17:53:57.350441666Z" level=info msg="StopPodSandbox for \"623f0076b5e2122afe8d5c2cfaa6acbf5dc1f9cd35b82857be8bd1f89eeede76\"" Mar 17 17:53:57.351017 containerd[1951]: time="2025-03-17T17:53:57.350870114Z" level=info msg="TearDown network for sandbox \"623f0076b5e2122afe8d5c2cfaa6acbf5dc1f9cd35b82857be8bd1f89eeede76\" successfully" Mar 17 17:53:57.351017 containerd[1951]: time="2025-03-17T17:53:57.350900942Z" level=info msg="StopPodSandbox for \"623f0076b5e2122afe8d5c2cfaa6acbf5dc1f9cd35b82857be8bd1f89eeede76\" returns successfully" Mar 17 17:53:57.351880 containerd[1951]: time="2025-03-17T17:53:57.351374270Z" level=info msg="RemovePodSandbox for \"623f0076b5e2122afe8d5c2cfaa6acbf5dc1f9cd35b82857be8bd1f89eeede76\"" Mar 17 17:53:57.351880 containerd[1951]: time="2025-03-17T17:53:57.351425690Z" level=info msg="Forcibly stopping sandbox \"623f0076b5e2122afe8d5c2cfaa6acbf5dc1f9cd35b82857be8bd1f89eeede76\"" Mar 17 17:53:57.351880 containerd[1951]: time="2025-03-17T17:53:57.351547526Z" level=info msg="TearDown network for sandbox \"623f0076b5e2122afe8d5c2cfaa6acbf5dc1f9cd35b82857be8bd1f89eeede76\" successfully" Mar 17 17:53:57.357127 containerd[1951]: time="2025-03-17T17:53:57.357059066Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"623f0076b5e2122afe8d5c2cfaa6acbf5dc1f9cd35b82857be8bd1f89eeede76\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:53:57.357260 containerd[1951]: time="2025-03-17T17:53:57.357140810Z" level=info msg="RemovePodSandbox \"623f0076b5e2122afe8d5c2cfaa6acbf5dc1f9cd35b82857be8bd1f89eeede76\" returns successfully" Mar 17 17:53:57.358220 containerd[1951]: time="2025-03-17T17:53:57.357912062Z" level=info msg="StopPodSandbox for \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\"" Mar 17 17:53:57.358220 containerd[1951]: time="2025-03-17T17:53:57.358110038Z" level=info msg="TearDown network for sandbox \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\" successfully" Mar 17 17:53:57.358220 containerd[1951]: time="2025-03-17T17:53:57.358133258Z" level=info msg="StopPodSandbox for \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\" returns successfully" Mar 17 17:53:57.359053 containerd[1951]: time="2025-03-17T17:53:57.358955594Z" level=info msg="RemovePodSandbox for \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\"" Mar 17 17:53:57.359053 containerd[1951]: time="2025-03-17T17:53:57.359013110Z" level=info msg="Forcibly stopping sandbox \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\"" Mar 17 17:53:57.359202 containerd[1951]: time="2025-03-17T17:53:57.359141426Z" level=info msg="TearDown network for sandbox \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\" successfully" Mar 17 17:53:57.364603 containerd[1951]: time="2025-03-17T17:53:57.364539974Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:53:57.364768 containerd[1951]: time="2025-03-17T17:53:57.364623278Z" level=info msg="RemovePodSandbox \"e53d6555999288730a7d3672822eae75726b9909784c91c4a093ee27c3d61c9d\" returns successfully" Mar 17 17:53:57.365735 containerd[1951]: time="2025-03-17T17:53:57.365624270Z" level=info msg="StopPodSandbox for \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\"" Mar 17 17:53:57.365884 containerd[1951]: time="2025-03-17T17:53:57.365824862Z" level=info msg="TearDown network for sandbox \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\" successfully" Mar 17 17:53:57.365884 containerd[1951]: time="2025-03-17T17:53:57.365848538Z" level=info msg="StopPodSandbox for \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\" returns successfully" Mar 17 17:53:57.366631 containerd[1951]: time="2025-03-17T17:53:57.366589778Z" level=info msg="RemovePodSandbox for \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\"" Mar 17 17:53:57.366756 containerd[1951]: time="2025-03-17T17:53:57.366635942Z" level=info msg="Forcibly stopping sandbox \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\"" Mar 17 17:53:57.366891 containerd[1951]: time="2025-03-17T17:53:57.366856190Z" level=info msg="TearDown network for sandbox \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\" successfully" Mar 17 17:53:57.372534 containerd[1951]: time="2025-03-17T17:53:57.372468182Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:53:57.372809 containerd[1951]: time="2025-03-17T17:53:57.372554870Z" level=info msg="RemovePodSandbox \"93b6af2c61a2769fbf0385fd635ce332cf4dcffe788d1b9ade214f7af0362106\" returns successfully" Mar 17 17:53:57.373466 containerd[1951]: time="2025-03-17T17:53:57.373408418Z" level=info msg="StopPodSandbox for \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\"" Mar 17 17:53:57.373606 containerd[1951]: time="2025-03-17T17:53:57.373585142Z" level=info msg="TearDown network for sandbox \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\" successfully" Mar 17 17:53:57.373736 containerd[1951]: time="2025-03-17T17:53:57.373608614Z" level=info msg="StopPodSandbox for \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\" returns successfully" Mar 17 17:53:57.374535 containerd[1951]: time="2025-03-17T17:53:57.374494178Z" level=info msg="RemovePodSandbox for \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\"" Mar 17 17:53:57.374641 containerd[1951]: time="2025-03-17T17:53:57.374546342Z" level=info msg="Forcibly stopping sandbox \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\"" Mar 17 17:53:57.374734 containerd[1951]: time="2025-03-17T17:53:57.374671838Z" level=info msg="TearDown network for sandbox \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\" successfully" Mar 17 17:53:57.380174 containerd[1951]: time="2025-03-17T17:53:57.380078678Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:53:57.380302 containerd[1951]: time="2025-03-17T17:53:57.380227454Z" level=info msg="RemovePodSandbox \"97c508b9899f4dc2cfe9343bf64f263e0372736abc5459bd7a013b37140c1550\" returns successfully" Mar 17 17:53:57.380975 containerd[1951]: time="2025-03-17T17:53:57.380918870Z" level=info msg="StopPodSandbox for \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\"" Mar 17 17:53:57.381124 containerd[1951]: time="2025-03-17T17:53:57.381090290Z" level=info msg="TearDown network for sandbox \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\" successfully" Mar 17 17:53:57.381184 containerd[1951]: time="2025-03-17T17:53:57.381122774Z" level=info msg="StopPodSandbox for \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\" returns successfully" Mar 17 17:53:57.382227 containerd[1951]: time="2025-03-17T17:53:57.382169246Z" level=info msg="RemovePodSandbox for \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\"" Mar 17 17:53:57.382227 containerd[1951]: time="2025-03-17T17:53:57.382220378Z" level=info msg="Forcibly stopping sandbox \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\"" Mar 17 17:53:57.382398 containerd[1951]: time="2025-03-17T17:53:57.382360562Z" level=info msg="TearDown network for sandbox \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\" successfully" Mar 17 17:53:57.388143 containerd[1951]: time="2025-03-17T17:53:57.388044998Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:53:57.388274 containerd[1951]: time="2025-03-17T17:53:57.388219502Z" level=info msg="RemovePodSandbox \"ca696f05592b349efc20bc8a367c5bc9112283154be77d4145621002ee01658c\" returns successfully" Mar 17 17:53:57.388927 containerd[1951]: time="2025-03-17T17:53:57.388860434Z" level=info msg="StopPodSandbox for \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\"" Mar 17 17:53:57.389310 containerd[1951]: time="2025-03-17T17:53:57.389037110Z" level=info msg="TearDown network for sandbox \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\" successfully" Mar 17 17:53:57.389310 containerd[1951]: time="2025-03-17T17:53:57.389070338Z" level=info msg="StopPodSandbox for \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\" returns successfully" Mar 17 17:53:57.389498 containerd[1951]: time="2025-03-17T17:53:57.389470634Z" level=info msg="RemovePodSandbox for \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\"" Mar 17 17:53:57.389552 containerd[1951]: time="2025-03-17T17:53:57.389509994Z" level=info msg="Forcibly stopping sandbox \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\"" Mar 17 17:53:57.389915 containerd[1951]: time="2025-03-17T17:53:57.389631122Z" level=info msg="TearDown network for sandbox \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\" successfully" Mar 17 17:53:57.395236 containerd[1951]: time="2025-03-17T17:53:57.395168666Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:53:57.395347 containerd[1951]: time="2025-03-17T17:53:57.395251142Z" level=info msg="RemovePodSandbox \"be55cd18728d4373a8f84072f78f39b81c9dfb50cab22b7ed61534e619fa2019\" returns successfully" Mar 17 17:53:57.396252 containerd[1951]: time="2025-03-17T17:53:57.396000134Z" level=info msg="StopPodSandbox for \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\"" Mar 17 17:53:57.396252 containerd[1951]: time="2025-03-17T17:53:57.396160406Z" level=info msg="TearDown network for sandbox \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\" successfully" Mar 17 17:53:57.396252 containerd[1951]: time="2025-03-17T17:53:57.396184238Z" level=info msg="StopPodSandbox for \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\" returns successfully" Mar 17 17:53:57.397076 containerd[1951]: time="2025-03-17T17:53:57.396776558Z" level=info msg="RemovePodSandbox for \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\"" Mar 17 17:53:57.397076 containerd[1951]: time="2025-03-17T17:53:57.396824618Z" level=info msg="Forcibly stopping sandbox \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\"" Mar 17 17:53:57.397076 containerd[1951]: time="2025-03-17T17:53:57.396952862Z" level=info msg="TearDown network for sandbox \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\" successfully" Mar 17 17:53:57.402302 containerd[1951]: time="2025-03-17T17:53:57.402233690Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:53:57.402418 containerd[1951]: time="2025-03-17T17:53:57.402316526Z" level=info msg="RemovePodSandbox \"bd01c1e9feb26f2c200304ffe9488bb6588807928c98a894f742f9ac05dbecc7\" returns successfully" Mar 17 17:53:57.403303 containerd[1951]: time="2025-03-17T17:53:57.403060814Z" level=info msg="StopPodSandbox for \"acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a\"" Mar 17 17:53:57.403303 containerd[1951]: time="2025-03-17T17:53:57.403214294Z" level=info msg="TearDown network for sandbox \"acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a\" successfully" Mar 17 17:53:57.403303 containerd[1951]: time="2025-03-17T17:53:57.403235774Z" level=info msg="StopPodSandbox for \"acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a\" returns successfully" Mar 17 17:53:57.403875 containerd[1951]: time="2025-03-17T17:53:57.403803902Z" level=info msg="RemovePodSandbox for \"acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a\"" Mar 17 17:53:57.404009 containerd[1951]: time="2025-03-17T17:53:57.403853894Z" level=info msg="Forcibly stopping sandbox \"acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a\"" Mar 17 17:53:57.404154 containerd[1951]: time="2025-03-17T17:53:57.404077526Z" level=info msg="TearDown network for sandbox \"acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a\" successfully" Mar 17 17:53:57.409603 containerd[1951]: time="2025-03-17T17:53:57.409529882Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:53:57.409737 containerd[1951]: time="2025-03-17T17:53:57.409618946Z" level=info msg="RemovePodSandbox \"acf9689ae0eb6b7eabdfcf4cbd2939a64abef9fe9280e5fb79caf2bba47dc61a\" returns successfully" Mar 17 17:53:57.410757 containerd[1951]: time="2025-03-17T17:53:57.410663630Z" level=info msg="StopPodSandbox for \"01c233f36e5ff06d3816cd863149db2f8194478a7ff9351d58491297df0fc1ee\"" Mar 17 17:53:57.410915 containerd[1951]: time="2025-03-17T17:53:57.410869658Z" level=info msg="TearDown network for sandbox \"01c233f36e5ff06d3816cd863149db2f8194478a7ff9351d58491297df0fc1ee\" successfully" Mar 17 17:53:57.410915 containerd[1951]: time="2025-03-17T17:53:57.410905202Z" level=info msg="StopPodSandbox for \"01c233f36e5ff06d3816cd863149db2f8194478a7ff9351d58491297df0fc1ee\" returns successfully" Mar 17 17:53:57.412056 containerd[1951]: time="2025-03-17T17:53:57.412008626Z" level=info msg="RemovePodSandbox for \"01c233f36e5ff06d3816cd863149db2f8194478a7ff9351d58491297df0fc1ee\"" Mar 17 17:53:57.412194 containerd[1951]: time="2025-03-17T17:53:57.412058726Z" level=info msg="Forcibly stopping sandbox \"01c233f36e5ff06d3816cd863149db2f8194478a7ff9351d58491297df0fc1ee\"" Mar 17 17:53:57.412255 containerd[1951]: time="2025-03-17T17:53:57.412190990Z" level=info msg="TearDown network for sandbox \"01c233f36e5ff06d3816cd863149db2f8194478a7ff9351d58491297df0fc1ee\" successfully" Mar 17 17:53:57.420574 containerd[1951]: time="2025-03-17T17:53:57.420433466Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"01c233f36e5ff06d3816cd863149db2f8194478a7ff9351d58491297df0fc1ee\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:53:57.420574 containerd[1951]: time="2025-03-17T17:53:57.420527066Z" level=info msg="RemovePodSandbox \"01c233f36e5ff06d3816cd863149db2f8194478a7ff9351d58491297df0fc1ee\" returns successfully" Mar 17 17:53:58.303929 kubelet[2450]: E0317 17:53:58.303860 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:53:59.304940 kubelet[2450]: E0317 17:53:59.304881 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:00.305527 kubelet[2450]: E0317 17:54:00.305462 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:01.305738 kubelet[2450]: E0317 17:54:01.305635 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:02.038032 systemd[1]: Created slice kubepods-besteffort-pod0af25c68_6b65_4473_b7a6_5a4168928694.slice - libcontainer container kubepods-besteffort-pod0af25c68_6b65_4473_b7a6_5a4168928694.slice. Mar 17 17:54:02.123911 kubelet[2450]: I0317 17:54:02.123848 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-df41ea7a-894f-4e34-a649-34fd8eab5edc\" (UniqueName: \"kubernetes.io/nfs/0af25c68-6b65-4473-b7a6-5a4168928694-pvc-df41ea7a-894f-4e34-a649-34fd8eab5edc\") pod \"test-pod-1\" (UID: \"0af25c68-6b65-4473-b7a6-5a4168928694\") " pod="default/test-pod-1" Mar 17 17:54:02.124087 kubelet[2450]: I0317 17:54:02.123922 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr9jg\" (UniqueName: \"kubernetes.io/projected/0af25c68-6b65-4473-b7a6-5a4168928694-kube-api-access-xr9jg\") pod \"test-pod-1\" (UID: \"0af25c68-6b65-4473-b7a6-5a4168928694\") " pod="default/test-pod-1" Mar 17 17:54:02.260638 kernel: FS-Cache: Loaded Mar 17 17:54:02.303211 kernel: RPC: Registered named UNIX socket transport module. Mar 17 17:54:02.303330 kernel: RPC: Registered udp transport module. Mar 17 17:54:02.303365 kernel: RPC: Registered tcp transport module. Mar 17 17:54:02.305253 kernel: RPC: Registered tcp-with-tls transport module. Mar 17 17:54:02.306586 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module. Mar 17 17:54:02.306677 kubelet[2450]: E0317 17:54:02.306650 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:02.611182 kernel: NFS: Registering the id_resolver key type Mar 17 17:54:02.611316 kernel: Key type id_resolver registered Mar 17 17:54:02.611360 kernel: Key type id_legacy registered Mar 17 17:54:02.647291 nfsidmap[4606]: nss_getpwnam: name 'root@nfs-server-provisioner.default.svc.cluster.local' does not map into domain 'localdomain' Mar 17 17:54:02.652282 nfsidmap[4607]: nss_name_to_gid: name 'root@nfs-server-provisioner.default.svc.cluster.local' does not map into domain 'localdomain' Mar 17 17:54:02.943864 containerd[1951]: time="2025-03-17T17:54:02.943628122Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:test-pod-1,Uid:0af25c68-6b65-4473-b7a6-5a4168928694,Namespace:default,Attempt:0,}" Mar 17 17:54:03.154317 (udev-worker)[4592]: Network interface NamePolicy= disabled on kernel command line. 
Mar 17 17:54:03.156798 systemd-networkd[1855]: cali5ec59c6bf6e: Link UP Mar 17 17:54:03.157244 systemd-networkd[1855]: cali5ec59c6bf6e: Gained carrier Mar 17 17:54:03.176001 containerd[1951]: 2025-03-17 17:54:03.033 [INFO][4608] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {172.31.27.21-k8s-test--pod--1-eth0 default 0af25c68-6b65-4473-b7a6-5a4168928694 1388 0 2025-03-17 17:53:31 +0000 UTC map[projectcalico.org/namespace:default projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:default] map[] [] [] []} {k8s 172.31.27.21 test-pod-1 eth0 default [] [] [kns.default ksa.default.default] cali5ec59c6bf6e [] []}} ContainerID="bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="172.31.27.21-k8s-test--pod--1-" Mar 17 17:54:03.176001 containerd[1951]: 2025-03-17 17:54:03.034 [INFO][4608] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="172.31.27.21-k8s-test--pod--1-eth0" Mar 17 17:54:03.176001 containerd[1951]: 2025-03-17 17:54:03.079 [INFO][4620] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b" HandleID="k8s-pod-network.bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b" Workload="172.31.27.21-k8s-test--pod--1-eth0" Mar 17 17:54:03.176001 containerd[1951]: 2025-03-17 17:54:03.098 [INFO][4620] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b" HandleID="k8s-pod-network.bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b" Workload="172.31.27.21-k8s-test--pod--1-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000220ba0), Attrs:map[string]string{"namespace":"default", "node":"172.31.27.21", "pod":"test-pod-1", "timestamp":"2025-03-17 17:54:03.079090206 +0000 UTC"}, Hostname:"172.31.27.21", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:54:03.176001 containerd[1951]: 2025-03-17 17:54:03.098 [INFO][4620] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:54:03.176001 containerd[1951]: 2025-03-17 17:54:03.098 [INFO][4620] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:54:03.176001 containerd[1951]: 2025-03-17 17:54:03.098 [INFO][4620] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '172.31.27.21' Mar 17 17:54:03.176001 containerd[1951]: 2025-03-17 17:54:03.102 [INFO][4620] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b" host="172.31.27.21" Mar 17 17:54:03.176001 containerd[1951]: 2025-03-17 17:54:03.110 [INFO][4620] ipam/ipam.go 372: Looking up existing affinities for host host="172.31.27.21" Mar 17 17:54:03.176001 containerd[1951]: 2025-03-17 17:54:03.120 [INFO][4620] ipam/ipam.go 489: Trying affinity for 192.168.110.192/26 host="172.31.27.21" Mar 17 17:54:03.176001 containerd[1951]: 2025-03-17 17:54:03.124 [INFO][4620] ipam/ipam.go 155: Attempting to load block cidr=192.168.110.192/26 host="172.31.27.21" Mar 17 17:54:03.176001 containerd[1951]: 2025-03-17 17:54:03.127 [INFO][4620] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.110.192/26 host="172.31.27.21" Mar 17 17:54:03.176001 containerd[1951]: 2025-03-17 17:54:03.128 [INFO][4620] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.110.192/26 handle="k8s-pod-network.bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b" host="172.31.27.21" Mar 17 17:54:03.176001 containerd[1951]: 2025-03-17 17:54:03.130 [INFO][4620] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b Mar 17 17:54:03.176001 containerd[1951]: 2025-03-17 17:54:03.136 [INFO][4620] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.110.192/26 handle="k8s-pod-network.bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b" host="172.31.27.21" Mar 17 17:54:03.176001 containerd[1951]: 2025-03-17 17:54:03.147 [INFO][4620] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.110.196/26] block=192.168.110.192/26 handle="k8s-pod-network.bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b" host="172.31.27.21" Mar 17 17:54:03.176001 containerd[1951]: 2025-03-17 17:54:03.147 [INFO][4620] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.110.196/26] handle="k8s-pod-network.bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b" host="172.31.27.21" Mar 17 17:54:03.176001 containerd[1951]: 2025-03-17 17:54:03.147 [INFO][4620] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 17:54:03.176001 containerd[1951]: 2025-03-17 17:54:03.147 [INFO][4620] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.110.196/26] IPv6=[] ContainerID="bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b" HandleID="k8s-pod-network.bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b" Workload="172.31.27.21-k8s-test--pod--1-eth0" Mar 17 17:54:03.176001 containerd[1951]: 2025-03-17 17:54:03.150 [INFO][4608] cni-plugin/k8s.go 386: Populated endpoint ContainerID="bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="172.31.27.21-k8s-test--pod--1-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"172.31.27.21-k8s-test--pod--1-eth0", GenerateName:"", Namespace:"default", SelfLink:"", UID:"0af25c68-6b65-4473-b7a6-5a4168928694", ResourceVersion:"1388", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 53, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"172.31.27.21", ContainerID:"", Pod:"test-pod-1", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.110.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali5ec59c6bf6e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:54:03.177354 containerd[1951]: 2025-03-17 17:54:03.150 [INFO][4608] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.110.196/32] ContainerID="bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="172.31.27.21-k8s-test--pod--1-eth0" Mar 17 17:54:03.177354 containerd[1951]: 2025-03-17 17:54:03.150 [INFO][4608] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5ec59c6bf6e ContainerID="bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="172.31.27.21-k8s-test--pod--1-eth0" Mar 17 17:54:03.177354 containerd[1951]: 2025-03-17 17:54:03.156 [INFO][4608] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="172.31.27.21-k8s-test--pod--1-eth0" Mar 17 17:54:03.177354 containerd[1951]: 2025-03-17 17:54:03.159 [INFO][4608] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="172.31.27.21-k8s-test--pod--1-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"172.31.27.21-k8s-test--pod--1-eth0", GenerateName:"", Namespace:"default", SelfLink:"", UID:"0af25c68-6b65-4473-b7a6-5a4168928694", ResourceVersion:"1388", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 53, 31, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"172.31.27.21", ContainerID:"bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b", Pod:"test-pod-1", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.110.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali5ec59c6bf6e", MAC:"46:65:38:84:a1:0d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:54:03.177354 containerd[1951]: 2025-03-17 17:54:03.170 [INFO][4608] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="172.31.27.21-k8s-test--pod--1-eth0" Mar 17 17:54:03.215515 containerd[1951]: time="2025-03-17T17:54:03.215199283Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:54:03.215515 containerd[1951]: time="2025-03-17T17:54:03.215368351Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:54:03.216066 containerd[1951]: time="2025-03-17T17:54:03.215765407Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:54:03.216066 containerd[1951]: time="2025-03-17T17:54:03.215963635Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:54:03.259372 systemd[1]: Started cri-containerd-bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b.scope - libcontainer container bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b. 
Mar 17 17:54:03.307196 kubelet[2450]: E0317 17:54:03.306852 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:03.321078 containerd[1951]: time="2025-03-17T17:54:03.321024895Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:test-pod-1,Uid:0af25c68-6b65-4473-b7a6-5a4168928694,Namespace:default,Attempt:0,} returns sandbox id \"bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b\"" Mar 17 17:54:03.323232 containerd[1951]: time="2025-03-17T17:54:03.323185304Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\"" Mar 17 17:54:03.709864 containerd[1951]: time="2025-03-17T17:54:03.709772205Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/nginx:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:54:03.711883 containerd[1951]: time="2025-03-17T17:54:03.711813405Z" level=info msg="stop pulling image ghcr.io/flatcar/nginx:latest: active requests=0, bytes read=61" Mar 17 17:54:03.718970 containerd[1951]: time="2025-03-17T17:54:03.718903413Z" level=info msg="Pulled image \"ghcr.io/flatcar/nginx:latest\" with image id \"sha256:f660a383148a8217a75a455efeb8bfd4cbe3afa737712cc0e25f27c03b770dd4\", repo tag \"ghcr.io/flatcar/nginx:latest\", repo digest \"ghcr.io/flatcar/nginx@sha256:b927c62cc716b99bce51774b46a63feb63f5414c6f985fb80cacd1933bbd0e06\", size \"69703745\" in 395.371261ms" Mar 17 17:54:03.718970 containerd[1951]: time="2025-03-17T17:54:03.718963977Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\" returns image reference \"sha256:f660a383148a8217a75a455efeb8bfd4cbe3afa737712cc0e25f27c03b770dd4\"" Mar 17 17:54:03.722508 containerd[1951]: time="2025-03-17T17:54:03.722234277Z" level=info msg="CreateContainer within sandbox \"bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b\" for container &ContainerMetadata{Name:test,Attempt:0,}" Mar 17 17:54:03.747288 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2055990922.mount: Deactivated successfully. Mar 17 17:54:03.749275 containerd[1951]: time="2025-03-17T17:54:03.749103454Z" level=info msg="CreateContainer within sandbox \"bd695143a83d3357c792de980c30a14f2b487bee70bcf522a5386b0dec5d412b\" for &ContainerMetadata{Name:test,Attempt:0,} returns container id \"7f6e6d42891547d26c9675c3af334f15c4a38a49d406eb1c2572ac8705bfe2a9\"" Mar 17 17:54:03.750754 containerd[1951]: time="2025-03-17T17:54:03.750344218Z" level=info msg="StartContainer for \"7f6e6d42891547d26c9675c3af334f15c4a38a49d406eb1c2572ac8705bfe2a9\"" Mar 17 17:54:03.806013 systemd[1]: Started cri-containerd-7f6e6d42891547d26c9675c3af334f15c4a38a49d406eb1c2572ac8705bfe2a9.scope - libcontainer container 7f6e6d42891547d26c9675c3af334f15c4a38a49d406eb1c2572ac8705bfe2a9. 
Mar 17 17:54:03.852709 containerd[1951]: time="2025-03-17T17:54:03.852469546Z" level=info msg="StartContainer for \"7f6e6d42891547d26c9675c3af334f15c4a38a49d406eb1c2572ac8705bfe2a9\" returns successfully" Mar 17 17:54:03.946439 kubelet[2450]: I0317 17:54:03.946254 2450 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="default/test-pod-1" podStartSLOduration=32.548815078 podStartE2EDuration="32.946233263s" podCreationTimestamp="2025-03-17 17:53:31 +0000 UTC" firstStartedPulling="2025-03-17 17:54:03.32249282 +0000 UTC m=+67.883118939" lastFinishedPulling="2025-03-17 17:54:03.719911017 +0000 UTC m=+68.280537124" observedRunningTime="2025-03-17 17:54:03.945867539 +0000 UTC m=+68.506493682" watchObservedRunningTime="2025-03-17 17:54:03.946233263 +0000 UTC m=+68.506859382" Mar 17 17:54:04.307501 kubelet[2450]: E0317 17:54:04.307436 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:04.915498 systemd-networkd[1855]: cali5ec59c6bf6e: Gained IPv6LL Mar 17 17:54:05.308580 kubelet[2450]: E0317 17:54:05.308506 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:06.309418 kubelet[2450]: E0317 17:54:06.309333 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:07.310592 kubelet[2450]: E0317 17:54:07.310524 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:07.596827 ntpd[1927]: Listen normally on 13 cali5ec59c6bf6e [fe80::ecee:eeff:feee:eeee%9]:123 Mar 17 17:54:07.597492 ntpd[1927]: 17 Mar 17:54:07 ntpd[1927]: Listen normally on 13 cali5ec59c6bf6e [fe80::ecee:eeff:feee:eeee%9]:123 Mar 17 17:54:08.311406 kubelet[2450]: E0317 17:54:08.311340 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:09.311946 kubelet[2450]: E0317 17:54:09.311897 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:10.312845 kubelet[2450]: E0317 17:54:10.312783 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:11.313799 kubelet[2450]: E0317 17:54:11.313738 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:12.314797 kubelet[2450]: E0317 17:54:12.314737 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:13.315385 kubelet[2450]: E0317 17:54:13.315325 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:14.315652 kubelet[2450]: E0317 17:54:14.315590 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:15.316559 kubelet[2450]: E0317 17:54:15.316496 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:16.317069 kubelet[2450]: E0317 17:54:16.317011 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:17.260587 kubelet[2450]: E0317 17:54:17.260526 2450 file.go:104] 
"Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:17.317665 kubelet[2450]: E0317 17:54:17.317591 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:18.318023 kubelet[2450]: E0317 17:54:18.317961 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:19.318134 kubelet[2450]: E0317 17:54:19.318074 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:19.551740 kubelet[2450]: E0317 17:54:19.551564 2450 controller.go:195] "Failed to update lease" err="Put \"https://172.31.28.1:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/172.31.27.21?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 17:54:20.318536 kubelet[2450]: E0317 17:54:20.318460 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:21.318981 kubelet[2450]: E0317 17:54:21.318909 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:22.319702 kubelet[2450]: E0317 17:54:22.319620 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:23.319820 kubelet[2450]: E0317 17:54:23.319756 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:24.320651 kubelet[2450]: E0317 17:54:24.320592 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:25.321133 kubelet[2450]: E0317 17:54:25.321073 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:26.321734 kubelet[2450]: E0317 17:54:26.321652 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:27.322305 kubelet[2450]: E0317 17:54:27.322249 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:28.322457 kubelet[2450]: E0317 17:54:28.322387 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:29.322758 kubelet[2450]: E0317 17:54:29.322669 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:29.553030 kubelet[2450]: E0317 17:54:29.552798 2450 controller.go:195] "Failed to update lease" err="Put \"https://172.31.28.1:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/172.31.27.21?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 17:54:30.323369 kubelet[2450]: E0317 17:54:30.323310 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:31.324503 kubelet[2450]: E0317 17:54:31.324435 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:32.325511 kubelet[2450]: E0317 17:54:32.325451 2450 file_linux.go:61] "Unable to read config path" 
err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:33.325813 kubelet[2450]: E0317 17:54:33.325748 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:34.326708 kubelet[2450]: E0317 17:54:34.326622 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:35.327102 kubelet[2450]: E0317 17:54:35.327032 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:36.328103 kubelet[2450]: E0317 17:54:36.328045 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:37.260087 kubelet[2450]: E0317 17:54:37.260031 2450 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:37.328712 kubelet[2450]: E0317 17:54:37.328656 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:38.329290 kubelet[2450]: E0317 17:54:38.329214 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:39.329813 kubelet[2450]: E0317 17:54:39.329752 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:39.553321 kubelet[2450]: E0317 17:54:39.553244 2450 controller.go:195] "Failed to update lease" err="Put \"https://172.31.28.1:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/172.31.27.21?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 17:54:40.330204 kubelet[2450]: E0317 17:54:40.330142 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:41.331150 kubelet[2450]: E0317 17:54:41.331078 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:42.331511 kubelet[2450]: E0317 17:54:42.331438 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:43.332039 kubelet[2450]: E0317 17:54:43.331973 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:44.332772 kubelet[2450]: E0317 17:54:44.332711 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:45.333660 kubelet[2450]: E0317 17:54:45.333593 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:46.334085 kubelet[2450]: E0317 17:54:46.334027 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:47.334384 kubelet[2450]: E0317 17:54:47.334311 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:48.334533 kubelet[2450]: E0317 17:54:48.334446 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:49.335221 kubelet[2450]: E0317 17:54:49.335145 2450 file_linux.go:61] "Unable to read 
config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:49.554041 kubelet[2450]: E0317 17:54:49.553916 2450 controller.go:195] "Failed to update lease" err="Put \"https://172.31.28.1:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/172.31.27.21?timeout=10s\": context deadline exceeded" Mar 17 17:54:50.336163 kubelet[2450]: E0317 17:54:50.336104 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:51.337287 kubelet[2450]: E0317 17:54:51.337228 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:52.337884 kubelet[2450]: E0317 17:54:52.337816 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:53.338336 kubelet[2450]: E0317 17:54:53.338269 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:54.200622 kubelet[2450]: E0317 17:54:54.198224 2450 controller.go:195] "Failed to update lease" err="Put \"https://172.31.28.1:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/172.31.27.21?timeout=10s\": unexpected EOF" Mar 17 17:54:54.200622 kubelet[2450]: I0317 17:54:54.198318 2450 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 17 17:54:54.339421 kubelet[2450]: E0317 17:54:54.339350 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:55.214330 kubelet[2450]: E0317 17:54:55.213177 2450 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.28.1:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/172.31.27.21?timeout=10s\": dial tcp 172.31.28.1:6443: connect: connection refused - error from a previous attempt: read tcp 172.31.27.21:48856->172.31.28.1:6443: read: connection reset by peer" interval="200ms" Mar 17 17:54:55.340211 kubelet[2450]: E0317 17:54:55.340138 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:56.341178 kubelet[2450]: E0317 17:54:56.341116 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:57.260602 kubelet[2450]: E0317 17:54:57.260548 2450 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:57.342260 kubelet[2450]: E0317 17:54:57.342196 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:58.343403 kubelet[2450]: E0317 17:54:58.343343 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:54:59.344316 kubelet[2450]: E0317 17:54:59.344258 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:55:00.344447 kubelet[2450]: E0317 17:55:00.344383 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:55:01.345397 kubelet[2450]: E0317 17:55:01.345351 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" 
path="/etc/kubernetes/manifests" Mar 17 17:55:02.346289 kubelet[2450]: E0317 17:55:02.346225 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:55:03.346915 kubelet[2450]: E0317 17:55:03.346853 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:55:04.348038 kubelet[2450]: E0317 17:55:04.347973 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:55:05.348853 kubelet[2450]: E0317 17:55:05.348791 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:55:05.414353 kubelet[2450]: E0317 17:55:05.414287 2450 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.28.1:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/172.31.27.21?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="400ms" Mar 17 17:55:06.348983 kubelet[2450]: E0317 17:55:06.348906 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:55:07.349258 kubelet[2450]: E0317 17:55:07.349194 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:55:08.349842 kubelet[2450]: E0317 17:55:08.349781 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:55:09.350808 kubelet[2450]: E0317 17:55:09.350734 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:55:10.351547 kubelet[2450]: E0317 17:55:10.351489 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:55:11.352650 kubelet[2450]: E0317 17:55:11.352589 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 17:55:12.353812 kubelet[2450]: E0317 17:55:12.353751 2450 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"