Feb 13 19:46:29.403494 kernel: Linux version 6.6.74-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p1) 13.3.1 20240614, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT_DYNAMIC Thu Feb 13 17:44:05 -00 2025 Feb 13 19:46:29.403537 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 flatcar.first_boot=detected flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=ed9b5d8ea73d2e47b8decea8124089e04dd398ef43013c1b1a5809314044b1c3 Feb 13 19:46:29.403555 kernel: BIOS-provided physical RAM map: Feb 13 19:46:29.403567 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable Feb 13 19:46:29.403579 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved Feb 13 19:46:29.403591 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved Feb 13 19:46:29.403610 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007d9e9fff] usable Feb 13 19:46:29.403623 kernel: BIOS-e820: [mem 0x000000007d9ea000-0x000000007fffffff] reserved Feb 13 19:46:29.403636 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000e03fffff] reserved Feb 13 19:46:29.403648 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved Feb 13 19:46:29.403661 kernel: NX (Execute Disable) protection: active Feb 13 19:46:29.403674 kernel: APIC: Static calls initialized Feb 13 19:46:29.403687 kernel: SMBIOS 2.7 present. Feb 13 19:46:29.403700 kernel: DMI: Amazon EC2 t3.small/, BIOS 1.0 10/16/2017 Feb 13 19:46:29.403719 kernel: Hypervisor detected: KVM Feb 13 19:46:29.405886 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Feb 13 19:46:29.405910 kernel: kvm-clock: using sched offset of 7988504157 cycles Feb 13 19:46:29.405926 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Feb 13 19:46:29.405941 kernel: tsc: Detected 2499.996 MHz processor Feb 13 19:46:29.405956 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Feb 13 19:46:29.405971 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Feb 13 19:46:29.405991 kernel: last_pfn = 0x7d9ea max_arch_pfn = 0x400000000 Feb 13 19:46:29.406005 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs Feb 13 19:46:29.406019 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Feb 13 19:46:29.406034 kernel: Using GB pages for direct mapping Feb 13 19:46:29.406048 kernel: ACPI: Early table checksum verification disabled Feb 13 19:46:29.406061 kernel: ACPI: RSDP 0x00000000000F8F40 000014 (v00 AMAZON) Feb 13 19:46:29.406077 kernel: ACPI: RSDT 0x000000007D9EE350 000044 (v01 AMAZON AMZNRSDT 00000001 AMZN 00000001) Feb 13 19:46:29.406091 kernel: ACPI: FACP 0x000000007D9EFF80 000074 (v01 AMAZON AMZNFACP 00000001 AMZN 00000001) Feb 13 19:46:29.406105 kernel: ACPI: DSDT 0x000000007D9EE3A0 0010E9 (v01 AMAZON AMZNDSDT 00000001 AMZN 00000001) Feb 13 19:46:29.406123 kernel: ACPI: FACS 0x000000007D9EFF40 000040 Feb 13 19:46:29.406137 kernel: ACPI: SSDT 0x000000007D9EF6C0 00087A (v01 AMAZON AMZNSSDT 00000001 AMZN 00000001) Feb 13 19:46:29.406152 kernel: ACPI: APIC 0x000000007D9EF5D0 000076 (v01 AMAZON AMZNAPIC 00000001 AMZN 00000001) Feb 13 19:46:29.406165 kernel: ACPI: SRAT 0x000000007D9EF530 0000A0 (v01 AMAZON AMZNSRAT 00000001 AMZN 00000001) Feb 13 19:46:29.406180 
kernel: ACPI: SLIT 0x000000007D9EF4C0 00006C (v01 AMAZON AMZNSLIT 00000001 AMZN 00000001) Feb 13 19:46:29.406193 kernel: ACPI: WAET 0x000000007D9EF490 000028 (v01 AMAZON AMZNWAET 00000001 AMZN 00000001) Feb 13 19:46:29.406207 kernel: ACPI: HPET 0x00000000000C9000 000038 (v01 AMAZON AMZNHPET 00000001 AMZN 00000001) Feb 13 19:46:29.406221 kernel: ACPI: SSDT 0x00000000000C9040 00007B (v01 AMAZON AMZNSSDT 00000001 AMZN 00000001) Feb 13 19:46:29.406235 kernel: ACPI: Reserving FACP table memory at [mem 0x7d9eff80-0x7d9efff3] Feb 13 19:46:29.406254 kernel: ACPI: Reserving DSDT table memory at [mem 0x7d9ee3a0-0x7d9ef488] Feb 13 19:46:29.406274 kernel: ACPI: Reserving FACS table memory at [mem 0x7d9eff40-0x7d9eff7f] Feb 13 19:46:29.406290 kernel: ACPI: Reserving SSDT table memory at [mem 0x7d9ef6c0-0x7d9eff39] Feb 13 19:46:29.406305 kernel: ACPI: Reserving APIC table memory at [mem 0x7d9ef5d0-0x7d9ef645] Feb 13 19:46:29.406321 kernel: ACPI: Reserving SRAT table memory at [mem 0x7d9ef530-0x7d9ef5cf] Feb 13 19:46:29.406339 kernel: ACPI: Reserving SLIT table memory at [mem 0x7d9ef4c0-0x7d9ef52b] Feb 13 19:46:29.406355 kernel: ACPI: Reserving WAET table memory at [mem 0x7d9ef490-0x7d9ef4b7] Feb 13 19:46:29.406371 kernel: ACPI: Reserving HPET table memory at [mem 0xc9000-0xc9037] Feb 13 19:46:29.406386 kernel: ACPI: Reserving SSDT table memory at [mem 0xc9040-0xc90ba] Feb 13 19:46:29.406472 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0 Feb 13 19:46:29.406490 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0 Feb 13 19:46:29.406506 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x7fffffff] Feb 13 19:46:29.406521 kernel: NUMA: Initialized distance table, cnt=1 Feb 13 19:46:29.406536 kernel: NODE_DATA(0) allocated [mem 0x7d9e3000-0x7d9e8fff] Feb 13 19:46:29.406556 kernel: Zone ranges: Feb 13 19:46:29.406571 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Feb 13 19:46:29.406587 kernel: DMA32 [mem 0x0000000001000000-0x000000007d9e9fff] Feb 13 19:46:29.406603 kernel: Normal empty Feb 13 19:46:29.406617 kernel: Movable zone start for each node Feb 13 19:46:29.406632 kernel: Early memory node ranges Feb 13 19:46:29.406648 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] Feb 13 19:46:29.406663 kernel: node 0: [mem 0x0000000000100000-0x000000007d9e9fff] Feb 13 19:46:29.406679 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007d9e9fff] Feb 13 19:46:29.406698 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Feb 13 19:46:29.406713 kernel: On node 0, zone DMA: 97 pages in unavailable ranges Feb 13 19:46:29.406741 kernel: On node 0, zone DMA32: 9750 pages in unavailable ranges Feb 13 19:46:29.408801 kernel: ACPI: PM-Timer IO Port: 0xb008 Feb 13 19:46:29.408818 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Feb 13 19:46:29.408835 kernel: IOAPIC[0]: apic_id 0, version 32, address 0xfec00000, GSI 0-23 Feb 13 19:46:29.408851 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Feb 13 19:46:29.408867 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Feb 13 19:46:29.408882 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Feb 13 19:46:29.408903 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Feb 13 19:46:29.408919 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Feb 13 19:46:29.408935 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Feb 13 19:46:29.408950 kernel: TSC deadline timer available Feb 13 19:46:29.408965 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs Feb 13 19:46:29.408980 
kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Feb 13 19:46:29.408996 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices Feb 13 19:46:29.409011 kernel: Booting paravirtualized kernel on KVM Feb 13 19:46:29.409027 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Feb 13 19:46:29.409043 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 Feb 13 19:46:29.409061 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u1048576 Feb 13 19:46:29.409077 kernel: pcpu-alloc: s197032 r8192 d32344 u1048576 alloc=1*2097152 Feb 13 19:46:29.409092 kernel: pcpu-alloc: [0] 0 1 Feb 13 19:46:29.409108 kernel: kvm-guest: PV spinlocks enabled Feb 13 19:46:29.409123 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Feb 13 19:46:29.409140 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 flatcar.first_boot=detected flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=ed9b5d8ea73d2e47b8decea8124089e04dd398ef43013c1b1a5809314044b1c3 Feb 13 19:46:29.409156 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Feb 13 19:46:29.409175 kernel: random: crng init done Feb 13 19:46:29.409191 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Feb 13 19:46:29.409207 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Feb 13 19:46:29.409222 kernel: Fallback order for Node 0: 0 Feb 13 19:46:29.409238 kernel: Built 1 zonelists, mobility grouping on. Total pages: 506242 Feb 13 19:46:29.409253 kernel: Policy zone: DMA32 Feb 13 19:46:29.409268 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Feb 13 19:46:29.409285 kernel: Memory: 1932348K/2057760K available (12288K kernel code, 2301K rwdata, 22736K rodata, 42976K init, 2216K bss, 125152K reserved, 0K cma-reserved) Feb 13 19:46:29.409300 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Feb 13 19:46:29.409320 kernel: Kernel/User page tables isolation: enabled Feb 13 19:46:29.409335 kernel: ftrace: allocating 37923 entries in 149 pages Feb 13 19:46:29.409351 kernel: ftrace: allocated 149 pages with 4 groups Feb 13 19:46:29.409365 kernel: Dynamic Preempt: voluntary Feb 13 19:46:29.409380 kernel: rcu: Preemptible hierarchical RCU implementation. Feb 13 19:46:29.409398 kernel: rcu: RCU event tracing is enabled. Feb 13 19:46:29.409413 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Feb 13 19:46:29.409429 kernel: Trampoline variant of Tasks RCU enabled. Feb 13 19:46:29.409445 kernel: Rude variant of Tasks RCU enabled. Feb 13 19:46:29.409461 kernel: Tracing variant of Tasks RCU enabled. Feb 13 19:46:29.409479 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Feb 13 19:46:29.409495 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Feb 13 19:46:29.409511 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 Feb 13 19:46:29.409527 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. 
Feb 13 19:46:29.409542 kernel: Console: colour VGA+ 80x25 Feb 13 19:46:29.409558 kernel: printk: console [ttyS0] enabled Feb 13 19:46:29.409574 kernel: ACPI: Core revision 20230628 Feb 13 19:46:29.409590 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 30580167144 ns Feb 13 19:46:29.409606 kernel: APIC: Switch to symmetric I/O mode setup Feb 13 19:46:29.409624 kernel: x2apic enabled Feb 13 19:46:29.409641 kernel: APIC: Switched APIC routing to: physical x2apic Feb 13 19:46:29.409668 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x24093623c91, max_idle_ns: 440795291220 ns Feb 13 19:46:29.409688 kernel: Calibrating delay loop (skipped) preset value.. 4999.99 BogoMIPS (lpj=2499996) Feb 13 19:46:29.409705 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Feb 13 19:46:29.409721 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4 Feb 13 19:46:29.412796 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Feb 13 19:46:29.412822 kernel: Spectre V2 : Mitigation: Retpolines Feb 13 19:46:29.412839 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch Feb 13 19:46:29.412856 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT Feb 13 19:46:29.412873 kernel: RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible! Feb 13 19:46:29.412890 kernel: RETBleed: Vulnerable Feb 13 19:46:29.412914 kernel: Speculative Store Bypass: Vulnerable Feb 13 19:46:29.412930 kernel: MDS: Vulnerable: Clear CPU buffers attempted, no microcode Feb 13 19:46:29.412947 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Feb 13 19:46:29.412964 kernel: GDS: Unknown: Dependent on hypervisor status Feb 13 19:46:29.412981 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Feb 13 19:46:29.412996 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Feb 13 19:46:29.413017 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Feb 13 19:46:29.413033 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers' Feb 13 19:46:29.413050 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR' Feb 13 19:46:29.413067 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask' Feb 13 19:46:29.413083 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256' Feb 13 19:46:29.413198 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256' Feb 13 19:46:29.413216 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers' Feb 13 19:46:29.413234 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Feb 13 19:46:29.413251 kernel: x86/fpu: xstate_offset[3]: 832, xstate_sizes[3]: 64 Feb 13 19:46:29.413267 kernel: x86/fpu: xstate_offset[4]: 896, xstate_sizes[4]: 64 Feb 13 19:46:29.413283 kernel: x86/fpu: xstate_offset[5]: 960, xstate_sizes[5]: 64 Feb 13 19:46:29.413304 kernel: x86/fpu: xstate_offset[6]: 1024, xstate_sizes[6]: 512 Feb 13 19:46:29.413320 kernel: x86/fpu: xstate_offset[7]: 1536, xstate_sizes[7]: 1024 Feb 13 19:46:29.413336 kernel: x86/fpu: xstate_offset[9]: 2560, xstate_sizes[9]: 8 Feb 13 19:46:29.413354 kernel: x86/fpu: Enabled xstate features 0x2ff, context size is 2568 bytes, using 'compacted' format. 
Feb 13 19:46:29.413371 kernel: Freeing SMP alternatives memory: 32K Feb 13 19:46:29.413387 kernel: pid_max: default: 32768 minimum: 301 Feb 13 19:46:29.413404 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Feb 13 19:46:29.413420 kernel: landlock: Up and running. Feb 13 19:46:29.413434 kernel: SELinux: Initializing. Feb 13 19:46:29.413447 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Feb 13 19:46:29.413459 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Feb 13 19:46:29.413479 kernel: smpboot: CPU0: Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz (family: 0x6, model: 0x55, stepping: 0x7) Feb 13 19:46:29.413505 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Feb 13 19:46:29.413527 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Feb 13 19:46:29.413547 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Feb 13 19:46:29.413569 kernel: Performance Events: unsupported p6 CPU model 85 no PMU driver, software events only. Feb 13 19:46:29.413589 kernel: signal: max sigframe size: 3632 Feb 13 19:46:29.413610 kernel: rcu: Hierarchical SRCU implementation. Feb 13 19:46:29.413633 kernel: rcu: Max phase no-delay instances is 400. Feb 13 19:46:29.413647 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Feb 13 19:46:29.413664 kernel: smp: Bringing up secondary CPUs ... Feb 13 19:46:29.413684 kernel: smpboot: x86: Booting SMP configuration: Feb 13 19:46:29.413699 kernel: .... node #0, CPUs: #1 Feb 13 19:46:29.413717 kernel: MDS CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/mds.html for more details. Feb 13 19:46:29.414380 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details. Feb 13 19:46:29.414452 kernel: smp: Brought up 1 node, 2 CPUs Feb 13 19:46:29.414468 kernel: smpboot: Max logical packages: 1 Feb 13 19:46:29.414482 kernel: smpboot: Total of 2 processors activated (9999.98 BogoMIPS) Feb 13 19:46:29.414497 kernel: devtmpfs: initialized Feb 13 19:46:29.414515 kernel: x86/mm: Memory block size: 128MB Feb 13 19:46:29.414529 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Feb 13 19:46:29.414543 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Feb 13 19:46:29.414557 kernel: pinctrl core: initialized pinctrl subsystem Feb 13 19:46:29.414572 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Feb 13 19:46:29.414585 kernel: audit: initializing netlink subsys (disabled) Feb 13 19:46:29.414599 kernel: audit: type=2000 audit(1739475987.509:1): state=initialized audit_enabled=0 res=1 Feb 13 19:46:29.414614 kernel: thermal_sys: Registered thermal governor 'step_wise' Feb 13 19:46:29.414627 kernel: thermal_sys: Registered thermal governor 'user_space' Feb 13 19:46:29.414644 kernel: cpuidle: using governor menu Feb 13 19:46:29.414657 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Feb 13 19:46:29.414671 kernel: dca service started, version 1.12.1 Feb 13 19:46:29.414684 kernel: PCI: Using configuration type 1 for base access Feb 13 19:46:29.414698 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Feb 13 19:46:29.414712 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Feb 13 19:46:29.414725 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Feb 13 19:46:29.414751 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Feb 13 19:46:29.414765 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Feb 13 19:46:29.414781 kernel: ACPI: Added _OSI(Module Device) Feb 13 19:46:29.414795 kernel: ACPI: Added _OSI(Processor Device) Feb 13 19:46:29.414808 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Feb 13 19:46:29.414822 kernel: ACPI: Added _OSI(Processor Aggregator Device) Feb 13 19:46:29.414835 kernel: ACPI: 3 ACPI AML tables successfully acquired and loaded Feb 13 19:46:29.414848 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Feb 13 19:46:29.414861 kernel: ACPI: Interpreter enabled Feb 13 19:46:29.414875 kernel: ACPI: PM: (supports S0 S5) Feb 13 19:46:29.414889 kernel: ACPI: Using IOAPIC for interrupt routing Feb 13 19:46:29.414902 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Feb 13 19:46:29.414919 kernel: PCI: Using E820 reservations for host bridge windows Feb 13 19:46:29.414933 kernel: ACPI: Enabled 16 GPEs in block 00 to 0F Feb 13 19:46:29.414946 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Feb 13 19:46:29.415184 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3] Feb 13 19:46:29.418912 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI] Feb 13 19:46:29.419077 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge Feb 13 19:46:29.419098 kernel: acpiphp: Slot [3] registered Feb 13 19:46:29.419120 kernel: acpiphp: Slot [4] registered Feb 13 19:46:29.419136 kernel: acpiphp: Slot [5] registered Feb 13 19:46:29.419153 kernel: acpiphp: Slot [6] registered Feb 13 19:46:29.419169 kernel: acpiphp: Slot [7] registered Feb 13 19:46:29.419185 kernel: acpiphp: Slot [8] registered Feb 13 19:46:29.419200 kernel: acpiphp: Slot [9] registered Feb 13 19:46:29.419216 kernel: acpiphp: Slot [10] registered Feb 13 19:46:29.419232 kernel: acpiphp: Slot [11] registered Feb 13 19:46:29.419249 kernel: acpiphp: Slot [12] registered Feb 13 19:46:29.419268 kernel: acpiphp: Slot [13] registered Feb 13 19:46:29.419284 kernel: acpiphp: Slot [14] registered Feb 13 19:46:29.419300 kernel: acpiphp: Slot [15] registered Feb 13 19:46:29.419315 kernel: acpiphp: Slot [16] registered Feb 13 19:46:29.419331 kernel: acpiphp: Slot [17] registered Feb 13 19:46:29.419347 kernel: acpiphp: Slot [18] registered Feb 13 19:46:29.419362 kernel: acpiphp: Slot [19] registered Feb 13 19:46:29.419378 kernel: acpiphp: Slot [20] registered Feb 13 19:46:29.419394 kernel: acpiphp: Slot [21] registered Feb 13 19:46:29.419410 kernel: acpiphp: Slot [22] registered Feb 13 19:46:29.419428 kernel: acpiphp: Slot [23] registered Feb 13 19:46:29.419444 kernel: acpiphp: Slot [24] registered Feb 13 19:46:29.419460 kernel: acpiphp: Slot [25] registered Feb 13 19:46:29.419476 kernel: acpiphp: Slot [26] registered Feb 13 19:46:29.419492 kernel: acpiphp: Slot [27] registered Feb 13 19:46:29.419507 kernel: acpiphp: Slot [28] registered Feb 13 19:46:29.419523 kernel: acpiphp: Slot [29] registered Feb 13 19:46:29.419539 kernel: acpiphp: Slot [30] registered Feb 13 19:46:29.419555 kernel: acpiphp: Slot [31] registered Feb 13 19:46:29.419573 kernel: PCI host bridge to bus 0000:00 
Feb 13 19:46:29.419715 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Feb 13 19:46:29.419855 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Feb 13 19:46:29.420006 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Feb 13 19:46:29.420126 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xfebfffff window] Feb 13 19:46:29.420246 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Feb 13 19:46:29.420398 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 Feb 13 19:46:29.420549 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 Feb 13 19:46:29.420693 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x000000 Feb 13 19:46:29.423433 kernel: pci 0000:00:01.3: quirk: [io 0xb000-0xb03f] claimed by PIIX4 ACPI Feb 13 19:46:29.423600 kernel: pci 0000:00:01.3: quirk: [io 0xb100-0xb10f] claimed by PIIX4 SMB Feb 13 19:46:29.424764 kernel: pci 0000:00:01.3: PIIX4 devres E PIO at fff0-ffff Feb 13 19:46:29.425001 kernel: pci 0000:00:01.3: PIIX4 devres F MMIO at ffc00000-ffffffff Feb 13 19:46:29.425133 kernel: pci 0000:00:01.3: PIIX4 devres G PIO at fff0-ffff Feb 13 19:46:29.425266 kernel: pci 0000:00:01.3: PIIX4 devres H MMIO at ffc00000-ffffffff Feb 13 19:46:29.425509 kernel: pci 0000:00:01.3: PIIX4 devres I PIO at fff0-ffff Feb 13 19:46:29.425634 kernel: pci 0000:00:01.3: PIIX4 devres J PIO at fff0-ffff Feb 13 19:46:29.425782 kernel: pci 0000:00:01.3: quirk_piix4_acpi+0x0/0x180 took 10742 usecs Feb 13 19:46:29.425916 kernel: pci 0000:00:03.0: [1d0f:1111] type 00 class 0x030000 Feb 13 19:46:29.426040 kernel: pci 0000:00:03.0: reg 0x10: [mem 0xfe400000-0xfe7fffff pref] Feb 13 19:46:29.426169 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfebe0000-0xfebeffff pref] Feb 13 19:46:29.426294 kernel: pci 0000:00:03.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Feb 13 19:46:29.426516 kernel: pci 0000:00:04.0: [1d0f:8061] type 00 class 0x010802 Feb 13 19:46:29.426650 kernel: pci 0000:00:04.0: reg 0x10: [mem 0xfebf0000-0xfebf3fff] Feb 13 19:46:29.430883 kernel: pci 0000:00:05.0: [1d0f:ec20] type 00 class 0x020000 Feb 13 19:46:29.431056 kernel: pci 0000:00:05.0: reg 0x10: [mem 0xfebf4000-0xfebf7fff] Feb 13 19:46:29.431079 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Feb 13 19:46:29.431103 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Feb 13 19:46:29.431121 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Feb 13 19:46:29.431137 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Feb 13 19:46:29.431154 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9 Feb 13 19:46:29.431172 kernel: iommu: Default domain type: Translated Feb 13 19:46:29.431189 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Feb 13 19:46:29.431205 kernel: PCI: Using ACPI for IRQ routing Feb 13 19:46:29.431222 kernel: PCI: pci_cache_line_size set to 64 bytes Feb 13 19:46:29.431239 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] Feb 13 19:46:29.431258 kernel: e820: reserve RAM buffer [mem 0x7d9ea000-0x7fffffff] Feb 13 19:46:29.431395 kernel: pci 0000:00:03.0: vgaarb: setting as boot VGA device Feb 13 19:46:29.431529 kernel: pci 0000:00:03.0: vgaarb: bridge control possible Feb 13 19:46:29.431662 kernel: pci 0000:00:03.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Feb 13 19:46:29.431683 kernel: vgaarb: loaded Feb 13 19:46:29.431700 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0 Feb 13 19:46:29.431717 kernel: hpet0: 8 
comparators, 32-bit 62.500000 MHz counter Feb 13 19:46:29.431745 kernel: clocksource: Switched to clocksource kvm-clock Feb 13 19:46:29.431762 kernel: VFS: Disk quotas dquot_6.6.0 Feb 13 19:46:29.431783 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Feb 13 19:46:29.431800 kernel: pnp: PnP ACPI init Feb 13 19:46:29.431817 kernel: pnp: PnP ACPI: found 5 devices Feb 13 19:46:29.431833 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Feb 13 19:46:29.431850 kernel: NET: Registered PF_INET protocol family Feb 13 19:46:29.431867 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Feb 13 19:46:29.431892 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Feb 13 19:46:29.431910 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Feb 13 19:46:29.431930 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Feb 13 19:46:29.431947 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Feb 13 19:46:29.431964 kernel: TCP: Hash tables configured (established 16384 bind 16384) Feb 13 19:46:29.431980 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Feb 13 19:46:29.432059 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Feb 13 19:46:29.432076 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Feb 13 19:46:29.432093 kernel: NET: Registered PF_XDP protocol family Feb 13 19:46:29.432230 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Feb 13 19:46:29.432353 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Feb 13 19:46:29.432475 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Feb 13 19:46:29.432593 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xfebfffff window] Feb 13 19:46:29.434775 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Feb 13 19:46:29.434826 kernel: PCI: CLS 0 bytes, default 64 Feb 13 19:46:29.434845 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Feb 13 19:46:29.434863 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x24093623c91, max_idle_ns: 440795291220 ns Feb 13 19:46:29.434880 kernel: clocksource: Switched to clocksource tsc Feb 13 19:46:29.434897 kernel: Initialise system trusted keyrings Feb 13 19:46:29.434920 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Feb 13 19:46:29.434936 kernel: Key type asymmetric registered Feb 13 19:46:29.434953 kernel: Asymmetric key parser 'x509' registered Feb 13 19:46:29.434969 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Feb 13 19:46:29.434986 kernel: io scheduler mq-deadline registered Feb 13 19:46:29.435002 kernel: io scheduler kyber registered Feb 13 19:46:29.435019 kernel: io scheduler bfq registered Feb 13 19:46:29.435035 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Feb 13 19:46:29.435052 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Feb 13 19:46:29.435072 kernel: 00:04: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Feb 13 19:46:29.435089 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Feb 13 19:46:29.435105 kernel: i8042: Warning: Keylock active Feb 13 19:46:29.435122 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Feb 13 19:46:29.435139 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Feb 13 19:46:29.435340 kernel: rtc_cmos 00:00: RTC can 
wake from S4 Feb 13 19:46:29.435468 kernel: rtc_cmos 00:00: registered as rtc0 Feb 13 19:46:29.435592 kernel: rtc_cmos 00:00: setting system clock to 2025-02-13T19:46:28 UTC (1739475988) Feb 13 19:46:29.435718 kernel: rtc_cmos 00:00: alarms up to one day, 114 bytes nvram Feb 13 19:46:29.435750 kernel: intel_pstate: CPU model not supported Feb 13 19:46:29.435767 kernel: NET: Registered PF_INET6 protocol family Feb 13 19:46:29.435785 kernel: Segment Routing with IPv6 Feb 13 19:46:29.435801 kernel: In-situ OAM (IOAM) with IPv6 Feb 13 19:46:29.435818 kernel: NET: Registered PF_PACKET protocol family Feb 13 19:46:29.435835 kernel: Key type dns_resolver registered Feb 13 19:46:29.435851 kernel: IPI shorthand broadcast: enabled Feb 13 19:46:29.435868 kernel: sched_clock: Marking stable (851002261, 196785245)->(1188617480, -140829974) Feb 13 19:46:29.435898 kernel: registered taskstats version 1 Feb 13 19:46:29.435915 kernel: Loading compiled-in X.509 certificates Feb 13 19:46:29.435932 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.74-flatcar: 0cc219a306b9e46e583adebba1820decbdc4307b' Feb 13 19:46:29.435948 kernel: Key type .fscrypt registered Feb 13 19:46:29.435964 kernel: Key type fscrypt-provisioning registered Feb 13 19:46:29.435981 kernel: ima: No TPM chip found, activating TPM-bypass! Feb 13 19:46:29.435997 kernel: ima: Allocated hash algorithm: sha1 Feb 13 19:46:29.436014 kernel: ima: No architecture policies found Feb 13 19:46:29.436033 kernel: clk: Disabling unused clocks Feb 13 19:46:29.436050 kernel: Freeing unused kernel image (initmem) memory: 42976K Feb 13 19:46:29.436067 kernel: Write protecting the kernel read-only data: 36864k Feb 13 19:46:29.436084 kernel: Freeing unused kernel image (rodata/data gap) memory: 1840K Feb 13 19:46:29.436100 kernel: Run /init as init process Feb 13 19:46:29.436255 kernel: with arguments: Feb 13 19:46:29.436273 kernel: /init Feb 13 19:46:29.436289 kernel: with environment: Feb 13 19:46:29.436306 kernel: HOME=/ Feb 13 19:46:29.436322 kernel: TERM=linux Feb 13 19:46:29.436343 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Feb 13 19:46:29.436390 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Feb 13 19:46:29.436411 systemd[1]: Detected virtualization amazon. Feb 13 19:46:29.436429 systemd[1]: Detected architecture x86-64. Feb 13 19:46:29.436522 systemd[1]: Running in initrd. Feb 13 19:46:29.436540 systemd[1]: No hostname configured, using default hostname. Feb 13 19:46:29.436557 systemd[1]: Hostname set to . Feb 13 19:46:29.436580 systemd[1]: Initializing machine ID from VM UUID. Feb 13 19:46:29.436598 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Feb 13 19:46:29.436681 systemd[1]: Queued start job for default target initrd.target. Feb 13 19:46:29.436702 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Feb 13 19:46:29.436721 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Feb 13 19:46:29.443969 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... 
Feb 13 19:46:29.443998 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Feb 13 19:46:29.444023 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Feb 13 19:46:29.444041 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Feb 13 19:46:29.444061 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Feb 13 19:46:29.444077 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Feb 13 19:46:29.444093 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Feb 13 19:46:29.444110 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Feb 13 19:46:29.444126 systemd[1]: Reached target paths.target - Path Units. Feb 13 19:46:29.444144 systemd[1]: Reached target slices.target - Slice Units. Feb 13 19:46:29.444160 systemd[1]: Reached target swap.target - Swaps. Feb 13 19:46:29.444176 systemd[1]: Reached target timers.target - Timer Units. Feb 13 19:46:29.444192 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Feb 13 19:46:29.444209 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Feb 13 19:46:29.444227 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Feb 13 19:46:29.444244 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Feb 13 19:46:29.444261 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Feb 13 19:46:29.444277 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Feb 13 19:46:29.444297 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Feb 13 19:46:29.444314 systemd[1]: Reached target sockets.target - Socket Units. Feb 13 19:46:29.444330 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Feb 13 19:46:29.444437 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Feb 13 19:46:29.444457 systemd[1]: Finished network-cleanup.service - Network Cleanup. Feb 13 19:46:29.444480 systemd[1]: Starting systemd-fsck-usr.service... Feb 13 19:46:29.444503 systemd[1]: Starting systemd-journald.service - Journal Service... Feb 13 19:46:29.444614 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Feb 13 19:46:29.444634 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 19:46:29.444650 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Feb 13 19:46:29.444667 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Feb 13 19:46:29.444681 systemd[1]: Finished systemd-fsck-usr.service. Feb 13 19:46:29.444763 systemd-journald[179]: Collecting audit messages is disabled. Feb 13 19:46:29.444808 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Feb 13 19:46:29.444828 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Feb 13 19:46:29.444844 kernel: Bridge firewalling registered Feb 13 19:46:29.444924 systemd-journald[179]: Journal started Feb 13 19:46:29.444959 systemd-journald[179]: Runtime Journal (/run/log/journal/ec205f84d5226aeaf396f965fbaac106) is 4.8M, max 38.6M, 33.7M free. 
Feb 13 19:46:29.397075 systemd-modules-load[180]: Inserted module 'overlay' Feb 13 19:46:29.619271 systemd[1]: Started systemd-journald.service - Journal Service. Feb 13 19:46:29.444804 systemd-modules-load[180]: Inserted module 'br_netfilter' Feb 13 19:46:29.616241 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Feb 13 19:46:29.619566 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 19:46:29.623152 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Feb 13 19:46:29.643192 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Feb 13 19:46:29.646352 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Feb 13 19:46:29.654378 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Feb 13 19:46:29.671050 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Feb 13 19:46:29.689918 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Feb 13 19:46:29.700000 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Feb 13 19:46:29.704332 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Feb 13 19:46:29.714963 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Feb 13 19:46:29.717985 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Feb 13 19:46:29.732220 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Feb 13 19:46:29.760897 dracut-cmdline[214]: dracut-dracut-053 Feb 13 19:46:29.766407 dracut-cmdline[214]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 flatcar.first_boot=detected flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=ed9b5d8ea73d2e47b8decea8124089e04dd398ef43013c1b1a5809314044b1c3 Feb 13 19:46:29.790294 systemd-resolved[213]: Positive Trust Anchors: Feb 13 19:46:29.790314 systemd-resolved[213]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Feb 13 19:46:29.790373 systemd-resolved[213]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Feb 13 19:46:29.794800 systemd-resolved[213]: Defaulting to hostname 'linux'. Feb 13 19:46:29.796366 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Feb 13 19:46:29.797883 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Feb 13 19:46:29.928508 kernel: SCSI subsystem initialized Feb 13 19:46:29.944767 kernel: Loading iSCSI transport class v2.0-870. 
Feb 13 19:46:29.968770 kernel: iscsi: registered transport (tcp) Feb 13 19:46:30.009890 kernel: iscsi: registered transport (qla4xxx) Feb 13 19:46:30.009985 kernel: QLogic iSCSI HBA Driver Feb 13 19:46:30.139605 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Feb 13 19:46:30.152259 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Feb 13 19:46:30.181137 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Feb 13 19:46:30.181218 kernel: device-mapper: uevent: version 1.0.3 Feb 13 19:46:30.181240 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Feb 13 19:46:30.228768 kernel: raid6: avx512x4 gen() 15972 MB/s Feb 13 19:46:30.245768 kernel: raid6: avx512x2 gen() 10497 MB/s Feb 13 19:46:30.262786 kernel: raid6: avx512x1 gen() 13290 MB/s Feb 13 19:46:30.279763 kernel: raid6: avx2x4 gen() 15077 MB/s Feb 13 19:46:30.296763 kernel: raid6: avx2x2 gen() 14887 MB/s Feb 13 19:46:30.313856 kernel: raid6: avx2x1 gen() 10775 MB/s Feb 13 19:46:30.313943 kernel: raid6: using algorithm avx512x4 gen() 15972 MB/s Feb 13 19:46:30.331766 kernel: raid6: .... xor() 5750 MB/s, rmw enabled Feb 13 19:46:30.331847 kernel: raid6: using avx512x2 recovery algorithm Feb 13 19:46:30.355765 kernel: xor: automatically using best checksumming function avx Feb 13 19:46:30.571758 kernel: Btrfs loaded, zoned=no, fsverity=no Feb 13 19:46:30.583631 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Feb 13 19:46:30.596065 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Feb 13 19:46:30.640190 systemd-udevd[398]: Using default interface naming scheme 'v255'. Feb 13 19:46:30.647803 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Feb 13 19:46:30.673353 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Feb 13 19:46:30.758940 dracut-pre-trigger[402]: rd.md=0: removing MD RAID activation Feb 13 19:46:30.858953 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Feb 13 19:46:30.866320 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Feb 13 19:46:31.011319 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Feb 13 19:46:31.030080 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Feb 13 19:46:31.103721 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Feb 13 19:46:31.113223 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Feb 13 19:46:31.114905 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Feb 13 19:46:31.117139 systemd[1]: Reached target remote-fs.target - Remote File Systems. Feb 13 19:46:31.127561 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Feb 13 19:46:31.189380 kernel: cryptd: max_cpu_qlen set to 1000 Feb 13 19:46:31.187343 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Feb 13 19:46:31.209871 kernel: ena 0000:00:05.0: ENA device version: 0.10 Feb 13 19:46:31.322495 kernel: ena 0000:00:05.0: ENA controller version: 0.0.1 implementation version 1 Feb 13 19:46:31.322707 kernel: AVX2 version of gcm_enc/dec engaged. Feb 13 19:46:31.322752 kernel: AES CTR mode by8 optimization enabled Feb 13 19:46:31.322772 kernel: ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy. 
Feb 13 19:46:31.322961 kernel: ena 0000:00:05.0: Elastic Network Adapter (ENA) found at mem febf4000, mac addr 06:b9:05:c0:62:f3 Feb 13 19:46:31.310185 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Feb 13 19:46:31.310346 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Feb 13 19:46:31.324239 (udev-worker)[450]: Network interface NamePolicy= disabled on kernel command line. Feb 13 19:46:31.344106 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Feb 13 19:46:31.356505 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Feb 13 19:46:31.363109 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 19:46:31.374874 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 19:46:31.390119 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 19:46:31.412790 kernel: nvme nvme0: pci function 0000:00:04.0 Feb 13 19:46:31.413150 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11 Feb 13 19:46:31.432756 kernel: nvme nvme0: 2/0/0 default/read/poll queues Feb 13 19:46:31.440752 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Feb 13 19:46:31.440955 kernel: GPT:9289727 != 16777215 Feb 13 19:46:31.440983 kernel: GPT:Alternate GPT header not at the end of the disk. Feb 13 19:46:31.441005 kernel: GPT:9289727 != 16777215 Feb 13 19:46:31.441023 kernel: GPT: Use GNU Parted to correct GPT errors. Feb 13 19:46:31.441040 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Feb 13 19:46:31.543988 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/nvme0n1p6 scanned by (udev-worker) (455) Feb 13 19:46:31.614755 kernel: BTRFS: device fsid e9c87d9f-3864-4b45-9be4-80a5397f1fc6 devid 1 transid 38 /dev/nvme0n1p3 scanned by (udev-worker) (456) Feb 13 19:46:31.716068 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 19:46:31.726043 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Feb 13 19:46:31.755167 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Amazon Elastic Block Store EFI-SYSTEM. Feb 13 19:46:31.778206 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Amazon Elastic Block Store ROOT. Feb 13 19:46:31.797876 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Feb 13 19:46:31.812919 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Amazon Elastic Block Store USR-A. Feb 13 19:46:31.815231 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Amazon Elastic Block Store USR-A. Feb 13 19:46:31.831494 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. Feb 13 19:46:31.846999 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Feb 13 19:46:31.893464 disk-uuid[626]: Primary Header is updated. Feb 13 19:46:31.893464 disk-uuid[626]: Secondary Entries is updated. Feb 13 19:46:31.893464 disk-uuid[626]: Secondary Header is updated. Feb 13 19:46:31.901339 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Feb 13 19:46:31.911755 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Feb 13 19:46:32.909961 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Feb 13 19:46:32.911254 disk-uuid[627]: The operation has completed successfully. Feb 13 19:46:33.136709 systemd[1]: disk-uuid.service: Deactivated successfully. 
Feb 13 19:46:33.137054 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Feb 13 19:46:33.153971 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Feb 13 19:46:33.161259 sh[887]: Success Feb 13 19:46:33.183755 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Feb 13 19:46:33.314683 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Feb 13 19:46:33.325093 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Feb 13 19:46:33.330912 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Feb 13 19:46:33.362784 kernel: BTRFS info (device dm-0): first mount of filesystem e9c87d9f-3864-4b45-9be4-80a5397f1fc6 Feb 13 19:46:33.362861 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Feb 13 19:46:33.362882 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Feb 13 19:46:33.362899 kernel: BTRFS info (device dm-0): disabling log replay at mount time Feb 13 19:46:33.363938 kernel: BTRFS info (device dm-0): using free space tree Feb 13 19:46:33.450078 kernel: BTRFS info (device dm-0): enabling ssd optimizations Feb 13 19:46:33.474159 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Feb 13 19:46:33.475833 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Feb 13 19:46:33.489156 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Feb 13 19:46:33.492010 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Feb 13 19:46:33.527910 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 84d576e4-038f-4c76-aa8e-6cfd81e812ea Feb 13 19:46:33.527966 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm Feb 13 19:46:33.527979 kernel: BTRFS info (device nvme0n1p6): using free space tree Feb 13 19:46:33.534755 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Feb 13 19:46:33.547753 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 84d576e4-038f-4c76-aa8e-6cfd81e812ea Feb 13 19:46:33.548401 systemd[1]: mnt-oem.mount: Deactivated successfully. Feb 13 19:46:33.556477 systemd[1]: Finished ignition-setup.service - Ignition (setup). Feb 13 19:46:33.565029 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Feb 13 19:46:33.727001 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Feb 13 19:46:33.740640 systemd[1]: Starting systemd-networkd.service - Network Configuration... Feb 13 19:46:33.776445 systemd-networkd[1079]: lo: Link UP Feb 13 19:46:33.776459 systemd-networkd[1079]: lo: Gained carrier Feb 13 19:46:33.780084 systemd-networkd[1079]: Enumeration completed Feb 13 19:46:33.780570 systemd-networkd[1079]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Feb 13 19:46:33.780577 systemd-networkd[1079]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Feb 13 19:46:33.783482 systemd[1]: Started systemd-networkd.service - Network Configuration. Feb 13 19:46:33.785605 systemd[1]: Reached target network.target - Network. 
Feb 13 19:46:33.792544 systemd-networkd[1079]: eth0: Link UP Feb 13 19:46:33.792549 systemd-networkd[1079]: eth0: Gained carrier Feb 13 19:46:33.792564 systemd-networkd[1079]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Feb 13 19:46:33.802898 systemd-networkd[1079]: eth0: DHCPv4 address 172.31.23.250/20, gateway 172.31.16.1 acquired from 172.31.16.1 Feb 13 19:46:34.037842 ignition[1004]: Ignition 2.20.0 Feb 13 19:46:34.037856 ignition[1004]: Stage: fetch-offline Feb 13 19:46:34.038101 ignition[1004]: no configs at "/usr/lib/ignition/base.d" Feb 13 19:46:34.038115 ignition[1004]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Feb 13 19:46:34.043782 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Feb 13 19:46:34.041789 ignition[1004]: Ignition finished successfully Feb 13 19:46:34.063767 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Feb 13 19:46:34.122830 ignition[1089]: Ignition 2.20.0 Feb 13 19:46:34.122845 ignition[1089]: Stage: fetch Feb 13 19:46:34.123285 ignition[1089]: no configs at "/usr/lib/ignition/base.d" Feb 13 19:46:34.123298 ignition[1089]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Feb 13 19:46:34.123418 ignition[1089]: PUT http://169.254.169.254/latest/api/token: attempt #1 Feb 13 19:46:34.190286 ignition[1089]: PUT result: OK Feb 13 19:46:34.202845 ignition[1089]: parsed url from cmdline: "" Feb 13 19:46:34.202858 ignition[1089]: no config URL provided Feb 13 19:46:34.202871 ignition[1089]: reading system config file "/usr/lib/ignition/user.ign" Feb 13 19:46:34.202888 ignition[1089]: no config at "/usr/lib/ignition/user.ign" Feb 13 19:46:34.202917 ignition[1089]: PUT http://169.254.169.254/latest/api/token: attempt #1 Feb 13 19:46:34.206338 ignition[1089]: PUT result: OK Feb 13 19:46:34.206415 ignition[1089]: GET http://169.254.169.254/2019-10-01/user-data: attempt #1 Feb 13 19:46:34.209557 ignition[1089]: GET result: OK Feb 13 19:46:34.209627 ignition[1089]: parsing config with SHA512: e429bc616aee2270dfb5d52d57e8c677393a108ab51c1e9688bea768e1f365d051f4c7ff791d0a6f6ee49c25bf1f1a8190d9f369e1707f72167de233690cfcbf Feb 13 19:46:34.214198 unknown[1089]: fetched base config from "system" Feb 13 19:46:34.214208 unknown[1089]: fetched base config from "system" Feb 13 19:46:34.215042 ignition[1089]: fetch: fetch complete Feb 13 19:46:34.214215 unknown[1089]: fetched user config from "aws" Feb 13 19:46:34.215049 ignition[1089]: fetch: fetch passed Feb 13 19:46:34.220262 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Feb 13 19:46:34.215120 ignition[1089]: Ignition finished successfully Feb 13 19:46:34.228998 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Feb 13 19:46:34.283948 ignition[1095]: Ignition 2.20.0 Feb 13 19:46:34.283972 ignition[1095]: Stage: kargs Feb 13 19:46:34.284434 ignition[1095]: no configs at "/usr/lib/ignition/base.d" Feb 13 19:46:34.284449 ignition[1095]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Feb 13 19:46:34.284566 ignition[1095]: PUT http://169.254.169.254/latest/api/token: attempt #1 Feb 13 19:46:34.288716 ignition[1095]: PUT result: OK Feb 13 19:46:34.308463 ignition[1095]: kargs: kargs passed Feb 13 19:46:34.310753 ignition[1095]: Ignition finished successfully Feb 13 19:46:34.326948 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Feb 13 19:46:34.350017 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
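The fetch stage above shows Ignition talking to the EC2 instance metadata service: a PUT to http://169.254.169.254/latest/api/token to obtain an IMDSv2 session token, then a GET of http://169.254.169.254/2019-10-01/user-data using that token. Below is a minimal Python sketch of the same two-step flow. It is not Ignition's own implementation (Ignition is a Go binary), and the X-aws-ec2-metadata-token* header names are assumed from the standard IMDSv2 documentation rather than taken from this log.

    import urllib.request

    IMDS = "http://169.254.169.254"

    # Step 1: PUT to the token endpoint (the "PUT http://169.254.169.254/latest/api/token"
    # lines above). The TTL header is required by IMDSv2; 21600 seconds is the maximum.
    token_req = urllib.request.Request(
        IMDS + "/latest/api/token",
        method="PUT",
        headers={"X-aws-ec2-metadata-token-ttl-seconds": "21600"},
    )
    token = urllib.request.urlopen(token_req, timeout=5).read().decode()

    # Step 2: GET the user data (the "GET http://169.254.169.254/2019-10-01/user-data"
    # line above), presenting the session token obtained in step 1.
    data_req = urllib.request.Request(
        IMDS + "/2019-10-01/user-data",
        headers={"X-aws-ec2-metadata-token": token},
    )
    user_data = urllib.request.urlopen(data_req, timeout=5).read()
    print(user_data.decode(errors="replace"))

Run on an EC2 instance, this would print the same user data that Ignition parses above; the log records only its SHA512, not its contents.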
Feb 13 19:46:34.385887 ignition[1101]: Ignition 2.20.0 Feb 13 19:46:34.385918 ignition[1101]: Stage: disks Feb 13 19:46:34.386530 ignition[1101]: no configs at "/usr/lib/ignition/base.d" Feb 13 19:46:34.386545 ignition[1101]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Feb 13 19:46:34.386660 ignition[1101]: PUT http://169.254.169.254/latest/api/token: attempt #1 Feb 13 19:46:34.396810 ignition[1101]: PUT result: OK Feb 13 19:46:34.427991 ignition[1101]: disks: disks passed Feb 13 19:46:34.428175 ignition[1101]: Ignition finished successfully Feb 13 19:46:34.438039 systemd[1]: Finished ignition-disks.service - Ignition (disks). Feb 13 19:46:34.445469 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Feb 13 19:46:34.448652 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Feb 13 19:46:34.451606 systemd[1]: Reached target local-fs.target - Local File Systems. Feb 13 19:46:34.453083 systemd[1]: Reached target sysinit.target - System Initialization. Feb 13 19:46:34.454217 systemd[1]: Reached target basic.target - Basic System. Feb 13 19:46:34.468044 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Feb 13 19:46:34.512323 systemd-fsck[1109]: ROOT: clean, 14/553520 files, 52654/553472 blocks Feb 13 19:46:34.517745 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Feb 13 19:46:34.532012 systemd[1]: Mounting sysroot.mount - /sysroot... Feb 13 19:46:34.708226 kernel: EXT4-fs (nvme0n1p9): mounted filesystem c5993b0e-9201-4b44-aa01-79dc9d6c9fc9 r/w with ordered data mode. Quota mode: none. Feb 13 19:46:34.708969 systemd[1]: Mounted sysroot.mount - /sysroot. Feb 13 19:46:34.710649 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Feb 13 19:46:34.726877 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Feb 13 19:46:34.732361 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Feb 13 19:46:34.743056 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Feb 13 19:46:34.750624 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Feb 13 19:46:34.755446 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Feb 13 19:46:34.767421 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Feb 13 19:46:34.773053 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/nvme0n1p6 scanned by mount (1128) Feb 13 19:46:34.773300 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Feb 13 19:46:34.780444 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 84d576e4-038f-4c76-aa8e-6cfd81e812ea Feb 13 19:46:34.780511 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm Feb 13 19:46:34.780530 kernel: BTRFS info (device nvme0n1p6): using free space tree Feb 13 19:46:34.785752 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Feb 13 19:46:34.788192 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Feb 13 19:46:35.091421 initrd-setup-root[1152]: cut: /sysroot/etc/passwd: No such file or directory Feb 13 19:46:35.122018 initrd-setup-root[1159]: cut: /sysroot/etc/group: No such file or directory Feb 13 19:46:35.134529 initrd-setup-root[1166]: cut: /sysroot/etc/shadow: No such file or directory Feb 13 19:46:35.151720 initrd-setup-root[1173]: cut: /sysroot/etc/gshadow: No such file or directory Feb 13 19:46:35.513072 systemd-networkd[1079]: eth0: Gained IPv6LL Feb 13 19:46:35.524290 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Feb 13 19:46:35.527990 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Feb 13 19:46:35.550874 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Feb 13 19:46:35.565220 systemd[1]: sysroot-oem.mount: Deactivated successfully. Feb 13 19:46:35.566597 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 84d576e4-038f-4c76-aa8e-6cfd81e812ea Feb 13 19:46:35.602553 ignition[1241]: INFO : Ignition 2.20.0 Feb 13 19:46:35.603871 ignition[1241]: INFO : Stage: mount Feb 13 19:46:35.604912 ignition[1241]: INFO : no configs at "/usr/lib/ignition/base.d" Feb 13 19:46:35.608355 ignition[1241]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Feb 13 19:46:35.608355 ignition[1241]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Feb 13 19:46:35.610959 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Feb 13 19:46:35.613672 ignition[1241]: INFO : PUT result: OK Feb 13 19:46:35.615534 ignition[1241]: INFO : mount: mount passed Feb 13 19:46:35.616495 ignition[1241]: INFO : Ignition finished successfully Feb 13 19:46:35.616960 systemd[1]: Finished ignition-mount.service - Ignition (mount). Feb 13 19:46:35.625838 systemd[1]: Starting ignition-files.service - Ignition (files)... Feb 13 19:46:35.714021 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Feb 13 19:46:35.734761 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/nvme0n1p6 scanned by mount (1253) Feb 13 19:46:35.737284 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 84d576e4-038f-4c76-aa8e-6cfd81e812ea Feb 13 19:46:35.737351 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm Feb 13 19:46:35.737372 kernel: BTRFS info (device nvme0n1p6): using free space tree Feb 13 19:46:35.742748 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Feb 13 19:46:35.744646 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Feb 13 19:46:35.776173 ignition[1270]: INFO : Ignition 2.20.0 Feb 13 19:46:35.776173 ignition[1270]: INFO : Stage: files Feb 13 19:46:35.778499 ignition[1270]: INFO : no configs at "/usr/lib/ignition/base.d" Feb 13 19:46:35.778499 ignition[1270]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Feb 13 19:46:35.778499 ignition[1270]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Feb 13 19:46:35.778499 ignition[1270]: INFO : PUT result: OK Feb 13 19:46:35.797761 ignition[1270]: DEBUG : files: compiled without relabeling support, skipping Feb 13 19:46:35.800696 ignition[1270]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Feb 13 19:46:35.800696 ignition[1270]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Feb 13 19:46:35.815158 ignition[1270]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Feb 13 19:46:35.816984 ignition[1270]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Feb 13 19:46:35.818624 unknown[1270]: wrote ssh authorized keys file for user: core Feb 13 19:46:35.820209 ignition[1270]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Feb 13 19:46:35.824579 ignition[1270]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/home/core/install.sh" Feb 13 19:46:35.824579 ignition[1270]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/home/core/install.sh" Feb 13 19:46:35.824579 ignition[1270]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/etc/flatcar/update.conf" Feb 13 19:46:35.824579 ignition[1270]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/etc/flatcar/update.conf" Feb 13 19:46:35.840994 ignition[1270]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Feb 13 19:46:35.840994 ignition[1270]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Feb 13 19:46:35.840994 ignition[1270]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Feb 13 19:46:35.840994 ignition[1270]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1 Feb 13 19:46:36.326851 ignition[1270]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET result: OK Feb 13 19:46:36.941643 ignition[1270]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Feb 13 19:46:36.947566 ignition[1270]: INFO : files: createResultFile: createFiles: op(7): [started] writing file "/sysroot/etc/.ignition-result.json" Feb 13 19:46:36.947566 ignition[1270]: INFO : files: createResultFile: createFiles: op(7): [finished] writing file "/sysroot/etc/.ignition-result.json" Feb 13 19:46:36.947566 ignition[1270]: INFO : files: files passed Feb 13 19:46:36.947566 ignition[1270]: INFO : Ignition finished successfully Feb 13 19:46:36.962077 systemd[1]: Finished ignition-files.service - Ignition (files). 
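The files stage above provisions the "core" user with SSH keys, writes /home/core/install.sh and /etc/flatcar/update.conf, downloads the kubernetes sysext image, and links it into /etc/extensions. That set of operations is typically driven by an Ignition config delivered via EC2 user data. The sketch below builds one plausible shape of such a config as a Python dict and prints it as JSON; the spec version, modes, key material, and file contents are placeholders chosen for illustration, not values recovered from this instance.

    import json

    # Illustrative Ignition-style config (spec 3.x layout assumed); all values are placeholders.
    config = {
        "ignition": {"version": "3.4.0"},
        "passwd": {
            "users": [
                {"name": "core", "sshAuthorizedKeys": ["ssh-ed25519 AAAA... user@host"]},
            ]
        },
        "storage": {
            "files": [
                {"path": "/home/core/install.sh", "mode": 0o755,
                 "contents": {"source": "data:,%23%21%2Fbin%2Fbash%0A"}},
                {"path": "/etc/flatcar/update.conf",
                 "contents": {"source": "data:,GROUP%3Dstable%0A"}},
                {"path": "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw",
                 "contents": {"source": "https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw"}},
            ],
            "links": [
                {"path": "/etc/extensions/kubernetes.raw",
                 "target": "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"},
            ],
        },
    }

    print(json.dumps(config, indent=2))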
Feb 13 19:46:36.974674 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Feb 13 19:46:36.995969 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Feb 13 19:46:37.018566 systemd[1]: ignition-quench.service: Deactivated successfully. Feb 13 19:46:37.018695 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Feb 13 19:46:37.052189 initrd-setup-root-after-ignition[1298]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Feb 13 19:46:37.052189 initrd-setup-root-after-ignition[1298]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Feb 13 19:46:37.058557 initrd-setup-root-after-ignition[1302]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Feb 13 19:46:37.076724 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Feb 13 19:46:37.086481 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Feb 13 19:46:37.101987 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Feb 13 19:46:37.133771 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Feb 13 19:46:37.134005 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Feb 13 19:46:37.136524 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Feb 13 19:46:37.141852 systemd[1]: Reached target initrd.target - Initrd Default Target. Feb 13 19:46:37.143255 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Feb 13 19:46:37.163465 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Feb 13 19:46:37.214756 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Feb 13 19:46:37.229016 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Feb 13 19:46:37.269837 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Feb 13 19:46:37.272519 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Feb 13 19:46:37.276962 systemd[1]: Stopped target timers.target - Timer Units. Feb 13 19:46:37.280401 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Feb 13 19:46:37.282076 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Feb 13 19:46:37.291031 systemd[1]: Stopped target initrd.target - Initrd Default Target. Feb 13 19:46:37.298311 systemd[1]: Stopped target basic.target - Basic System. Feb 13 19:46:37.306765 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Feb 13 19:46:37.318234 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Feb 13 19:46:37.328690 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Feb 13 19:46:37.331684 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Feb 13 19:46:37.334460 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Feb 13 19:46:37.338078 systemd[1]: Stopped target sysinit.target - System Initialization. Feb 13 19:46:37.339767 systemd[1]: Stopped target local-fs.target - Local File Systems. Feb 13 19:46:37.342229 systemd[1]: Stopped target swap.target - Swaps. Feb 13 19:46:37.344684 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Feb 13 19:46:37.344858 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. 
Feb 13 19:46:37.348902 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Feb 13 19:46:37.351410 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Feb 13 19:46:37.354134 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Feb 13 19:46:37.355985 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Feb 13 19:46:37.359627 systemd[1]: dracut-initqueue.service: Deactivated successfully. Feb 13 19:46:37.360685 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Feb 13 19:46:37.363375 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Feb 13 19:46:37.364524 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Feb 13 19:46:37.367176 systemd[1]: ignition-files.service: Deactivated successfully. Feb 13 19:46:37.367287 systemd[1]: Stopped ignition-files.service - Ignition (files). Feb 13 19:46:37.373982 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Feb 13 19:46:37.374983 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Feb 13 19:46:37.375123 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Feb 13 19:46:37.380849 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Feb 13 19:46:37.381774 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Feb 13 19:46:37.383088 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Feb 13 19:46:37.386994 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Feb 13 19:46:37.387218 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Feb 13 19:46:37.399621 ignition[1323]: INFO : Ignition 2.20.0 Feb 13 19:46:37.399621 ignition[1323]: INFO : Stage: umount Feb 13 19:46:37.404624 ignition[1323]: INFO : no configs at "/usr/lib/ignition/base.d" Feb 13 19:46:37.404624 ignition[1323]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Feb 13 19:46:37.404624 ignition[1323]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Feb 13 19:46:37.404624 ignition[1323]: INFO : PUT result: OK Feb 13 19:46:37.399978 systemd[1]: initrd-cleanup.service: Deactivated successfully. Feb 13 19:46:37.410106 ignition[1323]: INFO : umount: umount passed Feb 13 19:46:37.410106 ignition[1323]: INFO : Ignition finished successfully Feb 13 19:46:37.400075 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Feb 13 19:46:37.412105 systemd[1]: ignition-mount.service: Deactivated successfully. Feb 13 19:46:37.412206 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Feb 13 19:46:37.414396 systemd[1]: ignition-disks.service: Deactivated successfully. Feb 13 19:46:37.414442 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Feb 13 19:46:37.416867 systemd[1]: ignition-kargs.service: Deactivated successfully. Feb 13 19:46:37.416919 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Feb 13 19:46:37.417996 systemd[1]: ignition-fetch.service: Deactivated successfully. Feb 13 19:46:37.418035 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Feb 13 19:46:37.419102 systemd[1]: Stopped target network.target - Network. Feb 13 19:46:37.420046 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Feb 13 19:46:37.420157 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). 
Feb 13 19:46:37.421486 systemd[1]: Stopped target paths.target - Path Units. Feb 13 19:46:37.422415 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Feb 13 19:46:37.440401 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Feb 13 19:46:37.444091 systemd[1]: Stopped target slices.target - Slice Units. Feb 13 19:46:37.445366 systemd[1]: Stopped target sockets.target - Socket Units. Feb 13 19:46:37.447861 systemd[1]: iscsid.socket: Deactivated successfully. Feb 13 19:46:37.447926 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Feb 13 19:46:37.456518 systemd[1]: iscsiuio.socket: Deactivated successfully. Feb 13 19:46:37.456575 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Feb 13 19:46:37.478009 systemd[1]: ignition-setup.service: Deactivated successfully. Feb 13 19:46:37.478498 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Feb 13 19:46:37.482288 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Feb 13 19:46:37.482373 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Feb 13 19:46:37.485780 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Feb 13 19:46:37.489919 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Feb 13 19:46:37.493875 systemd[1]: sysroot-boot.mount: Deactivated successfully. Feb 13 19:46:37.495306 systemd[1]: sysroot-boot.service: Deactivated successfully. Feb 13 19:46:37.495470 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Feb 13 19:46:37.497704 systemd[1]: initrd-setup-root.service: Deactivated successfully. Feb 13 19:46:37.498030 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Feb 13 19:46:37.513834 systemd-networkd[1079]: eth0: DHCPv6 lease lost Feb 13 19:46:37.516283 systemd[1]: systemd-networkd.service: Deactivated successfully. Feb 13 19:46:37.517700 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Feb 13 19:46:37.520512 systemd[1]: systemd-networkd.socket: Deactivated successfully. Feb 13 19:46:37.520561 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Feb 13 19:46:37.551042 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Feb 13 19:46:37.559207 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Feb 13 19:46:37.559339 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Feb 13 19:46:37.565525 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Feb 13 19:46:37.583914 systemd[1]: systemd-resolved.service: Deactivated successfully. Feb 13 19:46:37.585449 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Feb 13 19:46:37.599664 systemd[1]: systemd-udevd.service: Deactivated successfully. Feb 13 19:46:37.600095 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Feb 13 19:46:37.612316 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Feb 13 19:46:37.612505 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Feb 13 19:46:37.617698 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Feb 13 19:46:37.617781 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Feb 13 19:46:37.620158 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Feb 13 19:46:37.620238 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. 
Feb 13 19:46:37.625451 systemd[1]: dracut-cmdline.service: Deactivated successfully. Feb 13 19:46:37.625535 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Feb 13 19:46:37.630579 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Feb 13 19:46:37.630647 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Feb 13 19:46:37.649068 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Feb 13 19:46:37.650636 systemd[1]: systemd-sysctl.service: Deactivated successfully. Feb 13 19:46:37.653009 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Feb 13 19:46:37.655388 systemd[1]: systemd-modules-load.service: Deactivated successfully. Feb 13 19:46:37.655457 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Feb 13 19:46:37.658386 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Feb 13 19:46:37.658519 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Feb 13 19:46:37.660736 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Feb 13 19:46:37.660816 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Feb 13 19:46:37.662718 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Feb 13 19:46:37.662799 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 19:46:37.664886 systemd[1]: network-cleanup.service: Deactivated successfully. Feb 13 19:46:37.664992 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Feb 13 19:46:37.668680 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Feb 13 19:46:37.668838 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Feb 13 19:46:37.683193 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Feb 13 19:46:37.692153 systemd[1]: Starting initrd-switch-root.service - Switch Root... Feb 13 19:46:37.724949 systemd[1]: Switching root. Feb 13 19:46:37.798778 systemd-journald[179]: Received SIGTERM from PID 1 (systemd). Feb 13 19:46:37.799085 systemd-journald[179]: Journal stopped Feb 13 19:46:40.535267 kernel: SELinux: policy capability network_peer_controls=1 Feb 13 19:46:40.535368 kernel: SELinux: policy capability open_perms=1 Feb 13 19:46:40.535404 kernel: SELinux: policy capability extended_socket_class=1 Feb 13 19:46:40.535427 kernel: SELinux: policy capability always_check_network=0 Feb 13 19:46:40.535456 kernel: SELinux: policy capability cgroup_seclabel=1 Feb 13 19:46:40.535477 kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 13 19:46:40.535499 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Feb 13 19:46:40.535569 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Feb 13 19:46:40.535606 kernel: audit: type=1403 audit(1739475998.349:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Feb 13 19:46:40.535631 systemd[1]: Successfully loaded SELinux policy in 105.300ms. Feb 13 19:46:40.535664 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 26.619ms. 
Feb 13 19:46:40.535688 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Feb 13 19:46:40.535712 systemd[1]: Detected virtualization amazon. Feb 13 19:46:40.538165 systemd[1]: Detected architecture x86-64. Feb 13 19:46:40.538208 systemd[1]: Detected first boot. Feb 13 19:46:40.538230 systemd[1]: Initializing machine ID from VM UUID. Feb 13 19:46:40.538259 zram_generator::config[1365]: No configuration found. Feb 13 19:46:40.538287 systemd[1]: Populated /etc with preset unit settings. Feb 13 19:46:40.538308 systemd[1]: initrd-switch-root.service: Deactivated successfully. Feb 13 19:46:40.538330 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Feb 13 19:46:40.538350 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Feb 13 19:46:40.538371 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Feb 13 19:46:40.538395 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Feb 13 19:46:40.538415 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Feb 13 19:46:40.538435 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Feb 13 19:46:40.538455 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Feb 13 19:46:40.538476 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Feb 13 19:46:40.538497 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Feb 13 19:46:40.538517 systemd[1]: Created slice user.slice - User and Session Slice. Feb 13 19:46:40.538537 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Feb 13 19:46:40.538558 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Feb 13 19:46:40.538582 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Feb 13 19:46:40.538603 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Feb 13 19:46:40.538624 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Feb 13 19:46:40.538644 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Feb 13 19:46:40.538665 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Feb 13 19:46:40.538687 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Feb 13 19:46:40.538708 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Feb 13 19:46:40.539440 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Feb 13 19:46:40.539472 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Feb 13 19:46:40.539492 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Feb 13 19:46:40.539511 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Feb 13 19:46:40.539530 systemd[1]: Reached target remote-fs.target - Remote File Systems. Feb 13 19:46:40.539548 systemd[1]: Reached target slices.target - Slice Units. Feb 13 19:46:40.539573 systemd[1]: Reached target swap.target - Swaps. 
Feb 13 19:46:40.539593 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Feb 13 19:46:40.539613 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Feb 13 19:46:40.539636 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Feb 13 19:46:40.539655 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Feb 13 19:46:40.539675 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Feb 13 19:46:40.539693 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Feb 13 19:46:40.539712 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Feb 13 19:46:40.539913 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Feb 13 19:46:40.539936 systemd[1]: Mounting media.mount - External Media Directory... Feb 13 19:46:40.539954 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 19:46:40.539972 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Feb 13 19:46:40.539994 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Feb 13 19:46:40.540013 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Feb 13 19:46:40.540032 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Feb 13 19:46:40.541556 systemd[1]: Reached target machines.target - Containers. Feb 13 19:46:40.541596 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Feb 13 19:46:40.541621 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Feb 13 19:46:40.541645 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Feb 13 19:46:40.541666 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Feb 13 19:46:40.541687 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Feb 13 19:46:40.541711 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Feb 13 19:46:40.541768 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Feb 13 19:46:40.541789 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Feb 13 19:46:40.541809 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Feb 13 19:46:40.541830 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Feb 13 19:46:40.541850 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Feb 13 19:46:40.541870 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Feb 13 19:46:40.541890 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Feb 13 19:46:40.541914 systemd[1]: Stopped systemd-fsck-usr.service. Feb 13 19:46:40.541934 systemd[1]: Starting systemd-journald.service - Journal Service... Feb 13 19:46:40.541954 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Feb 13 19:46:40.541975 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Feb 13 19:46:40.541995 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... 
Feb 13 19:46:40.542014 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Feb 13 19:46:40.542122 systemd[1]: verity-setup.service: Deactivated successfully. Feb 13 19:46:40.542147 systemd[1]: Stopped verity-setup.service. Feb 13 19:46:40.542170 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 19:46:40.542196 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Feb 13 19:46:40.542215 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Feb 13 19:46:40.542235 kernel: loop: module loaded Feb 13 19:46:40.542255 systemd[1]: Mounted media.mount - External Media Directory. Feb 13 19:46:40.542274 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Feb 13 19:46:40.542294 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Feb 13 19:46:40.542379 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Feb 13 19:46:40.542403 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Feb 13 19:46:40.542423 kernel: fuse: init (API version 7.39) Feb 13 19:46:40.542442 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Feb 13 19:46:40.542462 systemd[1]: modprobe@configfs.service: Deactivated successfully. Feb 13 19:46:40.542481 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Feb 13 19:46:40.542503 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Feb 13 19:46:40.542523 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Feb 13 19:46:40.542541 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Feb 13 19:46:40.542560 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Feb 13 19:46:40.542615 systemd-journald[1454]: Collecting audit messages is disabled. Feb 13 19:46:40.542656 systemd[1]: modprobe@fuse.service: Deactivated successfully. Feb 13 19:46:40.542676 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Feb 13 19:46:40.542697 systemd[1]: modprobe@loop.service: Deactivated successfully. Feb 13 19:46:40.542721 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Feb 13 19:46:40.543218 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Feb 13 19:46:40.543241 systemd-journald[1454]: Journal started Feb 13 19:46:40.543278 systemd-journald[1454]: Runtime Journal (/run/log/journal/ec205f84d5226aeaf396f965fbaac106) is 4.8M, max 38.6M, 33.7M free. Feb 13 19:46:39.739907 systemd[1]: Queued start job for default target multi-user.target. Feb 13 19:46:39.803542 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6. Feb 13 19:46:40.546058 systemd[1]: Started systemd-journald.service - Journal Service. Feb 13 19:46:39.804023 systemd[1]: systemd-journald.service: Deactivated successfully. Feb 13 19:46:40.551035 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Feb 13 19:46:40.553255 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Feb 13 19:46:40.590892 systemd[1]: Reached target network-pre.target - Preparation for Network. Feb 13 19:46:40.626818 kernel: ACPI: bus type drm_connector registered Feb 13 19:46:40.630206 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... 
Feb 13 19:46:40.638866 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Feb 13 19:46:40.640303 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Feb 13 19:46:40.641960 systemd[1]: Reached target local-fs.target - Local File Systems. Feb 13 19:46:40.645590 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). Feb 13 19:46:40.663012 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Feb 13 19:46:40.672983 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Feb 13 19:46:40.675936 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Feb 13 19:46:40.690039 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Feb 13 19:46:40.695261 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Feb 13 19:46:40.696984 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Feb 13 19:46:40.701119 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Feb 13 19:46:40.703039 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Feb 13 19:46:40.741937 systemd-journald[1454]: Time spent on flushing to /var/log/journal/ec205f84d5226aeaf396f965fbaac106 is 43.064ms for 935 entries. Feb 13 19:46:40.741937 systemd-journald[1454]: System Journal (/var/log/journal/ec205f84d5226aeaf396f965fbaac106) is 8.0M, max 195.6M, 187.6M free. Feb 13 19:46:40.797874 systemd-journald[1454]: Received client request to flush runtime journal. Feb 13 19:46:40.712996 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Feb 13 19:46:40.725960 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Feb 13 19:46:40.744046 systemd[1]: Starting systemd-sysusers.service - Create System Users... Feb 13 19:46:40.749542 systemd[1]: modprobe@drm.service: Deactivated successfully. Feb 13 19:46:40.750476 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Feb 13 19:46:40.753917 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Feb 13 19:46:40.756073 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Feb 13 19:46:40.758013 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Feb 13 19:46:40.762164 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Feb 13 19:46:40.785997 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Feb 13 19:46:40.803246 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Feb 13 19:46:40.808057 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Feb 13 19:46:40.819299 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... Feb 13 19:46:40.822357 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Feb 13 19:46:40.826896 kernel: loop0: detected capacity change from 0 to 140992 Feb 13 19:46:40.862529 udevadm[1497]: systemd-udev-settle.service is deprecated. 
Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. Feb 13 19:46:40.880380 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Feb 13 19:46:40.881454 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. Feb 13 19:46:40.901021 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Feb 13 19:46:40.963779 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Feb 13 19:46:40.977144 systemd[1]: Finished systemd-sysusers.service - Create System Users. Feb 13 19:46:40.986023 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Feb 13 19:46:40.993809 kernel: loop1: detected capacity change from 0 to 62848 Feb 13 19:46:41.035452 systemd-tmpfiles[1510]: ACLs are not supported, ignoring. Feb 13 19:46:41.036202 systemd-tmpfiles[1510]: ACLs are not supported, ignoring. Feb 13 19:46:41.050998 kernel: loop2: detected capacity change from 0 to 210664 Feb 13 19:46:41.061454 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Feb 13 19:46:41.206819 kernel: loop3: detected capacity change from 0 to 138184 Feb 13 19:46:41.339104 kernel: loop4: detected capacity change from 0 to 140992 Feb 13 19:46:41.420768 kernel: loop5: detected capacity change from 0 to 62848 Feb 13 19:46:41.468676 kernel: loop6: detected capacity change from 0 to 210664 Feb 13 19:46:41.526762 kernel: loop7: detected capacity change from 0 to 138184 Feb 13 19:46:41.552861 (sd-merge)[1516]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-ami'. Feb 13 19:46:41.553691 (sd-merge)[1516]: Merged extensions into '/usr'. Feb 13 19:46:41.576401 systemd[1]: Reloading requested from client PID 1490 ('systemd-sysext') (unit systemd-sysext.service)... Feb 13 19:46:41.576780 systemd[1]: Reloading... Feb 13 19:46:41.733762 zram_generator::config[1538]: No configuration found. Feb 13 19:46:42.038887 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 19:46:42.158865 systemd[1]: Reloading finished in 580 ms. Feb 13 19:46:42.229336 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Feb 13 19:46:42.246199 systemd[1]: Starting ensure-sysext.service... Feb 13 19:46:42.271335 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Feb 13 19:46:42.334782 systemd[1]: Reloading requested from client PID 1590 ('systemctl') (unit ensure-sysext.service)... Feb 13 19:46:42.334950 systemd[1]: Reloading... Feb 13 19:46:42.458642 systemd-tmpfiles[1591]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Feb 13 19:46:42.465236 systemd-tmpfiles[1591]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Feb 13 19:46:42.477698 systemd-tmpfiles[1591]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Feb 13 19:46:42.483374 systemd-tmpfiles[1591]: ACLs are not supported, ignoring. Feb 13 19:46:42.487930 systemd-tmpfiles[1591]: ACLs are not supported, ignoring. Feb 13 19:46:42.510940 systemd-tmpfiles[1591]: Detected autofs mount point /boot during canonicalization of boot. 
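The sd-merge entries show systemd-sysext overlaying the extension images it discovered (containerd-flatcar, docker-flatcar, kubernetes, oem-ami) onto /usr, which is why systemd then reloads its unit set. The short sketch below merely lists candidate images in the usual sysext search directories; the directory list is an assumption based on systemd-sysext documentation, and this is not sd-merge's own logic.

    from pathlib import Path

    # Assumed systemd-sysext search paths for extension images.
    SEARCH_PATHS = ("/etc/extensions", "/run/extensions", "/var/lib/extensions")

    def candidate_sysext_images():
        found = []
        for root in SEARCH_PATHS:
            p = Path(root)
            if not p.is_dir():
                continue
            # Raw disk images (*.raw) and plain directory trees are both accepted.
            found += sorted(str(e) for e in p.iterdir() if e.suffix == ".raw" or e.is_dir())
        return found

    print(candidate_sysext_images())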
Feb 13 19:46:42.511922 systemd-tmpfiles[1591]: Skipping /boot Feb 13 19:46:42.524674 zram_generator::config[1616]: No configuration found. Feb 13 19:46:42.564177 systemd-tmpfiles[1591]: Detected autofs mount point /boot during canonicalization of boot. Feb 13 19:46:42.564193 systemd-tmpfiles[1591]: Skipping /boot Feb 13 19:46:42.753155 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 19:46:42.755053 ldconfig[1485]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Feb 13 19:46:42.826975 systemd[1]: Reloading finished in 491 ms. Feb 13 19:46:42.846332 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Feb 13 19:46:42.848158 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Feb 13 19:46:42.854833 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Feb 13 19:46:42.880275 systemd[1]: Starting audit-rules.service - Load Audit Rules... Feb 13 19:46:42.888042 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Feb 13 19:46:42.901058 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Feb 13 19:46:42.907565 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Feb 13 19:46:42.917079 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Feb 13 19:46:42.929114 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Feb 13 19:46:42.947880 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 19:46:42.948195 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Feb 13 19:46:42.962151 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Feb 13 19:46:42.969451 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Feb 13 19:46:42.992237 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Feb 13 19:46:42.994989 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Feb 13 19:46:43.006238 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Feb 13 19:46:43.007716 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 19:46:43.028218 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 19:46:43.029431 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Feb 13 19:46:43.030671 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Feb 13 19:46:43.031552 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 19:46:43.058600 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). 
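The "Duplicate line for path ..." messages from systemd-tmpfiles mean that more than one tmpfiles.d fragment declares the same path and the later declaration is ignored; on Flatcar this is usually harmless. As a rough diagnostic (illustrative only, with simplistic parsing that does not handle quoted paths), one could scan the fragment directories for colliding paths:

    from collections import defaultdict
    from pathlib import Path

    def duplicate_tmpfiles_paths(dirs=("/etc/tmpfiles.d", "/run/tmpfiles.d", "/usr/lib/tmpfiles.d")):
        owners = defaultdict(set)
        for d in dirs:
            root = Path(d)
            if not root.is_dir():
                continue
            for frag in sorted(root.glob("*.conf")):
                for line in frag.read_text(errors="replace").splitlines():
                    line = line.strip()
                    if not line or line.startswith("#"):
                        continue
                    fields = line.split()
                    if len(fields) >= 2:
                        owners[fields[1]].add(str(frag))  # second column is the path
        return {path: sorted(files) for path, files in owners.items() if len(files) > 1}

    for path, files in duplicate_tmpfiles_paths().items():
        print(path, files)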
Feb 13 19:46:43.061895 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Feb 13 19:46:43.079257 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Feb 13 19:46:43.092056 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Feb 13 19:46:43.092846 systemd[1]: Reached target time-set.target - System Time Set. Feb 13 19:46:43.096174 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 19:46:43.104971 systemd[1]: Finished ensure-sysext.service. Feb 13 19:46:43.113210 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Feb 13 19:46:43.135844 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Feb 13 19:46:43.136851 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Feb 13 19:46:43.137043 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Feb 13 19:46:43.138048 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Feb 13 19:46:43.138258 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Feb 13 19:46:43.144505 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Feb 13 19:46:43.157035 systemd[1]: Starting systemd-update-done.service - Update is Completed... Feb 13 19:46:43.159636 systemd[1]: modprobe@drm.service: Deactivated successfully. Feb 13 19:46:43.160758 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Feb 13 19:46:43.167853 systemd[1]: modprobe@loop.service: Deactivated successfully. Feb 13 19:46:43.168490 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Feb 13 19:46:43.174517 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Feb 13 19:46:43.225810 systemd-udevd[1676]: Using default interface naming scheme 'v255'. Feb 13 19:46:43.239701 systemd[1]: Finished systemd-update-done.service - Update is Completed. Feb 13 19:46:43.256302 augenrules[1710]: No rules Feb 13 19:46:43.259575 systemd[1]: audit-rules.service: Deactivated successfully. Feb 13 19:46:43.260469 systemd[1]: Finished audit-rules.service - Load Audit Rules. Feb 13 19:46:43.271560 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Feb 13 19:46:43.275348 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Feb 13 19:46:43.281743 systemd[1]: Started systemd-userdbd.service - User Database Manager. Feb 13 19:46:43.318638 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Feb 13 19:46:43.328961 systemd[1]: Starting systemd-networkd.service - Network Configuration... Feb 13 19:46:43.475054 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Feb 13 19:46:43.484124 systemd-resolved[1675]: Positive Trust Anchors: Feb 13 19:46:43.484146 systemd-resolved[1675]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Feb 13 19:46:43.484207 systemd-resolved[1675]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Feb 13 19:46:43.491173 (udev-worker)[1728]: Network interface NamePolicy= disabled on kernel command line. Feb 13 19:46:43.517195 systemd-resolved[1675]: Defaulting to hostname 'linux'. Feb 13 19:46:43.523528 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Feb 13 19:46:43.525130 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Feb 13 19:46:43.548026 systemd-networkd[1725]: lo: Link UP Feb 13 19:46:43.548039 systemd-networkd[1725]: lo: Gained carrier Feb 13 19:46:43.550862 systemd-networkd[1725]: Enumeration completed Feb 13 19:46:43.550998 systemd[1]: Started systemd-networkd.service - Network Configuration. Feb 13 19:46:43.552637 systemd-networkd[1725]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Feb 13 19:46:43.552647 systemd-networkd[1725]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Feb 13 19:46:43.554676 systemd[1]: Reached target network.target - Network. Feb 13 19:46:43.557218 systemd-networkd[1725]: eth0: Link UP Feb 13 19:46:43.557654 systemd-networkd[1725]: eth0: Gained carrier Feb 13 19:46:43.557683 systemd-networkd[1725]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Feb 13 19:46:43.566746 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Feb 13 19:46:43.573878 systemd-networkd[1725]: eth0: DHCPv4 address 172.31.23.250/20, gateway 172.31.16.1 acquired from 172.31.16.1 Feb 13 19:46:43.675790 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Feb 13 19:46:43.696756 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0xb100, revision 255 Feb 13 19:46:43.707797 kernel: ACPI: button: Power Button [PWRF] Feb 13 19:46:43.711492 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSLPBN:00/input/input4 Feb 13 19:46:43.716088 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input5 Feb 13 19:46:43.717797 kernel: ACPI: button: Sleep Button [SLPF] Feb 13 19:46:43.772775 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 38 scanned by (udev-worker) (1726) Feb 13 19:46:43.979326 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 19:46:44.036770 kernel: mousedev: PS/2 mouse device common for all mice Feb 13 19:46:44.104583 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. Feb 13 19:46:44.116191 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Feb 13 19:46:44.122476 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. 
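The DHCPv4 lease recorded above (172.31.23.250/20 with gateway 172.31.16.1) is internally consistent: the /20 prefix places the address in 172.31.16.0/20, and the gateway sits inside that same network. A quick check with Python's ipaddress module:

    import ipaddress

    # Lease values taken from the systemd-networkd entries above.
    iface = ipaddress.ip_interface("172.31.23.250/20")
    gateway = ipaddress.ip_address("172.31.16.1")

    print(iface.network)             # 172.31.16.0/20
    print(gateway in iface.network)  # True: the gateway is on-link for this prefix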
Feb 13 19:46:44.151090 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Feb 13 19:46:44.210754 lvm[1839]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Feb 13 19:46:44.245209 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Feb 13 19:46:44.394624 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Feb 13 19:46:44.405036 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Feb 13 19:46:44.414911 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 19:46:44.417603 systemd[1]: Reached target sysinit.target - System Initialization. Feb 13 19:46:44.419162 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Feb 13 19:46:44.420669 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Feb 13 19:46:44.422864 systemd[1]: Started logrotate.timer - Daily rotation of log files. Feb 13 19:46:44.424404 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Feb 13 19:46:44.426367 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Feb 13 19:46:44.427440 lvm[1843]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Feb 13 19:46:44.427982 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Feb 13 19:46:44.428039 systemd[1]: Reached target paths.target - Path Units. Feb 13 19:46:44.429262 systemd[1]: Reached target timers.target - Timer Units. Feb 13 19:46:44.432960 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Feb 13 19:46:44.437103 systemd[1]: Starting docker.socket - Docker Socket for the API... Feb 13 19:46:44.445311 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Feb 13 19:46:44.447979 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Feb 13 19:46:44.450040 systemd[1]: Listening on docker.socket - Docker Socket for the API. Feb 13 19:46:44.452419 systemd[1]: Reached target sockets.target - Socket Units. Feb 13 19:46:44.453784 systemd[1]: Reached target basic.target - Basic System. Feb 13 19:46:44.454977 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Feb 13 19:46:44.455012 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Feb 13 19:46:44.462293 systemd[1]: Starting containerd.service - containerd container runtime... Feb 13 19:46:44.466948 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Feb 13 19:46:44.472929 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Feb 13 19:46:44.505310 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Feb 13 19:46:44.519630 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Feb 13 19:46:44.521571 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Feb 13 19:46:44.525978 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Feb 13 19:46:44.535072 systemd[1]: Started ntpd.service - Network Time Service. 
Feb 13 19:46:44.547023 systemd[1]: Starting setup-oem.service - Setup OEM... Feb 13 19:46:44.562024 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Feb 13 19:46:44.608932 jq[1851]: false Feb 13 19:46:44.627465 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Feb 13 19:46:44.650109 systemd[1]: Starting systemd-logind.service - User Login Management... Feb 13 19:46:44.662488 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Feb 13 19:46:44.666653 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Feb 13 19:46:44.672354 systemd[1]: Starting update-engine.service - Update Engine... Feb 13 19:46:44.701196 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Feb 13 19:46:44.713358 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Feb 13 19:46:44.734605 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Feb 13 19:46:44.735343 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Feb 13 19:46:44.913418 dbus-daemon[1850]: [system] SELinux support is enabled Feb 13 19:46:44.923871 systemd[1]: Started dbus.service - D-Bus System Message Bus. Feb 13 19:46:44.944831 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Feb 13 19:46:44.945096 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Feb 13 19:46:44.969450 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Feb 13 19:46:44.969515 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Feb 13 19:46:44.977385 jq[1862]: true Feb 13 19:46:44.977712 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Feb 13 19:46:44.977772 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Feb 13 19:46:44.994906 systemd-networkd[1725]: eth0: Gained IPv6LL Feb 13 19:46:45.004908 dbus-daemon[1850]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.1' (uid=244 pid=1725 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Feb 13 19:46:45.006694 (ntainerd)[1877]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Feb 13 19:46:45.009071 systemd[1]: motdgen.service: Deactivated successfully. Feb 13 19:46:45.009327 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
Feb 13 19:46:45.017127 dbus-daemon[1850]: [system] Successfully activated service 'org.freedesktop.systemd1' Feb 13 19:46:45.037773 jq[1884]: true Feb 13 19:46:45.043428 ntpd[1854]: 13 Feb 19:46:45 ntpd[1854]: ntpd 4.2.8p17@1.4004-o Thu Feb 13 17:07:00 UTC 2025 (1): Starting Feb 13 19:46:45.043428 ntpd[1854]: 13 Feb 19:46:45 ntpd[1854]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Feb 13 19:46:45.043428 ntpd[1854]: 13 Feb 19:46:45 ntpd[1854]: ---------------------------------------------------- Feb 13 19:46:45.043428 ntpd[1854]: 13 Feb 19:46:45 ntpd[1854]: ntp-4 is maintained by Network Time Foundation, Feb 13 19:46:45.043428 ntpd[1854]: 13 Feb 19:46:45 ntpd[1854]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Feb 13 19:46:45.043428 ntpd[1854]: 13 Feb 19:46:45 ntpd[1854]: corporation. Support and training for ntp-4 are Feb 13 19:46:45.043428 ntpd[1854]: 13 Feb 19:46:45 ntpd[1854]: available at https://www.nwtime.org/support Feb 13 19:46:45.043428 ntpd[1854]: 13 Feb 19:46:45 ntpd[1854]: ---------------------------------------------------- Feb 13 19:46:45.043428 ntpd[1854]: 13 Feb 19:46:45 ntpd[1854]: proto: precision = 0.100 usec (-23) Feb 13 19:46:45.043428 ntpd[1854]: 13 Feb 19:46:45 ntpd[1854]: basedate set to 2025-02-01 Feb 13 19:46:45.043428 ntpd[1854]: 13 Feb 19:46:45 ntpd[1854]: gps base set to 2025-02-02 (week 2352) Feb 13 19:46:45.043428 ntpd[1854]: 13 Feb 19:46:45 ntpd[1854]: Listen and drop on 0 v6wildcard [::]:123 Feb 13 19:46:45.043428 ntpd[1854]: 13 Feb 19:46:45 ntpd[1854]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Feb 13 19:46:45.043428 ntpd[1854]: 13 Feb 19:46:45 ntpd[1854]: Listen normally on 2 lo 127.0.0.1:123 Feb 13 19:46:45.043428 ntpd[1854]: 13 Feb 19:46:45 ntpd[1854]: Listen normally on 3 eth0 172.31.23.250:123 Feb 13 19:46:45.043428 ntpd[1854]: 13 Feb 19:46:45 ntpd[1854]: Listen normally on 4 lo [::1]:123 Feb 13 19:46:45.043428 ntpd[1854]: 13 Feb 19:46:45 ntpd[1854]: Listen normally on 5 eth0 [fe80::4b9:5ff:fec0:62f3%2]:123 Feb 13 19:46:45.043428 ntpd[1854]: 13 Feb 19:46:45 ntpd[1854]: Listening on routing socket on fd #22 for interface updates Feb 13 19:46:45.045640 extend-filesystems[1852]: Found loop4 Feb 13 19:46:45.045640 extend-filesystems[1852]: Found loop5 Feb 13 19:46:45.045640 extend-filesystems[1852]: Found loop6 Feb 13 19:46:45.045640 extend-filesystems[1852]: Found loop7 Feb 13 19:46:45.045640 extend-filesystems[1852]: Found nvme0n1 Feb 13 19:46:45.045640 extend-filesystems[1852]: Found nvme0n1p1 Feb 13 19:46:45.045640 extend-filesystems[1852]: Found nvme0n1p2 Feb 13 19:46:45.045640 extend-filesystems[1852]: Found nvme0n1p3 Feb 13 19:46:45.045640 extend-filesystems[1852]: Found usr Feb 13 19:46:45.045640 extend-filesystems[1852]: Found nvme0n1p4 Feb 13 19:46:45.045640 extend-filesystems[1852]: Found nvme0n1p6 Feb 13 19:46:45.045640 extend-filesystems[1852]: Found nvme0n1p7 Feb 13 19:46:45.045640 extend-filesystems[1852]: Found nvme0n1p9 Feb 13 19:46:45.045640 extend-filesystems[1852]: Checking size of /dev/nvme0n1p9 Feb 13 19:46:45.011441 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. 
Feb 13 19:46:45.024705 ntpd[1854]: ntpd 4.2.8p17@1.4004-o Thu Feb 13 17:07:00 UTC 2025 (1): Starting Feb 13 19:46:45.124686 update_engine[1861]: I20250213 19:46:45.045269 1861 main.cc:92] Flatcar Update Engine starting Feb 13 19:46:45.124686 update_engine[1861]: I20250213 19:46:45.088217 1861 update_check_scheduler.cc:74] Next update check in 11m15s Feb 13 19:46:45.124989 ntpd[1854]: 13 Feb 19:46:45 ntpd[1854]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Feb 13 19:46:45.124989 ntpd[1854]: 13 Feb 19:46:45 ntpd[1854]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Feb 13 19:46:45.015882 systemd[1]: Reached target network-online.target - Network is Online. Feb 13 19:46:45.024757 ntpd[1854]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Feb 13 19:46:45.040908 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 19:46:45.024769 ntpd[1854]: ---------------------------------------------------- Feb 13 19:46:45.046215 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Feb 13 19:46:45.024780 ntpd[1854]: ntp-4 is maintained by Network Time Foundation, Feb 13 19:46:45.088071 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Feb 13 19:46:45.024794 ntpd[1854]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Feb 13 19:46:45.103197 systemd[1]: Started update-engine.service - Update Engine. Feb 13 19:46:45.024807 ntpd[1854]: corporation. Support and training for ntp-4 are Feb 13 19:46:45.115068 systemd[1]: Started locksmithd.service - Cluster reboot manager. Feb 13 19:46:45.024817 ntpd[1854]: available at https://www.nwtime.org/support Feb 13 19:46:45.024827 ntpd[1854]: ---------------------------------------------------- Feb 13 19:46:45.030411 ntpd[1854]: proto: precision = 0.100 usec (-23) Feb 13 19:46:45.036746 ntpd[1854]: basedate set to 2025-02-01 Feb 13 19:46:45.036771 ntpd[1854]: gps base set to 2025-02-02 (week 2352) Feb 13 19:46:45.041369 ntpd[1854]: Listen and drop on 0 v6wildcard [::]:123 Feb 13 19:46:45.041428 ntpd[1854]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Feb 13 19:46:45.041622 ntpd[1854]: Listen normally on 2 lo 127.0.0.1:123 Feb 13 19:46:45.041659 ntpd[1854]: Listen normally on 3 eth0 172.31.23.250:123 Feb 13 19:46:45.041701 ntpd[1854]: Listen normally on 4 lo [::1]:123 Feb 13 19:46:45.041767 ntpd[1854]: Listen normally on 5 eth0 [fe80::4b9:5ff:fec0:62f3%2]:123 Feb 13 19:46:45.041803 ntpd[1854]: Listening on routing socket on fd #22 for interface updates Feb 13 19:46:45.094269 ntpd[1854]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Feb 13 19:46:45.094307 ntpd[1854]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Feb 13 19:46:45.153826 systemd[1]: Finished setup-oem.service - Setup OEM. Feb 13 19:46:45.167036 systemd[1]: Started amazon-ssm-agent.service - amazon-ssm-agent. 
Feb 13 19:46:45.174837 extend-filesystems[1852]: Resized partition /dev/nvme0n1p9 Feb 13 19:46:45.178905 coreos-metadata[1849]: Feb 13 19:46:45.173 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Feb 13 19:46:45.179281 coreos-metadata[1849]: Feb 13 19:46:45.179 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-id: Attempt #1 Feb 13 19:46:45.181569 coreos-metadata[1849]: Feb 13 19:46:45.181 INFO Fetch successful Feb 13 19:46:45.181679 coreos-metadata[1849]: Feb 13 19:46:45.181 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-type: Attempt #1 Feb 13 19:46:45.182881 coreos-metadata[1849]: Feb 13 19:46:45.182 INFO Fetch successful Feb 13 19:46:45.183854 coreos-metadata[1849]: Feb 13 19:46:45.183 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/local-ipv4: Attempt #1 Feb 13 19:46:45.186172 coreos-metadata[1849]: Feb 13 19:46:45.186 INFO Fetch successful Feb 13 19:46:45.186275 coreos-metadata[1849]: Feb 13 19:46:45.186 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-ipv4: Attempt #1 Feb 13 19:46:45.196861 extend-filesystems[1911]: resize2fs 1.47.1 (20-May-2024) Feb 13 19:46:45.199024 coreos-metadata[1849]: Feb 13 19:46:45.197 INFO Fetch successful Feb 13 19:46:45.199024 coreos-metadata[1849]: Feb 13 19:46:45.197 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/ipv6: Attempt #1 Feb 13 19:46:45.203710 coreos-metadata[1849]: Feb 13 19:46:45.203 INFO Fetch failed with 404: resource not found Feb 13 19:46:45.203710 coreos-metadata[1849]: Feb 13 19:46:45.203 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone: Attempt #1 Feb 13 19:46:45.205247 coreos-metadata[1849]: Feb 13 19:46:45.204 INFO Fetch successful Feb 13 19:46:45.205247 coreos-metadata[1849]: Feb 13 19:46:45.204 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone-id: Attempt #1 Feb 13 19:46:45.218785 kernel: EXT4-fs (nvme0n1p9): resizing filesystem from 553472 to 1489915 blocks Feb 13 19:46:45.218889 coreos-metadata[1849]: Feb 13 19:46:45.205 INFO Fetch successful Feb 13 19:46:45.218889 coreos-metadata[1849]: Feb 13 19:46:45.205 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/hostname: Attempt #1 Feb 13 19:46:45.219839 coreos-metadata[1849]: Feb 13 19:46:45.219 INFO Fetch successful Feb 13 19:46:45.219839 coreos-metadata[1849]: Feb 13 19:46:45.219 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-hostname: Attempt #1 Feb 13 19:46:45.222465 coreos-metadata[1849]: Feb 13 19:46:45.222 INFO Fetch successful Feb 13 19:46:45.222465 coreos-metadata[1849]: Feb 13 19:46:45.222 INFO Fetching http://169.254.169.254/2021-01-03/dynamic/instance-identity/document: Attempt #1 Feb 13 19:46:45.224670 coreos-metadata[1849]: Feb 13 19:46:45.223 INFO Fetch successful Feb 13 19:46:45.333677 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Feb 13 19:46:45.343249 systemd-logind[1858]: Watching system buttons on /dev/input/event1 (Power Button) Feb 13 19:46:45.343304 systemd-logind[1858]: Watching system buttons on /dev/input/event2 (Sleep Button) Feb 13 19:46:45.343329 systemd-logind[1858]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Feb 13 19:46:45.353188 systemd-logind[1858]: New seat seat0. Feb 13 19:46:45.365521 kernel: EXT4-fs (nvme0n1p9): resized filesystem to 1489915 Feb 13 19:46:45.396854 systemd[1]: Started systemd-logind.service - User Login Management. 
Feb 13 19:46:45.413461 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Feb 13 19:46:45.414131 extend-filesystems[1911]: Filesystem at /dev/nvme0n1p9 is mounted on /; on-line resizing required Feb 13 19:46:45.414131 extend-filesystems[1911]: old_desc_blocks = 1, new_desc_blocks = 1 Feb 13 19:46:45.414131 extend-filesystems[1911]: The filesystem on /dev/nvme0n1p9 is now 1489915 (4k) blocks long. Feb 13 19:46:45.422032 extend-filesystems[1852]: Resized filesystem in /dev/nvme0n1p9 Feb 13 19:46:45.420804 systemd[1]: extend-filesystems.service: Deactivated successfully. Feb 13 19:46:45.421060 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Feb 13 19:46:45.436983 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Feb 13 19:46:45.443945 bash[1934]: Updated "/home/core/.ssh/authorized_keys" Feb 13 19:46:45.445947 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Feb 13 19:46:45.464109 systemd[1]: Starting sshkeys.service... Feb 13 19:46:45.523753 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 38 scanned by (udev-worker) (1723) Feb 13 19:46:45.566860 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Feb 13 19:46:45.597880 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Feb 13 19:46:45.658862 amazon-ssm-agent[1907]: Initializing new seelog logger Feb 13 19:46:45.658862 amazon-ssm-agent[1907]: New Seelog Logger Creation Complete Feb 13 19:46:45.659871 amazon-ssm-agent[1907]: 2025/02/13 19:46:45 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Feb 13 19:46:45.659871 amazon-ssm-agent[1907]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Feb 13 19:46:45.659871 amazon-ssm-agent[1907]: 2025/02/13 19:46:45 processing appconfig overrides Feb 13 19:46:45.671027 amazon-ssm-agent[1907]: 2025/02/13 19:46:45 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Feb 13 19:46:45.671027 amazon-ssm-agent[1907]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Feb 13 19:46:45.671027 amazon-ssm-agent[1907]: 2025/02/13 19:46:45 processing appconfig overrides Feb 13 19:46:45.671027 amazon-ssm-agent[1907]: 2025/02/13 19:46:45 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Feb 13 19:46:45.671027 amazon-ssm-agent[1907]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Feb 13 19:46:45.671027 amazon-ssm-agent[1907]: 2025-02-13 19:46:45 INFO Proxy environment variables: Feb 13 19:46:45.671027 amazon-ssm-agent[1907]: 2025/02/13 19:46:45 processing appconfig overrides Feb 13 19:46:45.685968 amazon-ssm-agent[1907]: 2025/02/13 19:46:45 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Feb 13 19:46:45.685968 amazon-ssm-agent[1907]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Feb 13 19:46:45.685968 amazon-ssm-agent[1907]: 2025/02/13 19:46:45 processing appconfig overrides Feb 13 19:46:45.769473 dbus-daemon[1850]: [system] Successfully activated service 'org.freedesktop.hostname1' Feb 13 19:46:45.769775 systemd[1]: Started systemd-hostnamed.service - Hostname Service. 
Feb 13 19:46:45.772602 dbus-daemon[1850]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.6' (uid=0 pid=1896 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Feb 13 19:46:45.779399 amazon-ssm-agent[1907]: 2025-02-13 19:46:45 INFO http_proxy: Feb 13 19:46:45.784008 systemd[1]: Starting polkit.service - Authorization Manager... Feb 13 19:46:45.876171 polkitd[2009]: Started polkitd version 121 Feb 13 19:46:45.889460 amazon-ssm-agent[1907]: 2025-02-13 19:46:45 INFO no_proxy: Feb 13 19:46:45.915387 polkitd[2009]: Loading rules from directory /etc/polkit-1/rules.d Feb 13 19:46:45.915492 polkitd[2009]: Loading rules from directory /usr/share/polkit-1/rules.d Feb 13 19:46:45.927693 polkitd[2009]: Finished loading, compiling and executing 2 rules Feb 13 19:46:45.936343 dbus-daemon[1850]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Feb 13 19:46:45.937238 systemd[1]: Started polkit.service - Authorization Manager. Feb 13 19:46:45.942316 polkitd[2009]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Feb 13 19:46:45.956658 locksmithd[1900]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Feb 13 19:46:46.015012 amazon-ssm-agent[1907]: 2025-02-13 19:46:45 INFO https_proxy: Feb 13 19:46:46.028684 coreos-metadata[1958]: Feb 13 19:46:46.028 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Feb 13 19:46:46.029916 coreos-metadata[1958]: Feb 13 19:46:46.029 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys: Attempt #1 Feb 13 19:46:46.042179 coreos-metadata[1958]: Feb 13 19:46:46.034 INFO Fetch successful Feb 13 19:46:46.042179 coreos-metadata[1958]: Feb 13 19:46:46.034 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys/0/openssh-key: Attempt #1 Feb 13 19:46:46.042179 coreos-metadata[1958]: Feb 13 19:46:46.040 INFO Fetch successful Feb 13 19:46:46.047152 unknown[1958]: wrote ssh authorized keys file for user: core Feb 13 19:46:46.072173 systemd-hostnamed[1896]: Hostname set to (transient) Feb 13 19:46:46.074626 systemd-resolved[1675]: System hostname changed to 'ip-172-31-23-250'. Feb 13 19:46:46.114757 amazon-ssm-agent[1907]: 2025-02-13 19:46:45 INFO Checking if agent identity type OnPrem can be assumed Feb 13 19:46:46.131028 update-ssh-keys[2064]: Updated "/home/core/.ssh/authorized_keys" Feb 13 19:46:46.130324 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Feb 13 19:46:46.141405 systemd[1]: Finished sshkeys.service. Feb 13 19:46:46.213500 amazon-ssm-agent[1907]: 2025-02-13 19:46:45 INFO Checking if agent identity type EC2 can be assumed Feb 13 19:46:46.312082 amazon-ssm-agent[1907]: 2025-02-13 19:46:46 INFO Agent will take identity from EC2 Feb 13 19:46:46.312885 containerd[1877]: time="2025-02-13T19:46:46.312778279Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23 Feb 13 19:46:46.413231 amazon-ssm-agent[1907]: 2025-02-13 19:46:46 INFO [amazon-ssm-agent] using named pipe channel for IPC Feb 13 19:46:46.452400 containerd[1877]: time="2025-02-13T19:46:46.450848117Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Feb 13 19:46:46.456262 containerd[1877]: time="2025-02-13T19:46:46.456203083Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." 
error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.74-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Feb 13 19:46:46.456262 containerd[1877]: time="2025-02-13T19:46:46.456259039Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Feb 13 19:46:46.456407 containerd[1877]: time="2025-02-13T19:46:46.456287335Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Feb 13 19:46:46.456500 containerd[1877]: time="2025-02-13T19:46:46.456479677Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Feb 13 19:46:46.456563 containerd[1877]: time="2025-02-13T19:46:46.456510406Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Feb 13 19:46:46.456602 containerd[1877]: time="2025-02-13T19:46:46.456587590Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 19:46:46.456640 containerd[1877]: time="2025-02-13T19:46:46.456607455Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Feb 13 19:46:46.457033 containerd[1877]: time="2025-02-13T19:46:46.457001402Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 19:46:46.457104 containerd[1877]: time="2025-02-13T19:46:46.457034627Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Feb 13 19:46:46.457104 containerd[1877]: time="2025-02-13T19:46:46.457053806Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 19:46:46.457104 containerd[1877]: time="2025-02-13T19:46:46.457066868Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Feb 13 19:46:46.457267 containerd[1877]: time="2025-02-13T19:46:46.457183441Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Feb 13 19:46:46.458750 containerd[1877]: time="2025-02-13T19:46:46.457440760Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Feb 13 19:46:46.458750 containerd[1877]: time="2025-02-13T19:46:46.457601668Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 19:46:46.458750 containerd[1877]: time="2025-02-13T19:46:46.457620734Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Feb 13 19:46:46.458750 containerd[1877]: time="2025-02-13T19:46:46.457707732Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." 
type=io.containerd.metadata.v1 Feb 13 19:46:46.458750 containerd[1877]: time="2025-02-13T19:46:46.457778759Z" level=info msg="metadata content store policy set" policy=shared Feb 13 19:46:46.466806 containerd[1877]: time="2025-02-13T19:46:46.465184134Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Feb 13 19:46:46.466806 containerd[1877]: time="2025-02-13T19:46:46.465288740Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Feb 13 19:46:46.466806 containerd[1877]: time="2025-02-13T19:46:46.465313910Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Feb 13 19:46:46.466806 containerd[1877]: time="2025-02-13T19:46:46.465414900Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Feb 13 19:46:46.466806 containerd[1877]: time="2025-02-13T19:46:46.465439671Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Feb 13 19:46:46.466806 containerd[1877]: time="2025-02-13T19:46:46.465683133Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Feb 13 19:46:46.467254 containerd[1877]: time="2025-02-13T19:46:46.467224215Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Feb 13 19:46:46.467463 containerd[1877]: time="2025-02-13T19:46:46.467442923Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Feb 13 19:46:46.467523 containerd[1877]: time="2025-02-13T19:46:46.467470168Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Feb 13 19:46:46.467523 containerd[1877]: time="2025-02-13T19:46:46.467510820Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Feb 13 19:46:46.467604 containerd[1877]: time="2025-02-13T19:46:46.467533176Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Feb 13 19:46:46.467604 containerd[1877]: time="2025-02-13T19:46:46.467553777Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Feb 13 19:46:46.467604 containerd[1877]: time="2025-02-13T19:46:46.467590274Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Feb 13 19:46:46.467709 containerd[1877]: time="2025-02-13T19:46:46.467619687Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Feb 13 19:46:46.467709 containerd[1877]: time="2025-02-13T19:46:46.467660608Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Feb 13 19:46:46.467709 containerd[1877]: time="2025-02-13T19:46:46.467681937Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Feb 13 19:46:46.467709 containerd[1877]: time="2025-02-13T19:46:46.467700250Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Feb 13 19:46:46.469802 containerd[1877]: time="2025-02-13T19:46:46.467718637Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." 
type=io.containerd.service.v1 Feb 13 19:46:46.469909 containerd[1877]: time="2025-02-13T19:46:46.469845269Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Feb 13 19:46:46.469909 containerd[1877]: time="2025-02-13T19:46:46.469876854Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Feb 13 19:46:46.470007 containerd[1877]: time="2025-02-13T19:46:46.469896791Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Feb 13 19:46:46.470007 containerd[1877]: time="2025-02-13T19:46:46.469934599Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Feb 13 19:46:46.470007 containerd[1877]: time="2025-02-13T19:46:46.469954571Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Feb 13 19:46:46.470007 containerd[1877]: time="2025-02-13T19:46:46.469974151Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Feb 13 19:46:46.470162 containerd[1877]: time="2025-02-13T19:46:46.470020852Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Feb 13 19:46:46.470162 containerd[1877]: time="2025-02-13T19:46:46.470043793Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Feb 13 19:46:46.470162 containerd[1877]: time="2025-02-13T19:46:46.470079777Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Feb 13 19:46:46.470162 containerd[1877]: time="2025-02-13T19:46:46.470104848Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Feb 13 19:46:46.470162 containerd[1877]: time="2025-02-13T19:46:46.470129897Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Feb 13 19:46:46.470331 containerd[1877]: time="2025-02-13T19:46:46.470166072Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Feb 13 19:46:46.470331 containerd[1877]: time="2025-02-13T19:46:46.470197049Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Feb 13 19:46:46.470331 containerd[1877]: time="2025-02-13T19:46:46.470234581Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Feb 13 19:46:46.470331 containerd[1877]: time="2025-02-13T19:46:46.470273742Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Feb 13 19:46:46.470331 containerd[1877]: time="2025-02-13T19:46:46.470311426Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Feb 13 19:46:46.470603 containerd[1877]: time="2025-02-13T19:46:46.470329460Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Feb 13 19:46:46.470603 containerd[1877]: time="2025-02-13T19:46:46.470422567Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Feb 13 19:46:46.470681 containerd[1877]: time="2025-02-13T19:46:46.470602192Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." 
error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Feb 13 19:46:46.470681 containerd[1877]: time="2025-02-13T19:46:46.470623828Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Feb 13 19:46:46.470681 containerd[1877]: time="2025-02-13T19:46:46.470642732Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Feb 13 19:46:46.470681 containerd[1877]: time="2025-02-13T19:46:46.470657192Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Feb 13 19:46:46.470681 containerd[1877]: time="2025-02-13T19:46:46.470677878Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Feb 13 19:46:46.470875 containerd[1877]: time="2025-02-13T19:46:46.470693496Z" level=info msg="NRI interface is disabled by configuration." Feb 13 19:46:46.470875 containerd[1877]: time="2025-02-13T19:46:46.470709784Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1 Feb 13 19:46:46.472812 containerd[1877]: time="2025-02-13T19:46:46.471155382Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false 
IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Feb 13 19:46:46.472812 containerd[1877]: time="2025-02-13T19:46:46.471230885Z" level=info msg="Connect containerd service" Feb 13 19:46:46.472812 containerd[1877]: time="2025-02-13T19:46:46.471281142Z" level=info msg="using legacy CRI server" Feb 13 19:46:46.472812 containerd[1877]: time="2025-02-13T19:46:46.471292078Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Feb 13 19:46:46.472812 containerd[1877]: time="2025-02-13T19:46:46.471489875Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Feb 13 19:46:46.476755 containerd[1877]: time="2025-02-13T19:46:46.474010597Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Feb 13 19:46:46.476755 containerd[1877]: time="2025-02-13T19:46:46.474467754Z" level=info msg="Start subscribing containerd event" Feb 13 19:46:46.476755 containerd[1877]: time="2025-02-13T19:46:46.474522908Z" level=info msg="Start recovering state" Feb 13 19:46:46.476755 containerd[1877]: time="2025-02-13T19:46:46.474603728Z" level=info msg="Start event monitor" Feb 13 19:46:46.476755 containerd[1877]: time="2025-02-13T19:46:46.474617527Z" level=info msg="Start snapshots syncer" Feb 13 19:46:46.476755 containerd[1877]: time="2025-02-13T19:46:46.476759693Z" level=info msg="Start cni network conf syncer for default" Feb 13 19:46:46.477048 containerd[1877]: time="2025-02-13T19:46:46.476779075Z" level=info msg="Start streaming server" Feb 13 19:46:46.478388 containerd[1877]: time="2025-02-13T19:46:46.477136082Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Feb 13 19:46:46.478388 containerd[1877]: time="2025-02-13T19:46:46.477203759Z" level=info msg=serving... address=/run/containerd/containerd.sock Feb 13 19:46:46.477380 systemd[1]: Started containerd.service - containerd container runtime. Feb 13 19:46:46.480752 containerd[1877]: time="2025-02-13T19:46:46.479948857Z" level=info msg="containerd successfully booted in 0.172307s" Feb 13 19:46:46.512390 amazon-ssm-agent[1907]: 2025-02-13 19:46:46 INFO [amazon-ssm-agent] using named pipe channel for IPC Feb 13 19:46:46.550395 sshd_keygen[1898]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Feb 13 19:46:46.589235 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Feb 13 19:46:46.602193 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Feb 13 19:46:46.610930 amazon-ssm-agent[1907]: 2025-02-13 19:46:46 INFO [amazon-ssm-agent] using named pipe channel for IPC Feb 13 19:46:46.613151 systemd[1]: Starting issuegen.service - Generate /run/issue... Feb 13 19:46:46.615759 systemd[1]: Started sshd@0-172.31.23.250:22-139.178.89.65:43206.service - OpenSSH per-connection server daemon (139.178.89.65:43206). Feb 13 19:46:46.633038 systemd[1]: issuegen.service: Deactivated successfully. Feb 13 19:46:46.633795 systemd[1]: Finished issuegen.service - Generate /run/issue. Feb 13 19:46:46.644870 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Feb 13 19:46:46.685405 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. 
Feb 13 19:46:46.712937 amazon-ssm-agent[1907]: 2025-02-13 19:46:46 INFO [amazon-ssm-agent] amazon-ssm-agent - v3.2.0.0 Feb 13 19:46:46.719039 systemd[1]: Started getty@tty1.service - Getty on tty1. Feb 13 19:46:46.743413 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Feb 13 19:46:46.746374 systemd[1]: Reached target getty.target - Login Prompts. Feb 13 19:46:46.813434 amazon-ssm-agent[1907]: 2025-02-13 19:46:46 INFO [amazon-ssm-agent] OS: linux, Arch: amd64 Feb 13 19:46:46.915341 amazon-ssm-agent[1907]: 2025-02-13 19:46:46 INFO [amazon-ssm-agent] Starting Core Agent Feb 13 19:46:46.964592 sshd[2082]: Accepted publickey for core from 139.178.89.65 port 43206 ssh2: RSA SHA256:8P+kPxi1I257RCRHId8CcpewLV4ndpYsy+CU1pFADU8 Feb 13 19:46:46.973296 sshd-session[2082]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:46:46.992513 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Feb 13 19:46:47.003677 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Feb 13 19:46:47.013264 systemd-logind[1858]: New session 1 of user core. Feb 13 19:46:47.018754 amazon-ssm-agent[1907]: 2025-02-13 19:46:46 INFO [amazon-ssm-agent] registrar detected. Attempting registration Feb 13 19:46:47.044102 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Feb 13 19:46:47.055936 systemd[1]: Starting user@500.service - User Manager for UID 500... Feb 13 19:46:47.099094 (systemd)[2093]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Feb 13 19:46:47.117756 amazon-ssm-agent[1907]: 2025-02-13 19:46:46 INFO [Registrar] Starting registrar module Feb 13 19:46:47.146093 amazon-ssm-agent[1907]: 2025-02-13 19:46:46 INFO [EC2Identity] no registration info found for ec2 instance, attempting registration Feb 13 19:46:47.146093 amazon-ssm-agent[1907]: 2025-02-13 19:46:47 INFO [EC2Identity] EC2 registration was successful. Feb 13 19:46:47.146093 amazon-ssm-agent[1907]: 2025-02-13 19:46:47 INFO [CredentialRefresher] credentialRefresher has started Feb 13 19:46:47.146093 amazon-ssm-agent[1907]: 2025-02-13 19:46:47 INFO [CredentialRefresher] Starting credentials refresher loop Feb 13 19:46:47.146093 amazon-ssm-agent[1907]: 2025-02-13 19:46:47 INFO EC2RoleProvider Successfully connected with instance profile role credentials Feb 13 19:46:47.218976 amazon-ssm-agent[1907]: 2025-02-13 19:46:47 INFO [CredentialRefresher] Next credential rotation will be in 31.2499886205 minutes Feb 13 19:46:47.485656 systemd[2093]: Queued start job for default target default.target. Feb 13 19:46:47.499110 systemd[2093]: Created slice app.slice - User Application Slice. Feb 13 19:46:47.499167 systemd[2093]: Reached target paths.target - Paths. Feb 13 19:46:47.499189 systemd[2093]: Reached target timers.target - Timers. Feb 13 19:46:47.501242 systemd[2093]: Starting dbus.socket - D-Bus User Message Bus Socket... Feb 13 19:46:47.518771 systemd[2093]: Listening on dbus.socket - D-Bus User Message Bus Socket. Feb 13 19:46:47.518930 systemd[2093]: Reached target sockets.target - Sockets. Feb 13 19:46:47.518952 systemd[2093]: Reached target basic.target - Basic System. Feb 13 19:46:47.519009 systemd[2093]: Reached target default.target - Main User Target. Feb 13 19:46:47.519049 systemd[2093]: Startup finished in 393ms. Feb 13 19:46:47.519277 systemd[1]: Started user@500.service - User Manager for UID 500. 
Feb 13 19:46:47.528091 systemd[1]: Started session-1.scope - Session 1 of User core. Feb 13 19:46:47.709305 systemd[1]: Started sshd@1-172.31.23.250:22-139.178.89.65:43210.service - OpenSSH per-connection server daemon (139.178.89.65:43210). Feb 13 19:46:47.988209 sshd[2104]: Accepted publickey for core from 139.178.89.65 port 43210 ssh2: RSA SHA256:8P+kPxi1I257RCRHId8CcpewLV4ndpYsy+CU1pFADU8 Feb 13 19:46:47.989700 sshd-session[2104]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:46:48.024367 systemd-logind[1858]: New session 2 of user core. Feb 13 19:46:48.037034 systemd[1]: Started session-2.scope - Session 2 of User core. Feb 13 19:46:48.161899 amazon-ssm-agent[1907]: 2025-02-13 19:46:48 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker is not running, starting worker process Feb 13 19:46:48.167594 sshd[2106]: Connection closed by 139.178.89.65 port 43210 Feb 13 19:46:48.170004 sshd-session[2104]: pam_unix(sshd:session): session closed for user core Feb 13 19:46:48.178375 systemd[1]: sshd@1-172.31.23.250:22-139.178.89.65:43210.service: Deactivated successfully. Feb 13 19:46:48.189886 systemd[1]: session-2.scope: Deactivated successfully. Feb 13 19:46:48.194637 systemd-logind[1858]: Session 2 logged out. Waiting for processes to exit. Feb 13 19:46:48.217338 systemd[1]: Started sshd@2-172.31.23.250:22-139.178.89.65:43212.service - OpenSSH per-connection server daemon (139.178.89.65:43212). Feb 13 19:46:48.251385 systemd-logind[1858]: Removed session 2. Feb 13 19:46:48.270798 amazon-ssm-agent[1907]: 2025-02-13 19:46:48 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker (pid:2109) started Feb 13 19:46:48.373693 amazon-ssm-agent[1907]: 2025-02-13 19:46:48 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] Monitor long running worker health every 60 seconds Feb 13 19:46:48.481456 sshd[2113]: Accepted publickey for core from 139.178.89.65 port 43212 ssh2: RSA SHA256:8P+kPxi1I257RCRHId8CcpewLV4ndpYsy+CU1pFADU8 Feb 13 19:46:48.482511 sshd-session[2113]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:46:48.504143 systemd-logind[1858]: New session 3 of user core. Feb 13 19:46:48.512269 systemd[1]: Started session-3.scope - Session 3 of User core. Feb 13 19:46:48.658412 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 19:46:48.664338 sshd[2125]: Connection closed by 139.178.89.65 port 43212 Feb 13 19:46:48.664973 sshd-session[2113]: pam_unix(sshd:session): session closed for user core Feb 13 19:46:48.665771 systemd[1]: Reached target multi-user.target - Multi-User System. Feb 13 19:46:48.682965 systemd[1]: Startup finished in 1.103s (kernel) + 9.439s (initrd) + 10.436s (userspace) = 20.979s. Feb 13 19:46:48.684318 (kubelet)[2132]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 19:46:48.690220 systemd-logind[1858]: Session 3 logged out. Waiting for processes to exit. Feb 13 19:46:48.691091 systemd[1]: sshd@2-172.31.23.250:22-139.178.89.65:43212.service: Deactivated successfully. Feb 13 19:46:48.695047 systemd[1]: session-3.scope: Deactivated successfully. Feb 13 19:46:48.698338 systemd-logind[1858]: Removed session 3. 
Feb 13 19:46:50.573328 kubelet[2132]: E0213 19:46:50.573268 2132 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 19:46:50.577589 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 19:46:50.578775 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 19:46:50.579712 systemd[1]: kubelet.service: Consumed 1.073s CPU time. Feb 13 19:46:58.702384 systemd[1]: Started sshd@3-172.31.23.250:22-139.178.89.65:52408.service - OpenSSH per-connection server daemon (139.178.89.65:52408). Feb 13 19:46:58.882696 sshd[2147]: Accepted publickey for core from 139.178.89.65 port 52408 ssh2: RSA SHA256:8P+kPxi1I257RCRHId8CcpewLV4ndpYsy+CU1pFADU8 Feb 13 19:46:58.884296 sshd-session[2147]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:46:58.889790 systemd-logind[1858]: New session 4 of user core. Feb 13 19:46:58.895012 systemd[1]: Started session-4.scope - Session 4 of User core. Feb 13 19:46:59.025079 sshd[2149]: Connection closed by 139.178.89.65 port 52408 Feb 13 19:46:59.025748 sshd-session[2147]: pam_unix(sshd:session): session closed for user core Feb 13 19:46:59.038444 systemd[1]: sshd@3-172.31.23.250:22-139.178.89.65:52408.service: Deactivated successfully. Feb 13 19:46:59.041363 systemd[1]: session-4.scope: Deactivated successfully. Feb 13 19:46:59.042282 systemd-logind[1858]: Session 4 logged out. Waiting for processes to exit. Feb 13 19:46:59.068384 systemd[1]: Started sshd@4-172.31.23.250:22-139.178.89.65:52416.service - OpenSSH per-connection server daemon (139.178.89.65:52416). Feb 13 19:46:59.083593 systemd-logind[1858]: Removed session 4. Feb 13 19:46:59.295325 sshd[2154]: Accepted publickey for core from 139.178.89.65 port 52416 ssh2: RSA SHA256:8P+kPxi1I257RCRHId8CcpewLV4ndpYsy+CU1pFADU8 Feb 13 19:46:59.296496 sshd-session[2154]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:46:59.312971 systemd-logind[1858]: New session 5 of user core. Feb 13 19:46:59.323006 systemd[1]: Started session-5.scope - Session 5 of User core. Feb 13 19:46:59.450701 sshd[2156]: Connection closed by 139.178.89.65 port 52416 Feb 13 19:46:59.451422 sshd-session[2154]: pam_unix(sshd:session): session closed for user core Feb 13 19:46:59.461274 systemd[1]: sshd@4-172.31.23.250:22-139.178.89.65:52416.service: Deactivated successfully. Feb 13 19:46:59.466139 systemd[1]: session-5.scope: Deactivated successfully. Feb 13 19:46:59.476172 systemd-logind[1858]: Session 5 logged out. Waiting for processes to exit. Feb 13 19:46:59.504986 systemd[1]: Started sshd@5-172.31.23.250:22-139.178.89.65:52424.service - OpenSSH per-connection server daemon (139.178.89.65:52424). Feb 13 19:46:59.506615 systemd-logind[1858]: Removed session 5. Feb 13 19:46:59.737239 sshd[2161]: Accepted publickey for core from 139.178.89.65 port 52424 ssh2: RSA SHA256:8P+kPxi1I257RCRHId8CcpewLV4ndpYsy+CU1pFADU8 Feb 13 19:46:59.741368 sshd-session[2161]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:46:59.766668 systemd-logind[1858]: New session 6 of user core. Feb 13 19:46:59.770930 systemd[1]: Started session-6.scope - Session 6 of User core. 
Feb 13 19:46:59.933207 sshd[2163]: Connection closed by 139.178.89.65 port 52424 Feb 13 19:46:59.934139 sshd-session[2161]: pam_unix(sshd:session): session closed for user core Feb 13 19:46:59.944933 systemd[1]: sshd@5-172.31.23.250:22-139.178.89.65:52424.service: Deactivated successfully. Feb 13 19:46:59.953392 systemd[1]: session-6.scope: Deactivated successfully. Feb 13 19:46:59.961936 systemd-logind[1858]: Session 6 logged out. Waiting for processes to exit. Feb 13 19:46:59.997249 systemd[1]: Started sshd@6-172.31.23.250:22-139.178.89.65:52434.service - OpenSSH per-connection server daemon (139.178.89.65:52434). Feb 13 19:46:59.998675 systemd-logind[1858]: Removed session 6. Feb 13 19:47:00.245849 sshd[2168]: Accepted publickey for core from 139.178.89.65 port 52434 ssh2: RSA SHA256:8P+kPxi1I257RCRHId8CcpewLV4ndpYsy+CU1pFADU8 Feb 13 19:47:00.247290 sshd-session[2168]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:47:00.261604 systemd-logind[1858]: New session 7 of user core. Feb 13 19:47:00.277008 systemd[1]: Started session-7.scope - Session 7 of User core. Feb 13 19:47:00.434577 sudo[2171]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Feb 13 19:47:00.435003 sudo[2171]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 19:47:00.482906 sudo[2171]: pam_unix(sudo:session): session closed for user root Feb 13 19:47:00.510859 sshd[2170]: Connection closed by 139.178.89.65 port 52434 Feb 13 19:47:00.511894 sshd-session[2168]: pam_unix(sshd:session): session closed for user core Feb 13 19:47:00.531971 systemd[1]: sshd@6-172.31.23.250:22-139.178.89.65:52434.service: Deactivated successfully. Feb 13 19:47:00.550703 systemd-logind[1858]: Session 7 logged out. Waiting for processes to exit. Feb 13 19:47:00.554000 systemd[1]: session-7.scope: Deactivated successfully. Feb 13 19:47:00.596188 systemd[1]: Started sshd@7-172.31.23.250:22-139.178.89.65:52440.service - OpenSSH per-connection server daemon (139.178.89.65:52440). Feb 13 19:47:00.600825 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Feb 13 19:47:00.610722 systemd-logind[1858]: Removed session 7. Feb 13 19:47:00.618296 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 19:47:00.826330 sshd[2176]: Accepted publickey for core from 139.178.89.65 port 52440 ssh2: RSA SHA256:8P+kPxi1I257RCRHId8CcpewLV4ndpYsy+CU1pFADU8 Feb 13 19:47:00.829854 sshd-session[2176]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:47:00.841150 systemd-logind[1858]: New session 8 of user core. Feb 13 19:47:00.847081 systemd[1]: Started session-8.scope - Session 8 of User core. Feb 13 19:47:00.975064 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Feb 13 19:47:00.983469 sudo[2187]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Feb 13 19:47:00.984191 sudo[2187]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 19:47:00.996111 (kubelet)[2188]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 19:47:01.000509 sudo[2187]: pam_unix(sudo:session): session closed for user root Feb 13 19:47:01.022030 sudo[2186]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Feb 13 19:47:01.023317 sudo[2186]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 19:47:01.069757 systemd[1]: Starting audit-rules.service - Load Audit Rules... Feb 13 19:47:01.145189 kubelet[2188]: E0213 19:47:01.145142 2188 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 19:47:01.153768 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 19:47:01.153977 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 19:47:01.168170 augenrules[2218]: No rules Feb 13 19:47:01.170534 systemd[1]: audit-rules.service: Deactivated successfully. Feb 13 19:47:01.171065 systemd[1]: Finished audit-rules.service - Load Audit Rules. Feb 13 19:47:01.174399 sudo[2186]: pam_unix(sudo:session): session closed for user root Feb 13 19:47:01.199402 sshd[2181]: Connection closed by 139.178.89.65 port 52440 Feb 13 19:47:01.201126 sshd-session[2176]: pam_unix(sshd:session): session closed for user core Feb 13 19:47:01.206467 systemd[1]: sshd@7-172.31.23.250:22-139.178.89.65:52440.service: Deactivated successfully. Feb 13 19:47:01.210816 systemd[1]: session-8.scope: Deactivated successfully. Feb 13 19:47:01.213143 systemd-logind[1858]: Session 8 logged out. Waiting for processes to exit. Feb 13 19:47:01.217974 systemd-logind[1858]: Removed session 8. Feb 13 19:47:01.230460 systemd[1]: Started sshd@8-172.31.23.250:22-139.178.89.65:52452.service - OpenSSH per-connection server daemon (139.178.89.65:52452). Feb 13 19:47:01.444617 sshd[2226]: Accepted publickey for core from 139.178.89.65 port 52452 ssh2: RSA SHA256:8P+kPxi1I257RCRHId8CcpewLV4ndpYsy+CU1pFADU8 Feb 13 19:47:01.449162 sshd-session[2226]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:47:01.470754 systemd-logind[1858]: New session 9 of user core. Feb 13 19:47:01.477070 systemd[1]: Started session-9.scope - Session 9 of User core. Feb 13 19:47:01.593848 sudo[2229]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Feb 13 19:47:01.594270 sudo[2229]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 19:47:03.890103 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 19:47:03.913427 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 19:47:04.035940 systemd[1]: Reloading requested from client PID 2267 ('systemctl') (unit session-9.scope)... Feb 13 19:47:04.035971 systemd[1]: Reloading... Feb 13 19:47:04.387788 zram_generator::config[2310]: No configuration found. 
Feb 13 19:47:04.662755 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 19:47:04.895802 systemd[1]: Reloading finished in 858 ms. Feb 13 19:47:04.986395 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Feb 13 19:47:04.986504 systemd[1]: kubelet.service: Failed with result 'signal'. Feb 13 19:47:04.986825 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 19:47:05.005218 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 19:47:05.378978 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 19:47:05.408303 (kubelet)[2366]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Feb 13 19:47:05.565756 kubelet[2366]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 19:47:05.565756 kubelet[2366]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 13 19:47:05.565756 kubelet[2366]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 19:47:05.565756 kubelet[2366]: I0213 19:47:05.562672 2366 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 13 19:47:06.582818 kubelet[2366]: I0213 19:47:06.580724 2366 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Feb 13 19:47:06.582818 kubelet[2366]: I0213 19:47:06.580777 2366 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 13 19:47:06.582818 kubelet[2366]: I0213 19:47:06.581065 2366 server.go:927] "Client rotation is on, will bootstrap in background" Feb 13 19:47:06.609706 kubelet[2366]: I0213 19:47:06.609656 2366 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Feb 13 19:47:06.631722 kubelet[2366]: I0213 19:47:06.631664 2366 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Feb 13 19:47:06.632996 kubelet[2366]: I0213 19:47:06.632092 2366 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 13 19:47:06.632996 kubelet[2366]: I0213 19:47:06.632150 2366 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"172.31.23.250","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Feb 13 19:47:06.634978 kubelet[2366]: I0213 19:47:06.634610 2366 topology_manager.go:138] "Creating topology manager with none policy" Feb 13 19:47:06.634978 kubelet[2366]: I0213 19:47:06.634651 2366 container_manager_linux.go:301] "Creating device plugin manager" Feb 13 19:47:06.634978 kubelet[2366]: I0213 19:47:06.634835 2366 state_mem.go:36] "Initialized new in-memory state store" Feb 13 19:47:06.636544 kubelet[2366]: I0213 19:47:06.636165 2366 kubelet.go:400] "Attempting to sync node with API server" Feb 13 19:47:06.636544 kubelet[2366]: I0213 19:47:06.636201 2366 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 13 19:47:06.636544 kubelet[2366]: I0213 19:47:06.636233 2366 kubelet.go:312] "Adding apiserver pod source" Feb 13 19:47:06.636544 kubelet[2366]: I0213 19:47:06.636258 2366 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 13 19:47:06.641481 kubelet[2366]: E0213 19:47:06.641073 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:06.641481 kubelet[2366]: E0213 19:47:06.641465 2366 file.go:98] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:06.643574 kubelet[2366]: I0213 19:47:06.643519 2366 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Feb 13 19:47:06.645447 kubelet[2366]: I0213 19:47:06.645421 2366 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 13 19:47:06.645563 kubelet[2366]: W0213 19:47:06.645497 2366 probe.go:272] Flexvolume plugin 
directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Feb 13 19:47:06.646265 kubelet[2366]: I0213 19:47:06.646180 2366 server.go:1264] "Started kubelet" Feb 13 19:47:06.648116 kubelet[2366]: I0213 19:47:06.647620 2366 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 13 19:47:06.659984 kubelet[2366]: I0213 19:47:06.659913 2366 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 13 19:47:06.661899 kubelet[2366]: I0213 19:47:06.661772 2366 server.go:455] "Adding debug handlers to kubelet server" Feb 13 19:47:06.665693 kubelet[2366]: I0213 19:47:06.665640 2366 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 13 19:47:06.665937 kubelet[2366]: I0213 19:47:06.665907 2366 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 13 19:47:06.670163 kubelet[2366]: I0213 19:47:06.668923 2366 volume_manager.go:291] "Starting Kubelet Volume Manager" Feb 13 19:47:06.670163 kubelet[2366]: I0213 19:47:06.669510 2366 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Feb 13 19:47:06.670163 kubelet[2366]: I0213 19:47:06.669587 2366 reconciler.go:26] "Reconciler: start to sync state" Feb 13 19:47:06.671610 kubelet[2366]: I0213 19:47:06.671581 2366 factory.go:221] Registration of the systemd container factory successfully Feb 13 19:47:06.671764 kubelet[2366]: I0213 19:47:06.671710 2366 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Feb 13 19:47:06.673711 kubelet[2366]: E0213 19:47:06.673692 2366 kubelet.go:1467] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Feb 13 19:47:06.674116 kubelet[2366]: I0213 19:47:06.674095 2366 factory.go:221] Registration of the containerd container factory successfully Feb 13 19:47:06.686760 kubelet[2366]: W0213 19:47:06.686203 2366 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Feb 13 19:47:06.686760 kubelet[2366]: E0213 19:47:06.686284 2366 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Feb 13 19:47:06.686760 kubelet[2366]: W0213 19:47:06.686703 2366 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "172.31.23.250" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 13 19:47:06.687062 kubelet[2366]: E0213 19:47:06.687044 2366 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes "172.31.23.250" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 13 19:47:06.690358 kubelet[2366]: I0213 19:47:06.690335 2366 cpu_manager.go:214] "Starting CPU manager" policy="none" Feb 13 19:47:06.690783 kubelet[2366]: I0213 19:47:06.690598 2366 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Feb 13 19:47:06.690783 kubelet[2366]: I0213 19:47:06.690627 2366 state_mem.go:36] "Initialized new in-memory state store" Feb 13 19:47:06.698273 kubelet[2366]: E0213 19:47:06.697653 2366 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"172.31.23.250\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Feb 13 19:47:06.698273 kubelet[2366]: W0213 19:47:06.697806 2366 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Feb 13 19:47:06.698273 kubelet[2366]: E0213 19:47:06.697837 2366 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Feb 13 19:47:06.698273 kubelet[2366]: E0213 19:47:06.697956 2366 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{172.31.23.250.1823dc3ae78d3621 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:172.31.23.250,UID:172.31.23.250,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:172.31.23.250,},FirstTimestamp:2025-02-13 19:47:06.646148641 +0000 UTC m=+1.225129916,LastTimestamp:2025-02-13 19:47:06.646148641 +0000 UTC m=+1.225129916,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:172.31.23.250,}" Feb 13 19:47:06.702122 kubelet[2366]: I0213 19:47:06.702082 2366 policy_none.go:49] "None policy: Start" Feb 13 19:47:06.704176 kubelet[2366]: I0213 19:47:06.703603 2366 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 13 19:47:06.704176 kubelet[2366]: I0213 19:47:06.703641 2366 state_mem.go:35] "Initializing new in-memory state store" Feb 13 19:47:06.705248 kubelet[2366]: E0213 19:47:06.705141 2366 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{172.31.23.250.1823dc3ae9312b4d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:172.31.23.250,UID:172.31.23.250,APIVersion:,ResourceVersion:,FieldPath:,},Reason:InvalidDiskCapacity,Message:invalid capacity 0 on image filesystem,Source:EventSource{Component:kubelet,Host:172.31.23.250,},FirstTimestamp:2025-02-13 19:47:06.673670989 +0000 UTC m=+1.252652262,LastTimestamp:2025-02-13 19:47:06.673670989 +0000 UTC m=+1.252652262,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:172.31.23.250,}" Feb 13 19:47:06.711953 kubelet[2366]: E0213 19:47:06.711526 2366 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{172.31.23.250.1823dc3aea2084ad default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:172.31.23.250,UID:172.31.23.250,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node 172.31.23.250 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:172.31.23.250,},FirstTimestamp:2025-02-13 19:47:06.689356973 +0000 UTC m=+1.268338244,LastTimestamp:2025-02-13 19:47:06.689356973 +0000 UTC m=+1.268338244,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:172.31.23.250,}" Feb 13 19:47:06.716429 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Feb 13 19:47:06.728509 kubelet[2366]: E0213 19:47:06.728394 2366 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{172.31.23.250.1823dc3aea209f7e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:172.31.23.250,UID:172.31.23.250,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node 172.31.23.250 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:172.31.23.250,},FirstTimestamp:2025-02-13 19:47:06.689363838 +0000 UTC m=+1.268345109,LastTimestamp:2025-02-13 19:47:06.689363838 +0000 UTC m=+1.268345109,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:172.31.23.250,}" Feb 13 19:47:06.736455 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
Feb 13 19:47:06.744231 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Feb 13 19:47:06.745868 kubelet[2366]: E0213 19:47:06.745758 2366 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{172.31.23.250.1823dc3aea20b167 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:172.31.23.250,UID:172.31.23.250,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node 172.31.23.250 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:172.31.23.250,},FirstTimestamp:2025-02-13 19:47:06.689368423 +0000 UTC m=+1.268349694,LastTimestamp:2025-02-13 19:47:06.689368423 +0000 UTC m=+1.268349694,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:172.31.23.250,}" Feb 13 19:47:06.753109 kubelet[2366]: I0213 19:47:06.753076 2366 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 13 19:47:06.753352 kubelet[2366]: I0213 19:47:06.753307 2366 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 13 19:47:06.753831 kubelet[2366]: I0213 19:47:06.753773 2366 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 13 19:47:06.761257 kubelet[2366]: E0213 19:47:06.761166 2366 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"172.31.23.250\" not found" Feb 13 19:47:06.766458 kubelet[2366]: E0213 19:47:06.766350 2366 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{172.31.23.250.1823dc3aee3034f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:172.31.23.250,UID:172.31.23.250,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:172.31.23.250,},FirstTimestamp:2025-02-13 19:47:06.757494004 +0000 UTC m=+1.336475279,LastTimestamp:2025-02-13 19:47:06.757494004 +0000 UTC m=+1.336475279,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:172.31.23.250,}" Feb 13 19:47:06.770601 kubelet[2366]: I0213 19:47:06.770538 2366 kubelet_node_status.go:73] "Attempting to register node" node="172.31.23.250" Feb 13 19:47:06.795858 kubelet[2366]: E0213 19:47:06.795453 2366 event.go:359] "Server rejected event (will not retry!)" err="events \"172.31.23.250.1823dc3aea2084ad\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{172.31.23.250.1823dc3aea2084ad default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:172.31.23.250,UID:172.31.23.250,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node 172.31.23.250 status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:172.31.23.250,},FirstTimestamp:2025-02-13 19:47:06.689356973 +0000 UTC m=+1.268338244,LastTimestamp:2025-02-13 19:47:06.770485379 +0000 UTC m=+1.349466647,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:172.31.23.250,}" Feb 13 19:47:06.796073 kubelet[2366]: E0213 19:47:06.796046 2366 kubelet_node_status.go:96] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="172.31.23.250" Feb 13 19:47:06.809887 kubelet[2366]: E0213 19:47:06.809511 2366 event.go:359] "Server rejected event (will not retry!)" err="events \"172.31.23.250.1823dc3aea209f7e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{172.31.23.250.1823dc3aea209f7e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:172.31.23.250,UID:172.31.23.250,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node 172.31.23.250 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:172.31.23.250,},FirstTimestamp:2025-02-13 19:47:06.689363838 +0000 UTC m=+1.268345109,LastTimestamp:2025-02-13 19:47:06.770500594 +0000 UTC m=+1.349481863,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:172.31.23.250,}" Feb 13 19:47:06.840353 kubelet[2366]: I0213 19:47:06.840221 2366 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 13 19:47:06.843313 kubelet[2366]: I0213 19:47:06.843276 2366 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Feb 13 19:47:06.843313 kubelet[2366]: I0213 19:47:06.843314 2366 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 13 19:47:06.843448 kubelet[2366]: I0213 19:47:06.843339 2366 kubelet.go:2337] "Starting kubelet main sync loop" Feb 13 19:47:06.843448 kubelet[2366]: E0213 19:47:06.843393 2366 kubelet.go:2361] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Feb 13 19:47:06.911336 kubelet[2366]: E0213 19:47:06.911293 2366 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"172.31.23.250\" not found" node="172.31.23.250" Feb 13 19:47:06.997818 kubelet[2366]: I0213 19:47:06.997775 2366 kubelet_node_status.go:73] "Attempting to register node" node="172.31.23.250" Feb 13 19:47:07.040580 kubelet[2366]: I0213 19:47:07.040406 2366 kubelet_node_status.go:76] "Successfully registered node" node="172.31.23.250" Feb 13 19:47:07.137095 kubelet[2366]: E0213 19:47:07.137044 2366 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"172.31.23.250\" not found" Feb 13 19:47:07.237963 kubelet[2366]: E0213 19:47:07.237914 2366 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"172.31.23.250\" not found" Feb 13 19:47:07.338553 kubelet[2366]: E0213 19:47:07.338501 2366 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"172.31.23.250\" not found" Feb 13 19:47:07.428864 sudo[2229]: pam_unix(sudo:session): session closed for user root Feb 13 19:47:07.439118 kubelet[2366]: E0213 19:47:07.439073 2366 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"172.31.23.250\" not found" Feb 13 19:47:07.450968 sshd[2228]: Connection closed by 139.178.89.65 port 52452 Feb 13 19:47:07.452007 sshd-session[2226]: pam_unix(sshd:session): session closed for user core Feb 13 19:47:07.464262 systemd[1]: sshd@8-172.31.23.250:22-139.178.89.65:52452.service: Deactivated successfully. Feb 13 19:47:07.472723 systemd[1]: session-9.scope: Deactivated successfully. Feb 13 19:47:07.475299 systemd-logind[1858]: Session 9 logged out. Waiting for processes to exit. Feb 13 19:47:07.479224 systemd-logind[1858]: Removed session 9. 
Feb 13 19:47:07.540299 kubelet[2366]: E0213 19:47:07.540243 2366 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"172.31.23.250\" not found" Feb 13 19:47:07.587086 kubelet[2366]: I0213 19:47:07.586937 2366 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 13 19:47:07.587588 kubelet[2366]: W0213 19:47:07.587232 2366 reflector.go:470] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 13 19:47:07.641480 kubelet[2366]: E0213 19:47:07.641434 2366 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"172.31.23.250\" not found" Feb 13 19:47:07.641480 kubelet[2366]: E0213 19:47:07.641434 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:07.742431 kubelet[2366]: E0213 19:47:07.742296 2366 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"172.31.23.250\" not found" Feb 13 19:47:07.842743 kubelet[2366]: E0213 19:47:07.842675 2366 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"172.31.23.250\" not found" Feb 13 19:47:07.943793 kubelet[2366]: E0213 19:47:07.943745 2366 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"172.31.23.250\" not found" Feb 13 19:47:08.044926 kubelet[2366]: I0213 19:47:08.044821 2366 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.1.0/24" Feb 13 19:47:08.045354 containerd[1877]: time="2025-02-13T19:47:08.045189372Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
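The kuberuntime_manager entry just above (and the kubelet_network entry that follows) record the node being assigned pod CIDR 192.168.1.0/24 while containerd waits for a CNI config to be dropped in. A small sketch, using only the CIDR value from the log, of what that per-node allocation provides:

    package main

    import (
        "fmt"
        "net"
    )

    func main() {
        // Pod CIDR taken from the kubelet log above.
        _, ipnet, err := net.ParseCIDR("192.168.1.0/24")
        if err != nil {
            panic(err)
        }
        ones, bits := ipnet.Mask.Size()
        // A /24 yields 2^(32-24) = 256 addresses for pod IPs on this node.
        fmt.Printf("network %s, %d addresses\n", ipnet, 1<<(bits-ones))
    }

Until a CNI configuration is actually written (the log suggests Calico is expected to provide it), the "cni plugin not initialized" pod_workers errors further down keep recurring.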
Feb 13 19:47:08.045893 kubelet[2366]: I0213 19:47:08.045866 2366 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.1.0/24" Feb 13 19:47:08.642529 kubelet[2366]: E0213 19:47:08.642466 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:08.644036 kubelet[2366]: I0213 19:47:08.643999 2366 apiserver.go:52] "Watching apiserver" Feb 13 19:47:08.651656 kubelet[2366]: I0213 19:47:08.651590 2366 topology_manager.go:215] "Topology Admit Handler" podUID="c05d6fd2-5985-4e1b-a8a3-69c183de4523" podNamespace="calico-system" podName="calico-node-r44cb" Feb 13 19:47:08.651832 kubelet[2366]: I0213 19:47:08.651752 2366 topology_manager.go:215] "Topology Admit Handler" podUID="b75d0e53-a7c0-48db-98b8-b75d74f4d4d5" podNamespace="calico-system" podName="csi-node-driver-9th5q" Feb 13 19:47:08.651891 kubelet[2366]: I0213 19:47:08.651845 2366 topology_manager.go:215] "Topology Admit Handler" podUID="a75a8cc7-dbae-4420-bc5f-b36d1b42dbb7" podNamespace="kube-system" podName="kube-proxy-pdxdl" Feb 13 19:47:08.652763 kubelet[2366]: E0213 19:47:08.652303 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9th5q" podUID="b75d0e53-a7c0-48db-98b8-b75d74f4d4d5" Feb 13 19:47:08.660277 systemd[1]: Created slice kubepods-besteffort-poda75a8cc7_dbae_4420_bc5f_b36d1b42dbb7.slice - libcontainer container kubepods-besteffort-poda75a8cc7_dbae_4420_bc5f_b36d1b42dbb7.slice. Feb 13 19:47:08.672261 kubelet[2366]: I0213 19:47:08.672231 2366 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Feb 13 19:47:08.686566 kubelet[2366]: I0213 19:47:08.685308 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/c05d6fd2-5985-4e1b-a8a3-69c183de4523-xtables-lock\") pod \"calico-node-r44cb\" (UID: \"c05d6fd2-5985-4e1b-a8a3-69c183de4523\") " pod="calico-system/calico-node-r44cb" Feb 13 19:47:08.686566 kubelet[2366]: I0213 19:47:08.686466 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/c05d6fd2-5985-4e1b-a8a3-69c183de4523-node-certs\") pod \"calico-node-r44cb\" (UID: \"c05d6fd2-5985-4e1b-a8a3-69c183de4523\") " pod="calico-system/calico-node-r44cb" Feb 13 19:47:08.686566 kubelet[2366]: I0213 19:47:08.686529 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/c05d6fd2-5985-4e1b-a8a3-69c183de4523-var-lib-calico\") pod \"calico-node-r44cb\" (UID: \"c05d6fd2-5985-4e1b-a8a3-69c183de4523\") " pod="calico-system/calico-node-r44cb" Feb 13 19:47:08.694020 kubelet[2366]: I0213 19:47:08.687292 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npkwr\" (UniqueName: \"kubernetes.io/projected/c05d6fd2-5985-4e1b-a8a3-69c183de4523-kube-api-access-npkwr\") pod \"calico-node-r44cb\" (UID: \"c05d6fd2-5985-4e1b-a8a3-69c183de4523\") " pod="calico-system/calico-node-r44cb" Feb 13 19:47:08.694020 kubelet[2366]: I0213 19:47:08.690337 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"varrun\" (UniqueName: \"kubernetes.io/host-path/b75d0e53-a7c0-48db-98b8-b75d74f4d4d5-varrun\") pod \"csi-node-driver-9th5q\" (UID: \"b75d0e53-a7c0-48db-98b8-b75d74f4d4d5\") " pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:08.694020 kubelet[2366]: I0213 19:47:08.692362 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzcjd\" (UniqueName: \"kubernetes.io/projected/a75a8cc7-dbae-4420-bc5f-b36d1b42dbb7-kube-api-access-jzcjd\") pod \"kube-proxy-pdxdl\" (UID: \"a75a8cc7-dbae-4420-bc5f-b36d1b42dbb7\") " pod="kube-system/kube-proxy-pdxdl" Feb 13 19:47:08.695350 kubelet[2366]: I0213 19:47:08.695018 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c05d6fd2-5985-4e1b-a8a3-69c183de4523-tigera-ca-bundle\") pod \"calico-node-r44cb\" (UID: \"c05d6fd2-5985-4e1b-a8a3-69c183de4523\") " pod="calico-system/calico-node-r44cb" Feb 13 19:47:08.695350 kubelet[2366]: I0213 19:47:08.695087 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/c05d6fd2-5985-4e1b-a8a3-69c183de4523-var-run-calico\") pod \"calico-node-r44cb\" (UID: \"c05d6fd2-5985-4e1b-a8a3-69c183de4523\") " pod="calico-system/calico-node-r44cb" Feb 13 19:47:08.695350 kubelet[2366]: I0213 19:47:08.695127 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/c05d6fd2-5985-4e1b-a8a3-69c183de4523-cni-log-dir\") pod \"calico-node-r44cb\" (UID: \"c05d6fd2-5985-4e1b-a8a3-69c183de4523\") " pod="calico-system/calico-node-r44cb" Feb 13 19:47:08.701979 systemd[1]: Created slice kubepods-besteffort-podc05d6fd2_5985_4e1b_a8a3_69c183de4523.slice - libcontainer container kubepods-besteffort-podc05d6fd2_5985_4e1b_a8a3_69c183de4523.slice. 
Feb 13 19:47:08.705683 kubelet[2366]: I0213 19:47:08.705216 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/c05d6fd2-5985-4e1b-a8a3-69c183de4523-flexvol-driver-host\") pod \"calico-node-r44cb\" (UID: \"c05d6fd2-5985-4e1b-a8a3-69c183de4523\") " pod="calico-system/calico-node-r44cb" Feb 13 19:47:08.705683 kubelet[2366]: I0213 19:47:08.705292 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b75d0e53-a7c0-48db-98b8-b75d74f4d4d5-registration-dir\") pod \"csi-node-driver-9th5q\" (UID: \"b75d0e53-a7c0-48db-98b8-b75d74f4d4d5\") " pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:08.705683 kubelet[2366]: I0213 19:47:08.705317 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c05d6fd2-5985-4e1b-a8a3-69c183de4523-lib-modules\") pod \"calico-node-r44cb\" (UID: \"c05d6fd2-5985-4e1b-a8a3-69c183de4523\") " pod="calico-system/calico-node-r44cb" Feb 13 19:47:08.705683 kubelet[2366]: I0213 19:47:08.705339 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/c05d6fd2-5985-4e1b-a8a3-69c183de4523-cni-net-dir\") pod \"calico-node-r44cb\" (UID: \"c05d6fd2-5985-4e1b-a8a3-69c183de4523\") " pod="calico-system/calico-node-r44cb" Feb 13 19:47:08.705683 kubelet[2366]: I0213 19:47:08.705388 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b75d0e53-a7c0-48db-98b8-b75d74f4d4d5-socket-dir\") pod \"csi-node-driver-9th5q\" (UID: \"b75d0e53-a7c0-48db-98b8-b75d74f4d4d5\") " pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:08.708783 kubelet[2366]: I0213 19:47:08.705413 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a75a8cc7-dbae-4420-bc5f-b36d1b42dbb7-xtables-lock\") pod \"kube-proxy-pdxdl\" (UID: \"a75a8cc7-dbae-4420-bc5f-b36d1b42dbb7\") " pod="kube-system/kube-proxy-pdxdl" Feb 13 19:47:08.708783 kubelet[2366]: I0213 19:47:08.705458 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a75a8cc7-dbae-4420-bc5f-b36d1b42dbb7-lib-modules\") pod \"kube-proxy-pdxdl\" (UID: \"a75a8cc7-dbae-4420-bc5f-b36d1b42dbb7\") " pod="kube-system/kube-proxy-pdxdl" Feb 13 19:47:08.709061 kubelet[2366]: I0213 19:47:08.705498 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/c05d6fd2-5985-4e1b-a8a3-69c183de4523-policysync\") pod \"calico-node-r44cb\" (UID: \"c05d6fd2-5985-4e1b-a8a3-69c183de4523\") " pod="calico-system/calico-node-r44cb" Feb 13 19:47:08.709110 kubelet[2366]: I0213 19:47:08.709060 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/c05d6fd2-5985-4e1b-a8a3-69c183de4523-cni-bin-dir\") pod \"calico-node-r44cb\" (UID: \"c05d6fd2-5985-4e1b-a8a3-69c183de4523\") " pod="calico-system/calico-node-r44cb" Feb 13 19:47:08.709164 kubelet[2366]: I0213 19:47:08.709104 2366 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b75d0e53-a7c0-48db-98b8-b75d74f4d4d5-kubelet-dir\") pod \"csi-node-driver-9th5q\" (UID: \"b75d0e53-a7c0-48db-98b8-b75d74f4d4d5\") " pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:08.709164 kubelet[2366]: I0213 19:47:08.709145 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jxtg\" (UniqueName: \"kubernetes.io/projected/b75d0e53-a7c0-48db-98b8-b75d74f4d4d5-kube-api-access-2jxtg\") pod \"csi-node-driver-9th5q\" (UID: \"b75d0e53-a7c0-48db-98b8-b75d74f4d4d5\") " pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:08.709242 kubelet[2366]: I0213 19:47:08.709171 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/a75a8cc7-dbae-4420-bc5f-b36d1b42dbb7-kube-proxy\") pod \"kube-proxy-pdxdl\" (UID: \"a75a8cc7-dbae-4420-bc5f-b36d1b42dbb7\") " pod="kube-system/kube-proxy-pdxdl" Feb 13 19:47:08.872472 kubelet[2366]: E0213 19:47:08.872247 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:08.872472 kubelet[2366]: W0213 19:47:08.872291 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:08.872472 kubelet[2366]: E0213 19:47:08.872319 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:08.874016 kubelet[2366]: E0213 19:47:08.873980 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:08.874386 kubelet[2366]: W0213 19:47:08.874238 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:08.874386 kubelet[2366]: E0213 19:47:08.874273 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:08.875992 kubelet[2366]: E0213 19:47:08.875974 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:08.876280 kubelet[2366]: W0213 19:47:08.876095 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:08.876280 kubelet[2366]: E0213 19:47:08.876123 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:47:08.880238 kubelet[2366]: E0213 19:47:08.880215 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:08.881326 kubelet[2366]: W0213 19:47:08.880369 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:08.881326 kubelet[2366]: E0213 19:47:08.880398 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:08.907023 kubelet[2366]: E0213 19:47:08.898811 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:08.907023 kubelet[2366]: W0213 19:47:08.898845 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:08.907023 kubelet[2366]: E0213 19:47:08.898871 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:08.907023 kubelet[2366]: E0213 19:47:08.902193 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:08.907023 kubelet[2366]: W0213 19:47:08.902213 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:08.907023 kubelet[2366]: E0213 19:47:08.902249 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:08.907023 kubelet[2366]: E0213 19:47:08.905815 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:08.907023 kubelet[2366]: W0213 19:47:08.905834 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:08.907023 kubelet[2366]: E0213 19:47:08.905858 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:08.912771 kubelet[2366]: E0213 19:47:08.909329 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:08.912771 kubelet[2366]: W0213 19:47:08.909368 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:08.912771 kubelet[2366]: E0213 19:47:08.909393 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:47:08.975439 containerd[1877]: time="2025-02-13T19:47:08.975397912Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-pdxdl,Uid:a75a8cc7-dbae-4420-bc5f-b36d1b42dbb7,Namespace:kube-system,Attempt:0,}" Feb 13 19:47:09.027933 containerd[1877]: time="2025-02-13T19:47:09.027891031Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-r44cb,Uid:c05d6fd2-5985-4e1b-a8a3-69c183de4523,Namespace:calico-system,Attempt:0,}" Feb 13 19:47:09.623440 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3531076766.mount: Deactivated successfully. Feb 13 19:47:09.643852 containerd[1877]: time="2025-02-13T19:47:09.642476774Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 19:47:09.644419 kubelet[2366]: E0213 19:47:09.642988 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:09.644845 containerd[1877]: time="2025-02-13T19:47:09.644407713Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 19:47:09.646698 containerd[1877]: time="2025-02-13T19:47:09.646488792Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312056" Feb 13 19:47:09.648410 containerd[1877]: time="2025-02-13T19:47:09.648358227Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Feb 13 19:47:09.649679 containerd[1877]: time="2025-02-13T19:47:09.649555739Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 19:47:09.653762 containerd[1877]: time="2025-02-13T19:47:09.653673421Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 19:47:09.656236 containerd[1877]: time="2025-02-13T19:47:09.655129269Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 626.904814ms" Feb 13 19:47:09.657295 containerd[1877]: time="2025-02-13T19:47:09.657250100Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 681.733254ms" Feb 13 19:47:09.844932 kubelet[2366]: E0213 19:47:09.844457 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9th5q" podUID="b75d0e53-a7c0-48db-98b8-b75d74f4d4d5" 
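The blocks of driver-call.go / plugins.go errors above and below repeat because the kubelet periodically probes /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ for FlexVolume drivers and finds the nodeagent~uds directory without a usable uds binary, so each probe produces no output and the empty string then fails JSON decoding. A sketch that reproduces the same two error strings, using a stand-in command name that is deliberately not on $PATH (not the kubelet's actual probing code):

    package main

    import (
        "encoding/json"
        "fmt"
        "os/exec"
    )

    // driverStatus is an illustrative, minimal shape of the JSON a FlexVolume
    // driver is expected to print; the real schema has more fields.
    type driverStatus struct {
        Status  string `json:"status"`
        Message string `json:"message,omitempty"`
    }

    func main() {
        // "uds" stands in for the missing driver binary; the lookup fails with
        // "executable file not found in $PATH", matching the log.
        out, err := exec.Command("uds", "init").Output()
        if err != nil {
            fmt.Println("driver call failed:", err)
        }
        // Decoding the empty output fails with the other message from the log:
        // "unexpected end of JSON input".
        var st driverStatus
        if err := json.Unmarshal(out, &st); err != nil {
            fmt.Println("unmarshal failed:", err)
        }
    }

These probes would be expected to quiet down once a working uds binary is installed under the nodeagent~uds directory; until then the kubelet keeps logging the same triplet on every dynamic plugin probe.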
Feb 13 19:47:10.096671 containerd[1877]: time="2025-02-13T19:47:10.088952563Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:47:10.096671 containerd[1877]: time="2025-02-13T19:47:10.093996410Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:47:10.096671 containerd[1877]: time="2025-02-13T19:47:10.094070645Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:47:10.096671 containerd[1877]: time="2025-02-13T19:47:10.096470823Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:47:10.096671 containerd[1877]: time="2025-02-13T19:47:10.094832055Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:47:10.096671 containerd[1877]: time="2025-02-13T19:47:10.094889003Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:47:10.096671 containerd[1877]: time="2025-02-13T19:47:10.094905609Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:47:10.096671 containerd[1877]: time="2025-02-13T19:47:10.094997580Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:47:10.305429 systemd[1]: run-containerd-runc-k8s.io-c41487c001c0685ee25a0077a5e6dbc4586a54f232c04e500432faa04090c57b-runc.l7gi7z.mount: Deactivated successfully. Feb 13 19:47:10.326478 systemd[1]: Started cri-containerd-19a9d43bfcc83540fd9cb8f18001a2f39ef44bae8de8dd3fe6c375559d9b19b6.scope - libcontainer container 19a9d43bfcc83540fd9cb8f18001a2f39ef44bae8de8dd3fe6c375559d9b19b6. Feb 13 19:47:10.330465 systemd[1]: Started cri-containerd-c41487c001c0685ee25a0077a5e6dbc4586a54f232c04e500432faa04090c57b.scope - libcontainer container c41487c001c0685ee25a0077a5e6dbc4586a54f232c04e500432faa04090c57b. Feb 13 19:47:10.407539 containerd[1877]: time="2025-02-13T19:47:10.407499382Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-pdxdl,Uid:a75a8cc7-dbae-4420-bc5f-b36d1b42dbb7,Namespace:kube-system,Attempt:0,} returns sandbox id \"19a9d43bfcc83540fd9cb8f18001a2f39ef44bae8de8dd3fe6c375559d9b19b6\"" Feb 13 19:47:10.412561 containerd[1877]: time="2025-02-13T19:47:10.412420003Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.10\"" Feb 13 19:47:10.415774 containerd[1877]: time="2025-02-13T19:47:10.414546080Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-r44cb,Uid:c05d6fd2-5985-4e1b-a8a3-69c183de4523,Namespace:calico-system,Attempt:0,} returns sandbox id \"c41487c001c0685ee25a0077a5e6dbc4586a54f232c04e500432faa04090c57b\"" Feb 13 19:47:10.643402 kubelet[2366]: E0213 19:47:10.643347 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:11.644378 kubelet[2366]: E0213 19:47:11.644343 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:11.824282 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2674618522.mount: Deactivated successfully. 
Feb 13 19:47:11.844550 kubelet[2366]: E0213 19:47:11.844300 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9th5q" podUID="b75d0e53-a7c0-48db-98b8-b75d74f4d4d5" Feb 13 19:47:12.513428 containerd[1877]: time="2025-02-13T19:47:12.513350857Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:12.514851 containerd[1877]: time="2025-02-13T19:47:12.514685990Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.10: active requests=0, bytes read=29057858" Feb 13 19:47:12.515868 containerd[1877]: time="2025-02-13T19:47:12.515828678Z" level=info msg="ImageCreate event name:\"sha256:a21d1b47e857207628486a387f670f224051a16b74b06a1b76d07a96e738ab54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:12.518414 containerd[1877]: time="2025-02-13T19:47:12.518357094Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:d112e804e548fce28d9f1e3282c9ce54e374451e6a2c41b1ca9d7fca5d1fcc48\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:12.519505 containerd[1877]: time="2025-02-13T19:47:12.519101676Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.10\" with image id \"sha256:a21d1b47e857207628486a387f670f224051a16b74b06a1b76d07a96e738ab54\", repo tag \"registry.k8s.io/kube-proxy:v1.30.10\", repo digest \"registry.k8s.io/kube-proxy@sha256:d112e804e548fce28d9f1e3282c9ce54e374451e6a2c41b1ca9d7fca5d1fcc48\", size \"29056877\" in 2.106240008s" Feb 13 19:47:12.519505 containerd[1877]: time="2025-02-13T19:47:12.519140746Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.10\" returns image reference \"sha256:a21d1b47e857207628486a387f670f224051a16b74b06a1b76d07a96e738ab54\"" Feb 13 19:47:12.520910 containerd[1877]: time="2025-02-13T19:47:12.520577295Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\"" Feb 13 19:47:12.527215 containerd[1877]: time="2025-02-13T19:47:12.527171965Z" level=info msg="CreateContainer within sandbox \"19a9d43bfcc83540fd9cb8f18001a2f39ef44bae8de8dd3fe6c375559d9b19b6\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Feb 13 19:47:12.548833 containerd[1877]: time="2025-02-13T19:47:12.548787180Z" level=info msg="CreateContainer within sandbox \"19a9d43bfcc83540fd9cb8f18001a2f39ef44bae8de8dd3fe6c375559d9b19b6\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"65eb8727a155bf4f668b0bc5115ad0ab4bc2b29671fcf95647f0c810ad6dff32\"" Feb 13 19:47:12.549594 containerd[1877]: time="2025-02-13T19:47:12.549558678Z" level=info msg="StartContainer for \"65eb8727a155bf4f668b0bc5115ad0ab4bc2b29671fcf95647f0c810ad6dff32\"" Feb 13 19:47:12.615987 systemd[1]: Started cri-containerd-65eb8727a155bf4f668b0bc5115ad0ab4bc2b29671fcf95647f0c810ad6dff32.scope - libcontainer container 65eb8727a155bf4f668b0bc5115ad0ab4bc2b29671fcf95647f0c810ad6dff32. 
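The pod_startup_latency_tracker entry that follows is consistent with the pull timing recorded here: the kube-proxy image pull ran from firstStartedPulling 19:47:10.411510779 to lastFinishedPulling 19:47:12.520414279, an image-pull window of about 2.1089 s (in line with the "2.106240008s" reported for the 29,056,877-byte image). The end-to-end figure is 19:47:12.979565092 minus the pod creation timestamp 19:47:07, i.e. 5.979565092 s, and subtracting the pull window gives 5.9796 - 2.1089 ≈ 3.8707 s, which matches the reported podStartSLOduration=3.870661583 (the SLO duration excludes image-pull time).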
Feb 13 19:47:12.650268 kubelet[2366]: E0213 19:47:12.649463 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:12.684759 containerd[1877]: time="2025-02-13T19:47:12.683593770Z" level=info msg="StartContainer for \"65eb8727a155bf4f668b0bc5115ad0ab4bc2b29671fcf95647f0c810ad6dff32\" returns successfully" Feb 13 19:47:12.981915 kubelet[2366]: I0213 19:47:12.979591 2366 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-pdxdl" podStartSLOduration=3.870661583 podStartE2EDuration="5.979565092s" podCreationTimestamp="2025-02-13 19:47:07 +0000 UTC" firstStartedPulling="2025-02-13 19:47:10.411510779 +0000 UTC m=+4.990492036" lastFinishedPulling="2025-02-13 19:47:12.520414279 +0000 UTC m=+7.099395545" observedRunningTime="2025-02-13 19:47:12.975806923 +0000 UTC m=+7.554788196" watchObservedRunningTime="2025-02-13 19:47:12.979565092 +0000 UTC m=+7.558546368" Feb 13 19:47:13.018710 kubelet[2366]: E0213 19:47:13.018673 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.018710 kubelet[2366]: W0213 19:47:13.018706 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.018931 kubelet[2366]: E0213 19:47:13.018841 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.019539 kubelet[2366]: E0213 19:47:13.019422 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.019539 kubelet[2366]: W0213 19:47:13.019444 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.019689 kubelet[2366]: E0213 19:47:13.019547 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.024638 kubelet[2366]: E0213 19:47:13.024590 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.024638 kubelet[2366]: W0213 19:47:13.024768 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.024985 kubelet[2366]: E0213 19:47:13.024896 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:47:13.026262 kubelet[2366]: E0213 19:47:13.026086 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.026262 kubelet[2366]: W0213 19:47:13.026107 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.026262 kubelet[2366]: E0213 19:47:13.026129 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.027392 kubelet[2366]: E0213 19:47:13.027087 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.027392 kubelet[2366]: W0213 19:47:13.027107 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.027392 kubelet[2366]: E0213 19:47:13.027126 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.028322 kubelet[2366]: E0213 19:47:13.028063 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.028322 kubelet[2366]: W0213 19:47:13.028088 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.028322 kubelet[2366]: E0213 19:47:13.028108 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.028517 kubelet[2366]: E0213 19:47:13.028343 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.028517 kubelet[2366]: W0213 19:47:13.028353 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.028517 kubelet[2366]: E0213 19:47:13.028365 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.028650 kubelet[2366]: E0213 19:47:13.028591 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.028650 kubelet[2366]: W0213 19:47:13.028601 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.028650 kubelet[2366]: E0213 19:47:13.028613 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:47:13.031862 kubelet[2366]: E0213 19:47:13.028900 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.031862 kubelet[2366]: W0213 19:47:13.028914 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.031862 kubelet[2366]: E0213 19:47:13.028942 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.031862 kubelet[2366]: E0213 19:47:13.029164 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.031862 kubelet[2366]: W0213 19:47:13.029174 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.031862 kubelet[2366]: E0213 19:47:13.029186 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.031862 kubelet[2366]: E0213 19:47:13.029413 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.031862 kubelet[2366]: W0213 19:47:13.029423 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.031862 kubelet[2366]: E0213 19:47:13.029435 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.031862 kubelet[2366]: E0213 19:47:13.029652 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.038985 kubelet[2366]: W0213 19:47:13.031691 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.038985 kubelet[2366]: E0213 19:47:13.031711 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.038985 kubelet[2366]: E0213 19:47:13.032998 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.038985 kubelet[2366]: W0213 19:47:13.033015 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.038985 kubelet[2366]: E0213 19:47:13.033032 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:47:13.039211 kubelet[2366]: E0213 19:47:13.039147 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.039211 kubelet[2366]: W0213 19:47:13.039170 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.039211 kubelet[2366]: E0213 19:47:13.039198 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.040484 kubelet[2366]: E0213 19:47:13.039565 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.040484 kubelet[2366]: W0213 19:47:13.039583 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.040484 kubelet[2366]: E0213 19:47:13.039601 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.040484 kubelet[2366]: E0213 19:47:13.039874 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.040484 kubelet[2366]: W0213 19:47:13.039885 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.040484 kubelet[2366]: E0213 19:47:13.039898 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.040484 kubelet[2366]: E0213 19:47:13.040112 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.040484 kubelet[2366]: W0213 19:47:13.040124 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.040484 kubelet[2366]: E0213 19:47:13.040135 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.040484 kubelet[2366]: E0213 19:47:13.040352 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.041481 kubelet[2366]: W0213 19:47:13.040362 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.041481 kubelet[2366]: E0213 19:47:13.040375 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:47:13.046824 kubelet[2366]: E0213 19:47:13.046170 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.046824 kubelet[2366]: W0213 19:47:13.046597 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.046824 kubelet[2366]: E0213 19:47:13.046632 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.051641 kubelet[2366]: E0213 19:47:13.050753 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.051641 kubelet[2366]: W0213 19:47:13.051269 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.051641 kubelet[2366]: E0213 19:47:13.051601 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.056459 kubelet[2366]: E0213 19:47:13.056202 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.056459 kubelet[2366]: W0213 19:47:13.056241 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.056459 kubelet[2366]: E0213 19:47:13.056268 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.057097 kubelet[2366]: E0213 19:47:13.056669 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.057097 kubelet[2366]: W0213 19:47:13.056955 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.057754 kubelet[2366]: E0213 19:47:13.056987 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.057754 kubelet[2366]: E0213 19:47:13.057688 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.057754 kubelet[2366]: W0213 19:47:13.057710 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.058565 kubelet[2366]: E0213 19:47:13.058488 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:47:13.058831 kubelet[2366]: E0213 19:47:13.058619 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.058831 kubelet[2366]: W0213 19:47:13.058629 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.058831 kubelet[2366]: E0213 19:47:13.058785 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.059366 kubelet[2366]: E0213 19:47:13.059249 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.059366 kubelet[2366]: W0213 19:47:13.059265 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.059366 kubelet[2366]: E0213 19:47:13.059297 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.063031 kubelet[2366]: E0213 19:47:13.062782 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.063031 kubelet[2366]: W0213 19:47:13.062805 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.063031 kubelet[2366]: E0213 19:47:13.062851 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.063472 kubelet[2366]: E0213 19:47:13.063435 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.063472 kubelet[2366]: W0213 19:47:13.063453 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.063992 kubelet[2366]: E0213 19:47:13.063649 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.064307 kubelet[2366]: E0213 19:47:13.064293 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.064394 kubelet[2366]: W0213 19:47:13.064380 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.064490 kubelet[2366]: E0213 19:47:13.064471 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:47:13.064784 kubelet[2366]: E0213 19:47:13.064772 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.064999 kubelet[2366]: W0213 19:47:13.064855 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.064999 kubelet[2366]: E0213 19:47:13.064876 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.065656 kubelet[2366]: E0213 19:47:13.065562 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.065656 kubelet[2366]: W0213 19:47:13.065577 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.066077 kubelet[2366]: E0213 19:47:13.065803 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.072034 kubelet[2366]: E0213 19:47:13.072000 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.072393 kubelet[2366]: W0213 19:47:13.072188 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.072393 kubelet[2366]: E0213 19:47:13.072228 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.077625 kubelet[2366]: E0213 19:47:13.077485 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.077625 kubelet[2366]: W0213 19:47:13.077514 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.077625 kubelet[2366]: E0213 19:47:13.077548 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:47:13.650179 kubelet[2366]: E0213 19:47:13.650125 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:13.844012 kubelet[2366]: E0213 19:47:13.843957 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9th5q" podUID="b75d0e53-a7c0-48db-98b8-b75d74f4d4d5" Feb 13 19:47:13.965055 kubelet[2366]: E0213 19:47:13.960546 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.965055 kubelet[2366]: W0213 19:47:13.960574 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.965055 kubelet[2366]: E0213 19:47:13.960600 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.965055 kubelet[2366]: E0213 19:47:13.960830 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.965055 kubelet[2366]: W0213 19:47:13.960839 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.965055 kubelet[2366]: E0213 19:47:13.960850 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.965055 kubelet[2366]: E0213 19:47:13.961037 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.965055 kubelet[2366]: W0213 19:47:13.961046 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.965055 kubelet[2366]: E0213 19:47:13.961056 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.965055 kubelet[2366]: E0213 19:47:13.961243 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.965644 kubelet[2366]: W0213 19:47:13.961251 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.965644 kubelet[2366]: E0213 19:47:13.961261 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:47:13.965644 kubelet[2366]: E0213 19:47:13.961472 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.965644 kubelet[2366]: W0213 19:47:13.961481 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.965644 kubelet[2366]: E0213 19:47:13.961493 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.965644 kubelet[2366]: E0213 19:47:13.961669 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.965644 kubelet[2366]: W0213 19:47:13.961677 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.965644 kubelet[2366]: E0213 19:47:13.961687 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.965644 kubelet[2366]: E0213 19:47:13.961886 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.965644 kubelet[2366]: W0213 19:47:13.961895 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.966131 kubelet[2366]: E0213 19:47:13.961907 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.966131 kubelet[2366]: E0213 19:47:13.963982 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.966131 kubelet[2366]: W0213 19:47:13.964005 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.966131 kubelet[2366]: E0213 19:47:13.964026 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.966131 kubelet[2366]: E0213 19:47:13.964471 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.966131 kubelet[2366]: W0213 19:47:13.964482 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.966131 kubelet[2366]: E0213 19:47:13.964496 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:47:13.966131 kubelet[2366]: E0213 19:47:13.964697 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.966131 kubelet[2366]: W0213 19:47:13.964706 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.966131 kubelet[2366]: E0213 19:47:13.964717 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.966608 kubelet[2366]: E0213 19:47:13.964938 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.966608 kubelet[2366]: W0213 19:47:13.964947 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.966608 kubelet[2366]: E0213 19:47:13.964960 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.966608 kubelet[2366]: E0213 19:47:13.965165 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.966608 kubelet[2366]: W0213 19:47:13.965173 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.966608 kubelet[2366]: E0213 19:47:13.965185 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.966608 kubelet[2366]: E0213 19:47:13.965438 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.966608 kubelet[2366]: W0213 19:47:13.965447 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.966608 kubelet[2366]: E0213 19:47:13.965460 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.966608 kubelet[2366]: E0213 19:47:13.965666 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.967366 kubelet[2366]: W0213 19:47:13.965677 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.967366 kubelet[2366]: E0213 19:47:13.965689 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:47:13.967366 kubelet[2366]: E0213 19:47:13.965911 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.967366 kubelet[2366]: W0213 19:47:13.965921 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.967366 kubelet[2366]: E0213 19:47:13.965932 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.967366 kubelet[2366]: E0213 19:47:13.966178 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.967366 kubelet[2366]: W0213 19:47:13.966188 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.967366 kubelet[2366]: E0213 19:47:13.966200 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.967366 kubelet[2366]: E0213 19:47:13.966444 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.967366 kubelet[2366]: W0213 19:47:13.966454 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.967813 kubelet[2366]: E0213 19:47:13.966466 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.967813 kubelet[2366]: E0213 19:47:13.966669 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.967813 kubelet[2366]: W0213 19:47:13.966679 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.967813 kubelet[2366]: E0213 19:47:13.966691 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.967813 kubelet[2366]: E0213 19:47:13.966973 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.967813 kubelet[2366]: W0213 19:47:13.966988 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.967813 kubelet[2366]: E0213 19:47:13.967002 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:47:13.967813 kubelet[2366]: E0213 19:47:13.967207 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.967813 kubelet[2366]: W0213 19:47:13.967217 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.967813 kubelet[2366]: E0213 19:47:13.967228 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.968254 kubelet[2366]: E0213 19:47:13.967478 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.968254 kubelet[2366]: W0213 19:47:13.967488 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.968254 kubelet[2366]: E0213 19:47:13.967499 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.968254 kubelet[2366]: E0213 19:47:13.967785 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.968254 kubelet[2366]: W0213 19:47:13.967794 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.968254 kubelet[2366]: E0213 19:47:13.967810 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.968254 kubelet[2366]: E0213 19:47:13.968046 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.968254 kubelet[2366]: W0213 19:47:13.968056 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.968254 kubelet[2366]: E0213 19:47:13.968084 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.968629 kubelet[2366]: E0213 19:47:13.968308 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.968629 kubelet[2366]: W0213 19:47:13.968318 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.968629 kubelet[2366]: E0213 19:47:13.968341 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:47:13.968629 kubelet[2366]: E0213 19:47:13.968537 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.968629 kubelet[2366]: W0213 19:47:13.968547 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.968629 kubelet[2366]: E0213 19:47:13.968562 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.978805 kubelet[2366]: E0213 19:47:13.968833 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.978805 kubelet[2366]: W0213 19:47:13.968843 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.978805 kubelet[2366]: E0213 19:47:13.968923 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.978805 kubelet[2366]: E0213 19:47:13.969257 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.978805 kubelet[2366]: W0213 19:47:13.969265 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.978805 kubelet[2366]: E0213 19:47:13.969280 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.978805 kubelet[2366]: E0213 19:47:13.969490 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.978805 kubelet[2366]: W0213 19:47:13.969498 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.978805 kubelet[2366]: E0213 19:47:13.969518 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.978805 kubelet[2366]: E0213 19:47:13.977404 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.979110 kubelet[2366]: W0213 19:47:13.977428 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.979110 kubelet[2366]: E0213 19:47:13.977474 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:47:13.979110 kubelet[2366]: E0213 19:47:13.977894 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.979110 kubelet[2366]: W0213 19:47:13.977906 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.979110 kubelet[2366]: E0213 19:47:13.977996 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.979110 kubelet[2366]: E0213 19:47:13.978475 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.979110 kubelet[2366]: W0213 19:47:13.978494 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.979110 kubelet[2366]: E0213 19:47:13.978514 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:13.979110 kubelet[2366]: E0213 19:47:13.978852 2366 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:47:13.979110 kubelet[2366]: W0213 19:47:13.978862 2366 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:47:13.979333 kubelet[2366]: E0213 19:47:13.978875 2366 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:47:14.202211 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2025143071.mount: Deactivated successfully. 
Feb 13 19:47:14.363064 containerd[1877]: time="2025-02-13T19:47:14.360997147Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:14.369156 containerd[1877]: time="2025-02-13T19:47:14.369084723Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=6855343" Feb 13 19:47:14.379764 containerd[1877]: time="2025-02-13T19:47:14.370983539Z" level=info msg="ImageCreate event name:\"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:14.398260 containerd[1877]: time="2025-02-13T19:47:14.398196345Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:14.401079 containerd[1877]: time="2025-02-13T19:47:14.401026550Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6855165\" in 1.880410409s" Feb 13 19:47:14.401751 containerd[1877]: time="2025-02-13T19:47:14.401288121Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\"" Feb 13 19:47:14.404577 containerd[1877]: time="2025-02-13T19:47:14.404539985Z" level=info msg="CreateContainer within sandbox \"c41487c001c0685ee25a0077a5e6dbc4586a54f232c04e500432faa04090c57b\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Feb 13 19:47:14.441855 containerd[1877]: time="2025-02-13T19:47:14.441798937Z" level=info msg="CreateContainer within sandbox \"c41487c001c0685ee25a0077a5e6dbc4586a54f232c04e500432faa04090c57b\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"e3029a61b996fc5ead9fb450e3b87d106b151aac51a3e683fe8070480604935f\"" Feb 13 19:47:14.442834 containerd[1877]: time="2025-02-13T19:47:14.442799014Z" level=info msg="StartContainer for \"e3029a61b996fc5ead9fb450e3b87d106b151aac51a3e683fe8070480604935f\"" Feb 13 19:47:14.483962 systemd[1]: Started cri-containerd-e3029a61b996fc5ead9fb450e3b87d106b151aac51a3e683fe8070480604935f.scope - libcontainer container e3029a61b996fc5ead9fb450e3b87d106b151aac51a3e683fe8070480604935f. Feb 13 19:47:14.523825 containerd[1877]: time="2025-02-13T19:47:14.522720862Z" level=info msg="StartContainer for \"e3029a61b996fc5ead9fb450e3b87d106b151aac51a3e683fe8070480604935f\" returns successfully" Feb 13 19:47:14.537006 systemd[1]: cri-containerd-e3029a61b996fc5ead9fb450e3b87d106b151aac51a3e683fe8070480604935f.scope: Deactivated successfully. 
Feb 13 19:47:14.627989 containerd[1877]: time="2025-02-13T19:47:14.627915653Z" level=info msg="shim disconnected" id=e3029a61b996fc5ead9fb450e3b87d106b151aac51a3e683fe8070480604935f namespace=k8s.io Feb 13 19:47:14.627989 containerd[1877]: time="2025-02-13T19:47:14.627980277Z" level=warning msg="cleaning up after shim disconnected" id=e3029a61b996fc5ead9fb450e3b87d106b151aac51a3e683fe8070480604935f namespace=k8s.io Feb 13 19:47:14.627989 containerd[1877]: time="2025-02-13T19:47:14.627992161Z" level=info msg="cleaning up dead shim" namespace=k8s.io Feb 13 19:47:14.650768 kubelet[2366]: E0213 19:47:14.650709 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:14.939303 containerd[1877]: time="2025-02-13T19:47:14.939188889Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Feb 13 19:47:15.131643 systemd[1]: run-containerd-runc-k8s.io-e3029a61b996fc5ead9fb450e3b87d106b151aac51a3e683fe8070480604935f-runc.yb4xZP.mount: Deactivated successfully. Feb 13 19:47:15.131793 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e3029a61b996fc5ead9fb450e3b87d106b151aac51a3e683fe8070480604935f-rootfs.mount: Deactivated successfully. Feb 13 19:47:15.651371 kubelet[2366]: E0213 19:47:15.651330 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:15.844593 kubelet[2366]: E0213 19:47:15.844516 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9th5q" podUID="b75d0e53-a7c0-48db-98b8-b75d74f4d4d5" Feb 13 19:47:16.082577 systemd[1]: systemd-hostnamed.service: Deactivated successfully. 
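The pod2daemon-flexvol image pulled above runs as the short-lived flexvol-driver container, which installs Calico's uds driver binary into the FlexVolume plugin directory and exits (hence the scope deactivation and shim cleanup); once the binary is in place, the init probe can return valid JSON instead of empty output. For orientation, a minimal FlexVolume-style driver only has to answer the kubelet's subcommand with a JSON status on stdout. The stub below follows the documented response shape; it is a hand-written illustration, not Calico's actual uds binary.

package main

import (
    "encoding/json"
    "fmt"
    "os"
)

// Minimal FlexVolume-style driver: the kubelet invokes the binary with a
// subcommand ("init", "mount", ...) and parses a JSON status object from
// stdout. This stub only answers "init"; everything else is reported as
// not supported.
func main() {
    if len(os.Args) > 1 && os.Args[1] == "init" {
        resp := map[string]interface{}{
            "status":       "Success",
            "capabilities": map[string]bool{"attach": false},
        }
        b, _ := json.Marshal(resp)
        fmt.Println(string(b))
        return
    }
    fmt.Println(`{"status": "Not supported"}`)
}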
Feb 13 19:47:16.652619 kubelet[2366]: E0213 19:47:16.652525 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:17.653321 kubelet[2366]: E0213 19:47:17.653282 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:17.844650 kubelet[2366]: E0213 19:47:17.844546 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9th5q" podUID="b75d0e53-a7c0-48db-98b8-b75d74f4d4d5" Feb 13 19:47:18.655380 kubelet[2366]: E0213 19:47:18.655328 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:19.656405 kubelet[2366]: E0213 19:47:19.656348 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:19.658753 containerd[1877]: time="2025-02-13T19:47:19.658679015Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:19.660446 containerd[1877]: time="2025-02-13T19:47:19.660056520Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=96154154" Feb 13 19:47:19.661777 containerd[1877]: time="2025-02-13T19:47:19.661439977Z" level=info msg="ImageCreate event name:\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:19.664594 containerd[1877]: time="2025-02-13T19:47:19.664372066Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:19.665220 containerd[1877]: time="2025-02-13T19:47:19.665181411Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"97647238\" in 4.725954199s" Feb 13 19:47:19.665301 containerd[1877]: time="2025-02-13T19:47:19.665226793Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\"" Feb 13 19:47:19.668046 containerd[1877]: time="2025-02-13T19:47:19.667926580Z" level=info msg="CreateContainer within sandbox \"c41487c001c0685ee25a0077a5e6dbc4586a54f232c04e500432faa04090c57b\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Feb 13 19:47:19.687514 containerd[1877]: time="2025-02-13T19:47:19.687381384Z" level=info msg="CreateContainer within sandbox \"c41487c001c0685ee25a0077a5e6dbc4586a54f232c04e500432faa04090c57b\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"d6bf749c7fea0950a73d6fa7b857b7098c70f18204004b294f211a6a3c00b226\"" Feb 13 19:47:19.688460 containerd[1877]: time="2025-02-13T19:47:19.688425736Z" level=info msg="StartContainer for \"d6bf749c7fea0950a73d6fa7b857b7098c70f18204004b294f211a6a3c00b226\"" Feb 13 19:47:19.772966 systemd[1]: Started 
cri-containerd-d6bf749c7fea0950a73d6fa7b857b7098c70f18204004b294f211a6a3c00b226.scope - libcontainer container d6bf749c7fea0950a73d6fa7b857b7098c70f18204004b294f211a6a3c00b226. Feb 13 19:47:19.825390 containerd[1877]: time="2025-02-13T19:47:19.825337173Z" level=info msg="StartContainer for \"d6bf749c7fea0950a73d6fa7b857b7098c70f18204004b294f211a6a3c00b226\" returns successfully" Feb 13 19:47:19.844707 kubelet[2366]: E0213 19:47:19.844386 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9th5q" podUID="b75d0e53-a7c0-48db-98b8-b75d74f4d4d5" Feb 13 19:47:20.514750 containerd[1877]: time="2025-02-13T19:47:20.514668143Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Feb 13 19:47:20.517019 systemd[1]: cri-containerd-d6bf749c7fea0950a73d6fa7b857b7098c70f18204004b294f211a6a3c00b226.scope: Deactivated successfully. Feb 13 19:47:20.541140 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d6bf749c7fea0950a73d6fa7b857b7098c70f18204004b294f211a6a3c00b226-rootfs.mount: Deactivated successfully. Feb 13 19:47:20.586941 kubelet[2366]: I0213 19:47:20.586745 2366 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Feb 13 19:47:20.656859 kubelet[2366]: E0213 19:47:20.656815 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:21.012561 containerd[1877]: time="2025-02-13T19:47:21.012319540Z" level=info msg="shim disconnected" id=d6bf749c7fea0950a73d6fa7b857b7098c70f18204004b294f211a6a3c00b226 namespace=k8s.io Feb 13 19:47:21.012561 containerd[1877]: time="2025-02-13T19:47:21.012376445Z" level=warning msg="cleaning up after shim disconnected" id=d6bf749c7fea0950a73d6fa7b857b7098c70f18204004b294f211a6a3c00b226 namespace=k8s.io Feb 13 19:47:21.012561 containerd[1877]: time="2025-02-13T19:47:21.012386740Z" level=info msg="cleaning up dead shim" namespace=k8s.io Feb 13 19:47:21.657456 kubelet[2366]: E0213 19:47:21.657401 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:21.849569 systemd[1]: Created slice kubepods-besteffort-podb75d0e53_a7c0_48db_98b8_b75d74f4d4d5.slice - libcontainer container kubepods-besteffort-podb75d0e53_a7c0_48db_98b8_b75d74f4d4d5.slice. 
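The "cni config load failed: no network config found in /etc/cni/net.d" error above is the same gap seen from containerd's side: it watches that directory and reloads on write events (here the calico-kubeconfig file landing), but the runtime stays NetworkReady=false until the install-cni container writes an actual network config list. The sketch below only shows the general shape of such a conflist expressed as a Go structure; the names and values are placeholders, not the config that Calico's install-cni eventually generates.

package main

import (
    "encoding/json"
    "fmt"
)

// General shape of a CNI network config list as a runtime expects to find it
// in /etc/cni/net.d. All values below are placeholders for illustration.
type netConfList struct {
    CNIVersion string                   `json:"cniVersion"`
    Name       string                   `json:"name"`
    Plugins    []map[string]interface{} `json:"plugins"`
}

func main() {
    conf := netConfList{
        CNIVersion: "0.3.1",
        Name:       "k8s-pod-network",
        Plugins: []map[string]interface{}{
            {"type": "calico", "log_level": "info"},
            {"type": "portmap", "capabilities": map[string]bool{"portMappings": true}},
        },
    }
    b, _ := json.MarshalIndent(conf, "", "  ")
    fmt.Println(string(b))
}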
Feb 13 19:47:21.853384 containerd[1877]: time="2025-02-13T19:47:21.853343304Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:0,}" Feb 13 19:47:21.943360 containerd[1877]: time="2025-02-13T19:47:21.942695003Z" level=error msg="Failed to destroy network for sandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:21.945868 containerd[1877]: time="2025-02-13T19:47:21.943877286Z" level=error msg="encountered an error cleaning up failed sandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:21.945868 containerd[1877]: time="2025-02-13T19:47:21.943974529Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:21.945910 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081-shm.mount: Deactivated successfully. 
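Every sandbox failure from here on repeats the same root cause: the Calico CNI plugin resolves the node name from /var/lib/calico/nodename, a file the calico/node container writes after it starts, and that container is not running yet. The fragment below just reproduces the failing check behind the error string; the advisory text is paraphrased from the log, not quoted from Calico's source.

package main

import (
    "fmt"
    "os"
)

// Reproduce the failing precondition behind the RunPodSandbox errors: the
// nodename file only exists once calico/node has started and mounted
// /var/lib/calico/ into place.
func main() {
    const nodenameFile = "/var/lib/calico/nodename"
    if _, err := os.Stat(nodenameFile); err != nil {
        fmt.Printf("%v: check that the calico/node container is running and has mounted /var/lib/calico/\n", err)
        return
    }
    fmt.Println("nodename file present; the CNI add step can proceed")
}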
Feb 13 19:47:21.946935 kubelet[2366]: E0213 19:47:21.946544 2366 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:21.946935 kubelet[2366]: E0213 19:47:21.946623 2366 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:21.946935 kubelet[2366]: E0213 19:47:21.946653 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:21.947164 kubelet[2366]: E0213 19:47:21.946706 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9th5q" podUID="b75d0e53-a7c0-48db-98b8-b75d74f4d4d5" Feb 13 19:47:21.973948 containerd[1877]: time="2025-02-13T19:47:21.973909066Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Feb 13 19:47:21.974562 kubelet[2366]: I0213 19:47:21.974519 2366 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081" Feb 13 19:47:21.975246 containerd[1877]: time="2025-02-13T19:47:21.975126983Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\"" Feb 13 19:47:21.975404 containerd[1877]: time="2025-02-13T19:47:21.975370081Z" level=info msg="Ensure that sandbox 6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081 in task-service has been cleanup successfully" Feb 13 19:47:21.975972 containerd[1877]: time="2025-02-13T19:47:21.975932578Z" level=info msg="TearDown network for sandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" successfully" Feb 13 19:47:21.975972 containerd[1877]: time="2025-02-13T19:47:21.975959599Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" returns successfully" Feb 13 19:47:21.977875 systemd[1]: run-netns-cni\x2df528cd0b\x2d5f88\x2d1d59\x2de9ee\x2dd7d8550e86ae.mount: Deactivated successfully. 
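What follows in the log is the standard recovery loop: the half-created sandbox is stopped and torn down (including its per-sandbox netns mount, the run-netns-cni-*.mount units), and the pod worker retries RunPodSandbox with the Attempt counter incremented (0, 1, 2, ...). The loop below is only a schematic of that pattern, not kubelet or containerd code.

package main

import "fmt"

// Schematic of the create/teardown/retry cycle visible in the log. The error
// string mimics the CNI failure; the real components add backoff between
// attempts.
func runPodSandbox(attempt int) error {
    // Stand-in for the CNI "add" step, which keeps failing while
    // /var/lib/calico/nodename does not exist.
    return fmt.Errorf("attempt %d: plugin type=%q failed (add): stat /var/lib/calico/nodename: no such file or directory", attempt, "calico")
}

func teardownSandbox(attempt int) {
    fmt.Printf("attempt %d: StopPodSandbox + TearDown of the partial sandbox and its netns mount\n", attempt)
}

func main() {
    for attempt := 0; attempt < 3; attempt++ {
        if err := runPodSandbox(attempt); err != nil {
            fmt.Println("RunPodSandbox failed:", err)
            teardownSandbox(attempt)
            continue
        }
        fmt.Println("sandbox created")
        return
    }
    fmt.Println("still failing; the kubelet keeps retrying with backoff")
}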
Feb 13 19:47:21.978434 containerd[1877]: time="2025-02-13T19:47:21.977891399Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:1,}" Feb 13 19:47:22.074096 containerd[1877]: time="2025-02-13T19:47:22.074043138Z" level=error msg="Failed to destroy network for sandbox \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:22.074565 containerd[1877]: time="2025-02-13T19:47:22.074433555Z" level=error msg="encountered an error cleaning up failed sandbox \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:22.074565 containerd[1877]: time="2025-02-13T19:47:22.074504818Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:22.075290 kubelet[2366]: E0213 19:47:22.074788 2366 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:22.075290 kubelet[2366]: E0213 19:47:22.074843 2366 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:22.075290 kubelet[2366]: E0213 19:47:22.074864 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:22.075431 kubelet[2366]: E0213 19:47:22.074912 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9th5q" podUID="b75d0e53-a7c0-48db-98b8-b75d74f4d4d5" Feb 13 19:47:22.183875 kubelet[2366]: I0213 19:47:22.183756 2366 topology_manager.go:215] "Topology Admit Handler" podUID="230ad7c3-5282-4fe6-ba51-061a5c33f268" podNamespace="default" podName="nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:22.190939 systemd[1]: Created slice kubepods-besteffort-pod230ad7c3_5282_4fe6_ba51_061a5c33f268.slice - libcontainer container kubepods-besteffort-pod230ad7c3_5282_4fe6_ba51_061a5c33f268.slice. Feb 13 19:47:22.250451 kubelet[2366]: I0213 19:47:22.250310 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm45k\" (UniqueName: \"kubernetes.io/projected/230ad7c3-5282-4fe6-ba51-061a5c33f268-kube-api-access-dm45k\") pod \"nginx-deployment-85f456d6dd-jk2lg\" (UID: \"230ad7c3-5282-4fe6-ba51-061a5c33f268\") " pod="default/nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:22.496099 containerd[1877]: time="2025-02-13T19:47:22.495978337Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:0,}" Feb 13 19:47:22.658222 kubelet[2366]: E0213 19:47:22.658150 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:22.671236 containerd[1877]: time="2025-02-13T19:47:22.671187257Z" level=error msg="Failed to destroy network for sandbox \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:22.671837 containerd[1877]: time="2025-02-13T19:47:22.671796013Z" level=error msg="encountered an error cleaning up failed sandbox \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:22.671963 containerd[1877]: time="2025-02-13T19:47:22.671874008Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:22.672171 kubelet[2366]: E0213 19:47:22.672107 2366 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:22.672286 kubelet[2366]: E0213 19:47:22.672174 2366 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:22.672286 kubelet[2366]: E0213 19:47:22.672206 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:22.672286 kubelet[2366]: E0213 19:47:22.672261 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-85f456d6dd-jk2lg_default(230ad7c3-5282-4fe6-ba51-061a5c33f268)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-85f456d6dd-jk2lg_default(230ad7c3-5282-4fe6-ba51-061a5c33f268)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-85f456d6dd-jk2lg" podUID="230ad7c3-5282-4fe6-ba51-061a5c33f268" Feb 13 19:47:22.877501 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5-shm.mount: Deactivated successfully. Feb 13 19:47:22.976980 kubelet[2366]: I0213 19:47:22.976861 2366 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579" Feb 13 19:47:22.981952 containerd[1877]: time="2025-02-13T19:47:22.981842352Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\"" Feb 13 19:47:22.982211 containerd[1877]: time="2025-02-13T19:47:22.982184139Z" level=info msg="Ensure that sandbox ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579 in task-service has been cleanup successfully" Feb 13 19:47:22.984284 containerd[1877]: time="2025-02-13T19:47:22.982821392Z" level=info msg="TearDown network for sandbox \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" successfully" Feb 13 19:47:22.984284 containerd[1877]: time="2025-02-13T19:47:22.982846555Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" returns successfully" Feb 13 19:47:22.984284 containerd[1877]: time="2025-02-13T19:47:22.983961959Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:1,}" Feb 13 19:47:22.996217 systemd[1]: run-netns-cni\x2d47c5d7df\x2d720f\x2dd2d8\x2d6c13\x2d46d77e158d87.mount: Deactivated successfully. 
Feb 13 19:47:22.996760 kubelet[2366]: I0213 19:47:22.995763 2366 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5" Feb 13 19:47:22.999262 containerd[1877]: time="2025-02-13T19:47:22.997982728Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\"" Feb 13 19:47:22.999508 containerd[1877]: time="2025-02-13T19:47:22.999482705Z" level=info msg="Ensure that sandbox 7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5 in task-service has been cleanup successfully" Feb 13 19:47:23.002836 containerd[1877]: time="2025-02-13T19:47:23.002793494Z" level=info msg="TearDown network for sandbox \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" successfully" Feb 13 19:47:23.002989 containerd[1877]: time="2025-02-13T19:47:23.002833931Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" returns successfully" Feb 13 19:47:23.009430 containerd[1877]: time="2025-02-13T19:47:23.009206144Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\"" Feb 13 19:47:23.009430 containerd[1877]: time="2025-02-13T19:47:23.009329353Z" level=info msg="TearDown network for sandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" successfully" Feb 13 19:47:23.009430 containerd[1877]: time="2025-02-13T19:47:23.009345339Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" returns successfully" Feb 13 19:47:23.010462 containerd[1877]: time="2025-02-13T19:47:23.010428593Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:2,}" Feb 13 19:47:23.011130 systemd[1]: run-netns-cni\x2dd49f1e47\x2d9dec\x2dbf5d\x2db027\x2d214c1a4c6a96.mount: Deactivated successfully. 
Feb 13 19:47:23.211948 containerd[1877]: time="2025-02-13T19:47:23.211896019Z" level=error msg="Failed to destroy network for sandbox \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:23.213090 containerd[1877]: time="2025-02-13T19:47:23.213019357Z" level=error msg="encountered an error cleaning up failed sandbox \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:23.213325 containerd[1877]: time="2025-02-13T19:47:23.213104924Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:1,} failed, error" error="failed to setup network for sandbox \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:23.213823 kubelet[2366]: E0213 19:47:23.213684 2366 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:23.213823 kubelet[2366]: E0213 19:47:23.213791 2366 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:23.213967 kubelet[2366]: E0213 19:47:23.213835 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:23.213967 kubelet[2366]: E0213 19:47:23.213891 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-85f456d6dd-jk2lg_default(230ad7c3-5282-4fe6-ba51-061a5c33f268)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-85f456d6dd-jk2lg_default(230ad7c3-5282-4fe6-ba51-061a5c33f268)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="default/nginx-deployment-85f456d6dd-jk2lg" podUID="230ad7c3-5282-4fe6-ba51-061a5c33f268" Feb 13 19:47:23.217632 containerd[1877]: time="2025-02-13T19:47:23.217581868Z" level=error msg="Failed to destroy network for sandbox \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:23.217953 containerd[1877]: time="2025-02-13T19:47:23.217924638Z" level=error msg="encountered an error cleaning up failed sandbox \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:23.218067 containerd[1877]: time="2025-02-13T19:47:23.217994211Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:23.218407 kubelet[2366]: E0213 19:47:23.218235 2366 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:23.218407 kubelet[2366]: E0213 19:47:23.218283 2366 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:23.218407 kubelet[2366]: E0213 19:47:23.218302 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:23.218543 kubelet[2366]: E0213 19:47:23.218349 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9th5q" podUID="b75d0e53-a7c0-48db-98b8-b75d74f4d4d5" Feb 13 19:47:23.658949 kubelet[2366]: E0213 19:47:23.658882 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:23.886005 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6-shm.mount: Deactivated successfully. Feb 13 19:47:24.000711 kubelet[2366]: I0213 19:47:23.999719 2366 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6" Feb 13 19:47:24.000979 containerd[1877]: time="2025-02-13T19:47:24.000885205Z" level=info msg="StopPodSandbox for \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\"" Feb 13 19:47:24.001179 containerd[1877]: time="2025-02-13T19:47:24.001127898Z" level=info msg="Ensure that sandbox 045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6 in task-service has been cleanup successfully" Feb 13 19:47:24.005764 containerd[1877]: time="2025-02-13T19:47:24.003820149Z" level=info msg="TearDown network for sandbox \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" successfully" Feb 13 19:47:24.005764 containerd[1877]: time="2025-02-13T19:47:24.003859679Z" level=info msg="StopPodSandbox for \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" returns successfully" Feb 13 19:47:24.006521 containerd[1877]: time="2025-02-13T19:47:24.006477863Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\"" Feb 13 19:47:24.006620 containerd[1877]: time="2025-02-13T19:47:24.006600575Z" level=info msg="TearDown network for sandbox \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" successfully" Feb 13 19:47:24.006667 containerd[1877]: time="2025-02-13T19:47:24.006622516Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" returns successfully" Feb 13 19:47:24.009742 containerd[1877]: time="2025-02-13T19:47:24.007790602Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:2,}" Feb 13 19:47:24.009292 systemd[1]: run-netns-cni\x2d2fd71269\x2d14d2\x2d2dbc\x2d1071\x2d738b86d25fa3.mount: Deactivated successfully. 
Feb 13 19:47:24.029868 kubelet[2366]: I0213 19:47:24.028397 2366 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43" Feb 13 19:47:24.029997 containerd[1877]: time="2025-02-13T19:47:24.029639133Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\"" Feb 13 19:47:24.030276 containerd[1877]: time="2025-02-13T19:47:24.030254402Z" level=info msg="Ensure that sandbox bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43 in task-service has been cleanup successfully" Feb 13 19:47:24.035939 containerd[1877]: time="2025-02-13T19:47:24.035883258Z" level=info msg="TearDown network for sandbox \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" successfully" Feb 13 19:47:24.036103 containerd[1877]: time="2025-02-13T19:47:24.036087209Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" returns successfully" Feb 13 19:47:24.039768 containerd[1877]: time="2025-02-13T19:47:24.038267868Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\"" Feb 13 19:47:24.038939 systemd[1]: run-netns-cni\x2d6617cead\x2d33eb\x2df60b\x2d640a\x2d388a56197777.mount: Deactivated successfully. Feb 13 19:47:24.044316 containerd[1877]: time="2025-02-13T19:47:24.043061488Z" level=info msg="TearDown network for sandbox \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" successfully" Feb 13 19:47:24.044316 containerd[1877]: time="2025-02-13T19:47:24.043704017Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" returns successfully" Feb 13 19:47:24.058046 containerd[1877]: time="2025-02-13T19:47:24.055487566Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\"" Feb 13 19:47:24.058046 containerd[1877]: time="2025-02-13T19:47:24.056365409Z" level=info msg="TearDown network for sandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" successfully" Feb 13 19:47:24.058046 containerd[1877]: time="2025-02-13T19:47:24.056391080Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" returns successfully" Feb 13 19:47:24.062186 containerd[1877]: time="2025-02-13T19:47:24.061055573Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:3,}" Feb 13 19:47:24.218229 containerd[1877]: time="2025-02-13T19:47:24.218161365Z" level=error msg="Failed to destroy network for sandbox \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:24.221913 containerd[1877]: time="2025-02-13T19:47:24.221867428Z" level=error msg="encountered an error cleaning up failed sandbox \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:24.222138 containerd[1877]: time="2025-02-13T19:47:24.222092780Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:2,} failed, error" error="failed to setup network for sandbox \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:24.222752 kubelet[2366]: E0213 19:47:24.222530 2366 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:24.222752 kubelet[2366]: E0213 19:47:24.222595 2366 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:24.222752 kubelet[2366]: E0213 19:47:24.222621 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:24.222941 kubelet[2366]: E0213 19:47:24.222672 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-85f456d6dd-jk2lg_default(230ad7c3-5282-4fe6-ba51-061a5c33f268)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-85f456d6dd-jk2lg_default(230ad7c3-5282-4fe6-ba51-061a5c33f268)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-85f456d6dd-jk2lg" podUID="230ad7c3-5282-4fe6-ba51-061a5c33f268" Feb 13 19:47:24.270170 containerd[1877]: time="2025-02-13T19:47:24.270018494Z" level=error msg="Failed to destroy network for sandbox \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:24.270423 containerd[1877]: time="2025-02-13T19:47:24.270382632Z" level=error msg="encountered an error cleaning up failed sandbox \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:24.270500 
containerd[1877]: time="2025-02-13T19:47:24.270473110Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:24.271853 kubelet[2366]: E0213 19:47:24.270821 2366 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:24.271853 kubelet[2366]: E0213 19:47:24.270886 2366 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:24.271853 kubelet[2366]: E0213 19:47:24.270914 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:24.272060 kubelet[2366]: E0213 19:47:24.270967 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9th5q" podUID="b75d0e53-a7c0-48db-98b8-b75d74f4d4d5" Feb 13 19:47:24.660458 kubelet[2366]: E0213 19:47:24.660176 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:24.879478 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74-shm.mount: Deactivated successfully. 
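Every failure above is the same condition surfacing through different sandboxes: during both add and delete the Calico CNI plugin stats /var/lib/calico/nodename, and the file is absent because the calico/node container is not yet running (or has not mounted /var/lib/calico/). Below is a minimal Go sketch, not Calico's actual code, that reproduces the check described by the error text; the helper name and output wording are mine.

package main

import (
	"errors"
	"fmt"
	"io/fs"
	"os"
)

// nodenamePath is taken verbatim from the errors above; the file is normally
// written by the calico/node container once it starts and mounts /var/lib/calico/.
const nodenamePath = "/var/lib/calico/nodename"

// checkNodename is an illustrative helper, not Calico's code: it performs the
// same stat that keeps failing in the log and maps the result to the remedy
// the error message itself suggests.
func checkNodename() error {
	if _, err := os.Stat(nodenamePath); err != nil {
		if errors.Is(err, fs.ErrNotExist) {
			return fmt.Errorf("%s is missing: check that the calico/node container is running and has mounted /var/lib/calico/", nodenamePath)
		}
		return fmt.Errorf("stat %s: %w", nodenamePath, err)
	}
	return nil
}

func main() {
	if err := checkNodename(); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("nodename file is present; the CNI add/delete calls above should stop failing on this check")
}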
Feb 13 19:47:25.036604 kubelet[2366]: I0213 19:47:25.036199 2366 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74" Feb 13 19:47:25.038629 containerd[1877]: time="2025-02-13T19:47:25.038125577Z" level=info msg="StopPodSandbox for \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\"" Feb 13 19:47:25.038629 containerd[1877]: time="2025-02-13T19:47:25.038492770Z" level=info msg="Ensure that sandbox e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74 in task-service has been cleanup successfully" Feb 13 19:47:25.039208 containerd[1877]: time="2025-02-13T19:47:25.039184411Z" level=info msg="TearDown network for sandbox \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\" successfully" Feb 13 19:47:25.039310 containerd[1877]: time="2025-02-13T19:47:25.039294296Z" level=info msg="StopPodSandbox for \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\" returns successfully" Feb 13 19:47:25.042006 systemd[1]: run-netns-cni\x2d82616209\x2d50e6\x2d94f5\x2ddaeb\x2d2d6ead657b20.mount: Deactivated successfully. Feb 13 19:47:25.044590 containerd[1877]: time="2025-02-13T19:47:25.044186365Z" level=info msg="StopPodSandbox for \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\"" Feb 13 19:47:25.044590 containerd[1877]: time="2025-02-13T19:47:25.044300226Z" level=info msg="TearDown network for sandbox \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" successfully" Feb 13 19:47:25.044590 containerd[1877]: time="2025-02-13T19:47:25.044315245Z" level=info msg="StopPodSandbox for \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" returns successfully" Feb 13 19:47:25.047046 containerd[1877]: time="2025-02-13T19:47:25.047017810Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\"" Feb 13 19:47:25.047871 containerd[1877]: time="2025-02-13T19:47:25.047846286Z" level=info msg="TearDown network for sandbox \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" successfully" Feb 13 19:47:25.047994 containerd[1877]: time="2025-02-13T19:47:25.047976928Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" returns successfully" Feb 13 19:47:25.049184 containerd[1877]: time="2025-02-13T19:47:25.049157017Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:3,}" Feb 13 19:47:25.050784 kubelet[2366]: I0213 19:47:25.050635 2366 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083" Feb 13 19:47:25.051989 containerd[1877]: time="2025-02-13T19:47:25.051960741Z" level=info msg="StopPodSandbox for \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\"" Feb 13 19:47:25.052704 containerd[1877]: time="2025-02-13T19:47:25.052502671Z" level=info msg="Ensure that sandbox fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083 in task-service has been cleanup successfully" Feb 13 19:47:25.053945 containerd[1877]: time="2025-02-13T19:47:25.053921726Z" level=info msg="TearDown network for sandbox \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" successfully" Feb 13 19:47:25.054157 containerd[1877]: time="2025-02-13T19:47:25.054071400Z" level=info msg="StopPodSandbox for 
\"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" returns successfully" Feb 13 19:47:25.055096 containerd[1877]: time="2025-02-13T19:47:25.055058319Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\"" Feb 13 19:47:25.055714 containerd[1877]: time="2025-02-13T19:47:25.055300905Z" level=info msg="TearDown network for sandbox \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" successfully" Feb 13 19:47:25.055714 containerd[1877]: time="2025-02-13T19:47:25.055320563Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" returns successfully" Feb 13 19:47:25.057685 containerd[1877]: time="2025-02-13T19:47:25.057642350Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\"" Feb 13 19:47:25.057877 containerd[1877]: time="2025-02-13T19:47:25.057859357Z" level=info msg="TearDown network for sandbox \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" successfully" Feb 13 19:47:25.057995 containerd[1877]: time="2025-02-13T19:47:25.057975658Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" returns successfully" Feb 13 19:47:25.058353 systemd[1]: run-netns-cni\x2d1569c079\x2d8e2d\x2d989c\x2d5c51\x2def1a4391df6b.mount: Deactivated successfully. Feb 13 19:47:25.059248 containerd[1877]: time="2025-02-13T19:47:25.059224957Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\"" Feb 13 19:47:25.059938 containerd[1877]: time="2025-02-13T19:47:25.059853352Z" level=info msg="TearDown network for sandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" successfully" Feb 13 19:47:25.059938 containerd[1877]: time="2025-02-13T19:47:25.059874357Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" returns successfully" Feb 13 19:47:25.062012 containerd[1877]: time="2025-02-13T19:47:25.061906704Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:4,}" Feb 13 19:47:25.272449 containerd[1877]: time="2025-02-13T19:47:25.272334952Z" level=error msg="Failed to destroy network for sandbox \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:25.273295 containerd[1877]: time="2025-02-13T19:47:25.272510176Z" level=error msg="Failed to destroy network for sandbox \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:25.273295 containerd[1877]: time="2025-02-13T19:47:25.272986382Z" level=error msg="encountered an error cleaning up failed sandbox \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:25.273295 containerd[1877]: time="2025-02-13T19:47:25.273091881Z" 
level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:25.273438 kubelet[2366]: E0213 19:47:25.273380 2366 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:25.273497 kubelet[2366]: E0213 19:47:25.273447 2366 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:25.273497 kubelet[2366]: E0213 19:47:25.273474 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:25.273589 kubelet[2366]: E0213 19:47:25.273528 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9th5q" podUID="b75d0e53-a7c0-48db-98b8-b75d74f4d4d5" Feb 13 19:47:25.274262 containerd[1877]: time="2025-02-13T19:47:25.273958364Z" level=error msg="encountered an error cleaning up failed sandbox \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:25.274262 containerd[1877]: time="2025-02-13T19:47:25.274164914Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:3,} failed, error" error="failed to setup network for sandbox \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:25.274540 kubelet[2366]: E0213 19:47:25.274346 2366 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:25.274540 kubelet[2366]: E0213 19:47:25.274394 2366 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:25.274540 kubelet[2366]: E0213 19:47:25.274417 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:25.274684 kubelet[2366]: E0213 19:47:25.274464 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-85f456d6dd-jk2lg_default(230ad7c3-5282-4fe6-ba51-061a5c33f268)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-85f456d6dd-jk2lg_default(230ad7c3-5282-4fe6-ba51-061a5c33f268)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-85f456d6dd-jk2lg" podUID="230ad7c3-5282-4fe6-ba51-061a5c33f268" Feb 13 19:47:25.661207 kubelet[2366]: E0213 19:47:25.661137 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:25.876101 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2-shm.mount: Deactivated successfully. 
Feb 13 19:47:26.087823 kubelet[2366]: I0213 19:47:26.087522 2366 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84" Feb 13 19:47:26.094269 containerd[1877]: time="2025-02-13T19:47:26.093841151Z" level=info msg="StopPodSandbox for \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\"" Feb 13 19:47:26.094269 containerd[1877]: time="2025-02-13T19:47:26.094121700Z" level=info msg="Ensure that sandbox 32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84 in task-service has been cleanup successfully" Feb 13 19:47:26.094889 containerd[1877]: time="2025-02-13T19:47:26.094858889Z" level=info msg="TearDown network for sandbox \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\" successfully" Feb 13 19:47:26.095022 containerd[1877]: time="2025-02-13T19:47:26.095004258Z" level=info msg="StopPodSandbox for \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\" returns successfully" Feb 13 19:47:26.099360 systemd[1]: run-netns-cni\x2dd9a71f1a\x2d541a\x2dbbe4\x2d850e\x2d91aa8018feba.mount: Deactivated successfully. Feb 13 19:47:26.100008 containerd[1877]: time="2025-02-13T19:47:26.099378770Z" level=info msg="StopPodSandbox for \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\"" Feb 13 19:47:26.100008 containerd[1877]: time="2025-02-13T19:47:26.099481885Z" level=info msg="TearDown network for sandbox \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" successfully" Feb 13 19:47:26.100008 containerd[1877]: time="2025-02-13T19:47:26.099496911Z" level=info msg="StopPodSandbox for \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" returns successfully" Feb 13 19:47:26.102860 containerd[1877]: time="2025-02-13T19:47:26.101457046Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\"" Feb 13 19:47:26.102860 containerd[1877]: time="2025-02-13T19:47:26.101561707Z" level=info msg="TearDown network for sandbox \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" successfully" Feb 13 19:47:26.102860 containerd[1877]: time="2025-02-13T19:47:26.101577499Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" returns successfully" Feb 13 19:47:26.103193 kubelet[2366]: I0213 19:47:26.103066 2366 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2" Feb 13 19:47:26.104482 containerd[1877]: time="2025-02-13T19:47:26.104450170Z" level=info msg="StopPodSandbox for \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\"" Feb 13 19:47:26.105039 containerd[1877]: time="2025-02-13T19:47:26.105009485Z" level=info msg="Ensure that sandbox a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2 in task-service has been cleanup successfully" Feb 13 19:47:26.105465 containerd[1877]: time="2025-02-13T19:47:26.105371583Z" level=info msg="TearDown network for sandbox \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\" successfully" Feb 13 19:47:26.105465 containerd[1877]: time="2025-02-13T19:47:26.105394592Z" level=info msg="StopPodSandbox for \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\" returns successfully" Feb 13 19:47:26.105465 containerd[1877]: time="2025-02-13T19:47:26.104456096Z" level=info msg="StopPodSandbox for 
\"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\"" Feb 13 19:47:26.105628 containerd[1877]: time="2025-02-13T19:47:26.105518422Z" level=info msg="TearDown network for sandbox \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" successfully" Feb 13 19:47:26.105628 containerd[1877]: time="2025-02-13T19:47:26.105532679Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" returns successfully" Feb 13 19:47:26.106128 containerd[1877]: time="2025-02-13T19:47:26.105892739Z" level=info msg="StopPodSandbox for \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\"" Feb 13 19:47:26.106128 containerd[1877]: time="2025-02-13T19:47:26.105990726Z" level=info msg="TearDown network for sandbox \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\" successfully" Feb 13 19:47:26.106128 containerd[1877]: time="2025-02-13T19:47:26.106005957Z" level=info msg="StopPodSandbox for \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\" returns successfully" Feb 13 19:47:26.106128 containerd[1877]: time="2025-02-13T19:47:26.106070592Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\"" Feb 13 19:47:26.106696 containerd[1877]: time="2025-02-13T19:47:26.106142825Z" level=info msg="TearDown network for sandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" successfully" Feb 13 19:47:26.106696 containerd[1877]: time="2025-02-13T19:47:26.106155465Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" returns successfully" Feb 13 19:47:26.117260 containerd[1877]: time="2025-02-13T19:47:26.117211008Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:5,}" Feb 13 19:47:26.120673 containerd[1877]: time="2025-02-13T19:47:26.119108959Z" level=info msg="StopPodSandbox for \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\"" Feb 13 19:47:26.120673 containerd[1877]: time="2025-02-13T19:47:26.119260532Z" level=info msg="TearDown network for sandbox \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" successfully" Feb 13 19:47:26.120673 containerd[1877]: time="2025-02-13T19:47:26.119279294Z" level=info msg="StopPodSandbox for \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" returns successfully" Feb 13 19:47:26.121750 containerd[1877]: time="2025-02-13T19:47:26.121081112Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\"" Feb 13 19:47:26.121750 containerd[1877]: time="2025-02-13T19:47:26.121191969Z" level=info msg="TearDown network for sandbox \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" successfully" Feb 13 19:47:26.121750 containerd[1877]: time="2025-02-13T19:47:26.121207880Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" returns successfully" Feb 13 19:47:26.121479 systemd[1]: run-netns-cni\x2d4a290898\x2d4878\x2d8d0a\x2d3718\x2dc2d8f723954b.mount: Deactivated successfully. 
Feb 13 19:47:26.122945 containerd[1877]: time="2025-02-13T19:47:26.122449557Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:4,}" Feb 13 19:47:26.334906 containerd[1877]: time="2025-02-13T19:47:26.334669876Z" level=error msg="Failed to destroy network for sandbox \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:26.336680 containerd[1877]: time="2025-02-13T19:47:26.336440653Z" level=error msg="encountered an error cleaning up failed sandbox \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:26.336680 containerd[1877]: time="2025-02-13T19:47:26.336543548Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:4,} failed, error" error="failed to setup network for sandbox \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:26.337370 kubelet[2366]: E0213 19:47:26.337185 2366 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:26.337370 kubelet[2366]: E0213 19:47:26.337268 2366 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:26.337370 kubelet[2366]: E0213 19:47:26.337297 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:26.337603 kubelet[2366]: E0213 19:47:26.337456 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-85f456d6dd-jk2lg_default(230ad7c3-5282-4fe6-ba51-061a5c33f268)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-85f456d6dd-jk2lg_default(230ad7c3-5282-4fe6-ba51-061a5c33f268)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-85f456d6dd-jk2lg" podUID="230ad7c3-5282-4fe6-ba51-061a5c33f268" Feb 13 19:47:26.347916 containerd[1877]: time="2025-02-13T19:47:26.347202995Z" level=error msg="Failed to destroy network for sandbox \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:26.350345 containerd[1877]: time="2025-02-13T19:47:26.350018464Z" level=error msg="encountered an error cleaning up failed sandbox \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:26.350345 containerd[1877]: time="2025-02-13T19:47:26.350146572Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:26.350923 kubelet[2366]: E0213 19:47:26.350702 2366 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:26.350923 kubelet[2366]: E0213 19:47:26.350780 2366 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:26.350923 kubelet[2366]: E0213 19:47:26.350810 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:26.351141 kubelet[2366]: E0213 19:47:26.350859 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\\\": rpc error: 
code = Unknown desc = failed to setup network for sandbox \\\"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9th5q" podUID="b75d0e53-a7c0-48db-98b8-b75d74f4d4d5" Feb 13 19:47:26.637035 kubelet[2366]: E0213 19:47:26.636890 2366 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:26.661504 kubelet[2366]: E0213 19:47:26.661404 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:26.875610 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5-shm.mount: Deactivated successfully. Feb 13 19:47:26.875774 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6-shm.mount: Deactivated successfully. Feb 13 19:47:27.118831 kubelet[2366]: I0213 19:47:27.118139 2366 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6" Feb 13 19:47:27.120952 containerd[1877]: time="2025-02-13T19:47:27.120917788Z" level=info msg="StopPodSandbox for \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\"" Feb 13 19:47:27.123075 containerd[1877]: time="2025-02-13T19:47:27.123035142Z" level=info msg="Ensure that sandbox bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6 in task-service has been cleanup successfully" Feb 13 19:47:27.124937 containerd[1877]: time="2025-02-13T19:47:27.124908701Z" level=info msg="TearDown network for sandbox \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\" successfully" Feb 13 19:47:27.125276 containerd[1877]: time="2025-02-13T19:47:27.125062811Z" level=info msg="StopPodSandbox for \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\" returns successfully" Feb 13 19:47:27.128750 containerd[1877]: time="2025-02-13T19:47:27.128092396Z" level=info msg="StopPodSandbox for \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\"" Feb 13 19:47:27.131506 systemd[1]: run-netns-cni\x2d18914f95\x2dac15\x2dd4cc\x2d8245\x2d152368ab5336.mount: Deactivated successfully. 
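The Attempt counter in the PodSandboxMetadata above climbs from 2 through 6 because the kubelet tears down each failed sandbox and retries the pod on a backoff. The sketch below is a rough illustration of that shape, not the kubelet's implementation: a stand-in RunPodSandbox call that keeps failing the way the calico plugin does, retried with a capped exponential backoff whose durations are shortened and invented for the example.

package main

import (
	"fmt"
	"time"
)

// runPodSandbox stands in for the CRI RunPodSandbox call seen in the log; in
// this sketch it always fails, as the calico plugin does while
// /var/lib/calico/nodename is missing.
func runPodSandbox(attempt uint32) error {
	return fmt.Errorf("attempt %d: plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory", attempt)
}

func main() {
	// Capped exponential backoff; only the overall shape (tear down the failed
	// sandbox, retry with an incremented Attempt) is taken from the log above.
	backoff := 100 * time.Millisecond
	const maxBackoff = 800 * time.Millisecond
	for attempt := uint32(2); attempt <= 6; attempt++ {
		if err := runPodSandbox(attempt); err != nil {
			fmt.Println("RunPodSandbox failed:", err)
			time.Sleep(backoff)
			if backoff < maxBackoff {
				backoff *= 2
			}
			continue
		}
		fmt.Println("sandbox created")
		return
	}
	fmt.Println("still failing after attempt 6; the pod stays in ContainerCreating until calico/node is healthy")
}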
Feb 13 19:47:27.134289 containerd[1877]: time="2025-02-13T19:47:27.134248888Z" level=info msg="TearDown network for sandbox \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\" successfully" Feb 13 19:47:27.134289 containerd[1877]: time="2025-02-13T19:47:27.134275311Z" level=info msg="StopPodSandbox for \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\" returns successfully" Feb 13 19:47:27.137617 containerd[1877]: time="2025-02-13T19:47:27.135485428Z" level=info msg="StopPodSandbox for \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\"" Feb 13 19:47:27.137617 containerd[1877]: time="2025-02-13T19:47:27.135588436Z" level=info msg="TearDown network for sandbox \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" successfully" Feb 13 19:47:27.137617 containerd[1877]: time="2025-02-13T19:47:27.135647619Z" level=info msg="StopPodSandbox for \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" returns successfully" Feb 13 19:47:27.137901 kubelet[2366]: I0213 19:47:27.135490 2366 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5" Feb 13 19:47:27.138020 containerd[1877]: time="2025-02-13T19:47:27.137930779Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\"" Feb 13 19:47:27.138118 containerd[1877]: time="2025-02-13T19:47:27.138093869Z" level=info msg="TearDown network for sandbox \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" successfully" Feb 13 19:47:27.138270 containerd[1877]: time="2025-02-13T19:47:27.138118984Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" returns successfully" Feb 13 19:47:27.138330 containerd[1877]: time="2025-02-13T19:47:27.138270776Z" level=info msg="StopPodSandbox for \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\"" Feb 13 19:47:27.141608 containerd[1877]: time="2025-02-13T19:47:27.139142858Z" level=info msg="Ensure that sandbox f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5 in task-service has been cleanup successfully" Feb 13 19:47:27.141683 systemd[1]: run-netns-cni\x2d5f116d86\x2dd3a5\x2dec1d\x2d4b29\x2dfeb6deae2554.mount: Deactivated successfully. 
Feb 13 19:47:27.143451 containerd[1877]: time="2025-02-13T19:47:27.142573269Z" level=info msg="TearDown network for sandbox \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\" successfully" Feb 13 19:47:27.143451 containerd[1877]: time="2025-02-13T19:47:27.142811373Z" level=info msg="StopPodSandbox for \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\" returns successfully" Feb 13 19:47:27.144905 containerd[1877]: time="2025-02-13T19:47:27.144875245Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\"" Feb 13 19:47:27.145018 containerd[1877]: time="2025-02-13T19:47:27.144993478Z" level=info msg="TearDown network for sandbox \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" successfully" Feb 13 19:47:27.145096 containerd[1877]: time="2025-02-13T19:47:27.145015808Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" returns successfully" Feb 13 19:47:27.145146 containerd[1877]: time="2025-02-13T19:47:27.145112964Z" level=info msg="StopPodSandbox for \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\"" Feb 13 19:47:27.145207 containerd[1877]: time="2025-02-13T19:47:27.145191504Z" level=info msg="TearDown network for sandbox \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\" successfully" Feb 13 19:47:27.145256 containerd[1877]: time="2025-02-13T19:47:27.145207605Z" level=info msg="StopPodSandbox for \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\" returns successfully" Feb 13 19:47:27.145756 containerd[1877]: time="2025-02-13T19:47:27.145708712Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\"" Feb 13 19:47:27.145864 containerd[1877]: time="2025-02-13T19:47:27.145824991Z" level=info msg="TearDown network for sandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" successfully" Feb 13 19:47:27.145864 containerd[1877]: time="2025-02-13T19:47:27.145842933Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" returns successfully" Feb 13 19:47:27.146043 containerd[1877]: time="2025-02-13T19:47:27.145916571Z" level=info msg="StopPodSandbox for \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\"" Feb 13 19:47:27.146099 containerd[1877]: time="2025-02-13T19:47:27.146073997Z" level=info msg="TearDown network for sandbox \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\" successfully" Feb 13 19:47:27.146099 containerd[1877]: time="2025-02-13T19:47:27.146090189Z" level=info msg="StopPodSandbox for \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\" returns successfully" Feb 13 19:47:27.148039 containerd[1877]: time="2025-02-13T19:47:27.147804062Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:6,}" Feb 13 19:47:27.149526 containerd[1877]: time="2025-02-13T19:47:27.149495648Z" level=info msg="StopPodSandbox for \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\"" Feb 13 19:47:27.149625 containerd[1877]: time="2025-02-13T19:47:27.149607348Z" level=info msg="TearDown network for sandbox \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" successfully" Feb 13 19:47:27.149678 containerd[1877]: time="2025-02-13T19:47:27.149622874Z" level=info msg="StopPodSandbox for 
\"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" returns successfully" Feb 13 19:47:27.150113 containerd[1877]: time="2025-02-13T19:47:27.150035544Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\"" Feb 13 19:47:27.150178 containerd[1877]: time="2025-02-13T19:47:27.150127572Z" level=info msg="TearDown network for sandbox \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" successfully" Feb 13 19:47:27.150178 containerd[1877]: time="2025-02-13T19:47:27.150142336Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" returns successfully" Feb 13 19:47:27.152015 containerd[1877]: time="2025-02-13T19:47:27.151986966Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:5,}" Feb 13 19:47:27.344870 containerd[1877]: time="2025-02-13T19:47:27.344633791Z" level=error msg="Failed to destroy network for sandbox \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:27.346550 containerd[1877]: time="2025-02-13T19:47:27.346363355Z" level=error msg="encountered an error cleaning up failed sandbox \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:27.346550 containerd[1877]: time="2025-02-13T19:47:27.346466992Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:5,} failed, error" error="failed to setup network for sandbox \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:27.348116 kubelet[2366]: E0213 19:47:27.347616 2366 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:27.348116 kubelet[2366]: E0213 19:47:27.347683 2366 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:27.348116 kubelet[2366]: E0213 19:47:27.347711 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:27.348326 kubelet[2366]: E0213 19:47:27.347869 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-85f456d6dd-jk2lg_default(230ad7c3-5282-4fe6-ba51-061a5c33f268)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-85f456d6dd-jk2lg_default(230ad7c3-5282-4fe6-ba51-061a5c33f268)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-85f456d6dd-jk2lg" podUID="230ad7c3-5282-4fe6-ba51-061a5c33f268" Feb 13 19:47:27.369899 containerd[1877]: time="2025-02-13T19:47:27.369687725Z" level=error msg="Failed to destroy network for sandbox \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:27.370768 containerd[1877]: time="2025-02-13T19:47:27.370394982Z" level=error msg="encountered an error cleaning up failed sandbox \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:27.370768 containerd[1877]: time="2025-02-13T19:47:27.370496743Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:27.373085 kubelet[2366]: E0213 19:47:27.372540 2366 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:27.373085 kubelet[2366]: E0213 19:47:27.372624 2366 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:27.373085 kubelet[2366]: E0213 19:47:27.372654 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:27.373303 kubelet[2366]: E0213 19:47:27.373032 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9th5q" podUID="b75d0e53-a7c0-48db-98b8-b75d74f4d4d5" Feb 13 19:47:27.662820 kubelet[2366]: E0213 19:47:27.662567 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:27.880886 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207-shm.mount: Deactivated successfully. Feb 13 19:47:28.144049 kubelet[2366]: I0213 19:47:28.144016 2366 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207" Feb 13 19:47:28.146842 containerd[1877]: time="2025-02-13T19:47:28.146397012Z" level=info msg="StopPodSandbox for \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\"" Feb 13 19:47:28.147096 containerd[1877]: time="2025-02-13T19:47:28.147064224Z" level=info msg="Ensure that sandbox 3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207 in task-service has been cleanup successfully" Feb 13 19:47:28.148776 containerd[1877]: time="2025-02-13T19:47:28.147566617Z" level=info msg="TearDown network for sandbox \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\" successfully" Feb 13 19:47:28.148776 containerd[1877]: time="2025-02-13T19:47:28.147592389Z" level=info msg="StopPodSandbox for \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\" returns successfully" Feb 13 19:47:28.149112 systemd[1]: run-netns-cni\x2d7ddcaea8\x2d40c0\x2df2be\x2dda43\x2d6a921fed6c50.mount: Deactivated successfully. 
Feb 13 19:47:28.152710 containerd[1877]: time="2025-02-13T19:47:28.152674798Z" level=info msg="StopPodSandbox for \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\"" Feb 13 19:47:28.152954 containerd[1877]: time="2025-02-13T19:47:28.152931653Z" level=info msg="TearDown network for sandbox \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\" successfully" Feb 13 19:47:28.153010 containerd[1877]: time="2025-02-13T19:47:28.152956106Z" level=info msg="StopPodSandbox for \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\" returns successfully" Feb 13 19:47:28.154199 containerd[1877]: time="2025-02-13T19:47:28.154036930Z" level=info msg="StopPodSandbox for \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\"" Feb 13 19:47:28.155634 kubelet[2366]: I0213 19:47:28.154946 2366 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896" Feb 13 19:47:28.155723 containerd[1877]: time="2025-02-13T19:47:28.154947377Z" level=info msg="TearDown network for sandbox \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\" successfully" Feb 13 19:47:28.155723 containerd[1877]: time="2025-02-13T19:47:28.154969998Z" level=info msg="StopPodSandbox for \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\" returns successfully" Feb 13 19:47:28.155723 containerd[1877]: time="2025-02-13T19:47:28.155651452Z" level=info msg="StopPodSandbox for \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\"" Feb 13 19:47:28.156058 containerd[1877]: time="2025-02-13T19:47:28.155988089Z" level=info msg="StopPodSandbox for \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\"" Feb 13 19:47:28.156134 containerd[1877]: time="2025-02-13T19:47:28.156081281Z" level=info msg="TearDown network for sandbox \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" successfully" Feb 13 19:47:28.156134 containerd[1877]: time="2025-02-13T19:47:28.156096693Z" level=info msg="StopPodSandbox for \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" returns successfully" Feb 13 19:47:28.156634 containerd[1877]: time="2025-02-13T19:47:28.156608163Z" level=info msg="Ensure that sandbox 4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896 in task-service has been cleanup successfully" Feb 13 19:47:28.157048 containerd[1877]: time="2025-02-13T19:47:28.157022661Z" level=info msg="TearDown network for sandbox \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\" successfully" Feb 13 19:47:28.157272 containerd[1877]: time="2025-02-13T19:47:28.157049299Z" level=info msg="StopPodSandbox for \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\" returns successfully" Feb 13 19:47:28.157272 containerd[1877]: time="2025-02-13T19:47:28.157198032Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\"" Feb 13 19:47:28.157805 containerd[1877]: time="2025-02-13T19:47:28.157287621Z" level=info msg="TearDown network for sandbox \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" successfully" Feb 13 19:47:28.157805 containerd[1877]: time="2025-02-13T19:47:28.157302384Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" returns successfully" Feb 13 19:47:28.157805 containerd[1877]: time="2025-02-13T19:47:28.157416812Z" level=info msg="StopPodSandbox for 
\"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\"" Feb 13 19:47:28.157805 containerd[1877]: time="2025-02-13T19:47:28.157586867Z" level=info msg="TearDown network for sandbox \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\" successfully" Feb 13 19:47:28.157805 containerd[1877]: time="2025-02-13T19:47:28.157605907Z" level=info msg="StopPodSandbox for \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\" returns successfully" Feb 13 19:47:28.160302 containerd[1877]: time="2025-02-13T19:47:28.160260911Z" level=info msg="StopPodSandbox for \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\"" Feb 13 19:47:28.160379 containerd[1877]: time="2025-02-13T19:47:28.160363056Z" level=info msg="TearDown network for sandbox \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\" successfully" Feb 13 19:47:28.160439 containerd[1877]: time="2025-02-13T19:47:28.160378739Z" level=info msg="StopPodSandbox for \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\" returns successfully" Feb 13 19:47:28.160486 containerd[1877]: time="2025-02-13T19:47:28.160462555Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\"" Feb 13 19:47:28.160828 containerd[1877]: time="2025-02-13T19:47:28.160537812Z" level=info msg="TearDown network for sandbox \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" successfully" Feb 13 19:47:28.160828 containerd[1877]: time="2025-02-13T19:47:28.160553845Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" returns successfully" Feb 13 19:47:28.162499 systemd[1]: run-netns-cni\x2de1d9330e\x2d10c5\x2d884a\x2df1d1\x2d53fd4f1bf80e.mount: Deactivated successfully. 
Feb 13 19:47:28.163804 containerd[1877]: time="2025-02-13T19:47:28.163321046Z" level=info msg="StopPodSandbox for \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\"" Feb 13 19:47:28.163804 containerd[1877]: time="2025-02-13T19:47:28.163434367Z" level=info msg="TearDown network for sandbox \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\" successfully" Feb 13 19:47:28.163804 containerd[1877]: time="2025-02-13T19:47:28.163450340Z" level=info msg="StopPodSandbox for \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\" returns successfully" Feb 13 19:47:28.163804 containerd[1877]: time="2025-02-13T19:47:28.163637717Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\"" Feb 13 19:47:28.166160 containerd[1877]: time="2025-02-13T19:47:28.163726980Z" level=info msg="TearDown network for sandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" successfully" Feb 13 19:47:28.166266 containerd[1877]: time="2025-02-13T19:47:28.166160672Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" returns successfully" Feb 13 19:47:28.166455 containerd[1877]: time="2025-02-13T19:47:28.166429143Z" level=info msg="StopPodSandbox for \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\"" Feb 13 19:47:28.166554 containerd[1877]: time="2025-02-13T19:47:28.166531676Z" level=info msg="TearDown network for sandbox \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" successfully" Feb 13 19:47:28.166605 containerd[1877]: time="2025-02-13T19:47:28.166550709Z" level=info msg="StopPodSandbox for \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" returns successfully" Feb 13 19:47:28.168168 containerd[1877]: time="2025-02-13T19:47:28.167789183Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:7,}" Feb 13 19:47:28.168168 containerd[1877]: time="2025-02-13T19:47:28.168154157Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\"" Feb 13 19:47:28.168290 containerd[1877]: time="2025-02-13T19:47:28.168250189Z" level=info msg="TearDown network for sandbox \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" successfully" Feb 13 19:47:28.168290 containerd[1877]: time="2025-02-13T19:47:28.168265864Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" returns successfully" Feb 13 19:47:28.169767 containerd[1877]: time="2025-02-13T19:47:28.169149275Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:6,}" Feb 13 19:47:28.354190 containerd[1877]: time="2025-02-13T19:47:28.354136348Z" level=error msg="Failed to destroy network for sandbox \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:28.356011 containerd[1877]: time="2025-02-13T19:47:28.355965315Z" level=error msg="encountered an error cleaning up failed sandbox \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:28.356167 containerd[1877]: time="2025-02-13T19:47:28.356055398Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:6,} failed, error" error="failed to setup network for sandbox \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:28.356867 kubelet[2366]: E0213 19:47:28.356629 2366 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:28.356867 kubelet[2366]: E0213 19:47:28.356695 2366 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:28.356867 kubelet[2366]: E0213 19:47:28.356794 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:28.357305 kubelet[2366]: E0213 19:47:28.356857 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-85f456d6dd-jk2lg_default(230ad7c3-5282-4fe6-ba51-061a5c33f268)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-85f456d6dd-jk2lg_default(230ad7c3-5282-4fe6-ba51-061a5c33f268)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-85f456d6dd-jk2lg" podUID="230ad7c3-5282-4fe6-ba51-061a5c33f268" Feb 13 19:47:28.467165 containerd[1877]: time="2025-02-13T19:47:28.466852027Z" level=error msg="Failed to destroy network for sandbox \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:28.468221 containerd[1877]: time="2025-02-13T19:47:28.468097495Z" level=error msg="encountered an error cleaning up failed sandbox 
\"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:28.468476 containerd[1877]: time="2025-02-13T19:47:28.468402938Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:7,} failed, error" error="failed to setup network for sandbox \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:28.469771 kubelet[2366]: E0213 19:47:28.469530 2366 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:28.469771 kubelet[2366]: E0213 19:47:28.469701 2366 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:28.470481 kubelet[2366]: E0213 19:47:28.470300 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:28.471168 kubelet[2366]: E0213 19:47:28.470674 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9th5q" podUID="b75d0e53-a7c0-48db-98b8-b75d74f4d4d5" Feb 13 19:47:28.662977 kubelet[2366]: E0213 19:47:28.662851 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:28.877593 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97-shm.mount: Deactivated successfully. 
Feb 13 19:47:28.878185 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9-shm.mount: Deactivated successfully. Feb 13 19:47:29.163034 kubelet[2366]: I0213 19:47:29.162814 2366 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9" Feb 13 19:47:29.164413 containerd[1877]: time="2025-02-13T19:47:29.164012176Z" level=info msg="StopPodSandbox for \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\"" Feb 13 19:47:29.164413 containerd[1877]: time="2025-02-13T19:47:29.164255023Z" level=info msg="Ensure that sandbox be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9 in task-service has been cleanup successfully" Feb 13 19:47:29.165831 containerd[1877]: time="2025-02-13T19:47:29.165803487Z" level=info msg="TearDown network for sandbox \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\" successfully" Feb 13 19:47:29.166750 containerd[1877]: time="2025-02-13T19:47:29.166098349Z" level=info msg="StopPodSandbox for \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\" returns successfully" Feb 13 19:47:29.167680 systemd[1]: run-netns-cni\x2d69adde36\x2d883a\x2d053d\x2d66aa\x2dd2533658e8ae.mount: Deactivated successfully. Feb 13 19:47:29.169789 containerd[1877]: time="2025-02-13T19:47:29.169069888Z" level=info msg="StopPodSandbox for \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\"" Feb 13 19:47:29.169789 containerd[1877]: time="2025-02-13T19:47:29.169179962Z" level=info msg="TearDown network for sandbox \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\" successfully" Feb 13 19:47:29.169789 containerd[1877]: time="2025-02-13T19:47:29.169196773Z" level=info msg="StopPodSandbox for \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\" returns successfully" Feb 13 19:47:29.170469 containerd[1877]: time="2025-02-13T19:47:29.170084090Z" level=info msg="StopPodSandbox for \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\"" Feb 13 19:47:29.170469 containerd[1877]: time="2025-02-13T19:47:29.170186910Z" level=info msg="TearDown network for sandbox \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\" successfully" Feb 13 19:47:29.170469 containerd[1877]: time="2025-02-13T19:47:29.170204544Z" level=info msg="StopPodSandbox for \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\" returns successfully" Feb 13 19:47:29.170925 containerd[1877]: time="2025-02-13T19:47:29.170779632Z" level=info msg="StopPodSandbox for \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\"" Feb 13 19:47:29.170925 containerd[1877]: time="2025-02-13T19:47:29.170879583Z" level=info msg="TearDown network for sandbox \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\" successfully" Feb 13 19:47:29.170925 containerd[1877]: time="2025-02-13T19:47:29.170895912Z" level=info msg="StopPodSandbox for \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\" returns successfully" Feb 13 19:47:29.172452 containerd[1877]: time="2025-02-13T19:47:29.172401827Z" level=info msg="StopPodSandbox for \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\"" Feb 13 19:47:29.172515 containerd[1877]: time="2025-02-13T19:47:29.172495447Z" level=info msg="TearDown network for sandbox \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" successfully" Feb 13 
19:47:29.172562 containerd[1877]: time="2025-02-13T19:47:29.172510932Z" level=info msg="StopPodSandbox for \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" returns successfully" Feb 13 19:47:29.172997 containerd[1877]: time="2025-02-13T19:47:29.172884807Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\"" Feb 13 19:47:29.172997 containerd[1877]: time="2025-02-13T19:47:29.172970999Z" level=info msg="TearDown network for sandbox \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" successfully" Feb 13 19:47:29.172997 containerd[1877]: time="2025-02-13T19:47:29.172984296Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" returns successfully" Feb 13 19:47:29.173787 containerd[1877]: time="2025-02-13T19:47:29.173670106Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\"" Feb 13 19:47:29.174105 containerd[1877]: time="2025-02-13T19:47:29.174026396Z" level=info msg="TearDown network for sandbox \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" successfully" Feb 13 19:47:29.174105 containerd[1877]: time="2025-02-13T19:47:29.174047023Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" returns successfully" Feb 13 19:47:29.174646 containerd[1877]: time="2025-02-13T19:47:29.174576804Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\"" Feb 13 19:47:29.174725 containerd[1877]: time="2025-02-13T19:47:29.174681493Z" level=info msg="TearDown network for sandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" successfully" Feb 13 19:47:29.174725 containerd[1877]: time="2025-02-13T19:47:29.174698640Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" returns successfully" Feb 13 19:47:29.176380 containerd[1877]: time="2025-02-13T19:47:29.176347209Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:8,}" Feb 13 19:47:29.182766 kubelet[2366]: I0213 19:47:29.182641 2366 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97" Feb 13 19:47:29.188225 containerd[1877]: time="2025-02-13T19:47:29.188185433Z" level=info msg="StopPodSandbox for \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\"" Feb 13 19:47:29.188426 containerd[1877]: time="2025-02-13T19:47:29.188401928Z" level=info msg="Ensure that sandbox d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97 in task-service has been cleanup successfully" Feb 13 19:47:29.189020 containerd[1877]: time="2025-02-13T19:47:29.188862879Z" level=info msg="TearDown network for sandbox \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\" successfully" Feb 13 19:47:29.189020 containerd[1877]: time="2025-02-13T19:47:29.188890161Z" level=info msg="StopPodSandbox for \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\" returns successfully" Feb 13 19:47:29.191617 containerd[1877]: time="2025-02-13T19:47:29.191510151Z" level=info msg="StopPodSandbox for \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\"" Feb 13 19:47:29.195095 containerd[1877]: time="2025-02-13T19:47:29.191621292Z" level=info msg="TearDown 
network for sandbox \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\" successfully" Feb 13 19:47:29.195095 containerd[1877]: time="2025-02-13T19:47:29.191636413Z" level=info msg="StopPodSandbox for \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\" returns successfully" Feb 13 19:47:29.192837 systemd[1]: run-netns-cni\x2d8c1ab323\x2deb77\x2d29fc\x2d16d5\x2d23f04fae7ba7.mount: Deactivated successfully. Feb 13 19:47:29.198180 containerd[1877]: time="2025-02-13T19:47:29.196822285Z" level=info msg="StopPodSandbox for \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\"" Feb 13 19:47:29.198180 containerd[1877]: time="2025-02-13T19:47:29.197440743Z" level=info msg="TearDown network for sandbox \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\" successfully" Feb 13 19:47:29.198180 containerd[1877]: time="2025-02-13T19:47:29.197463489Z" level=info msg="StopPodSandbox for \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\" returns successfully" Feb 13 19:47:29.198369 containerd[1877]: time="2025-02-13T19:47:29.198293143Z" level=info msg="StopPodSandbox for \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\"" Feb 13 19:47:29.198415 containerd[1877]: time="2025-02-13T19:47:29.198390292Z" level=info msg="TearDown network for sandbox \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\" successfully" Feb 13 19:47:29.198415 containerd[1877]: time="2025-02-13T19:47:29.198404518Z" level=info msg="StopPodSandbox for \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\" returns successfully" Feb 13 19:47:29.199819 containerd[1877]: time="2025-02-13T19:47:29.199530709Z" level=info msg="StopPodSandbox for \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\"" Feb 13 19:47:29.204759 containerd[1877]: time="2025-02-13T19:47:29.202047193Z" level=info msg="TearDown network for sandbox \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\" successfully" Feb 13 19:47:29.204759 containerd[1877]: time="2025-02-13T19:47:29.202077697Z" level=info msg="StopPodSandbox for \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\" returns successfully" Feb 13 19:47:29.205373 containerd[1877]: time="2025-02-13T19:47:29.205343527Z" level=info msg="StopPodSandbox for \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\"" Feb 13 19:47:29.205509 containerd[1877]: time="2025-02-13T19:47:29.205450891Z" level=info msg="TearDown network for sandbox \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" successfully" Feb 13 19:47:29.205509 containerd[1877]: time="2025-02-13T19:47:29.205473412Z" level=info msg="StopPodSandbox for \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" returns successfully" Feb 13 19:47:29.207965 containerd[1877]: time="2025-02-13T19:47:29.207935683Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\"" Feb 13 19:47:29.208069 containerd[1877]: time="2025-02-13T19:47:29.208049528Z" level=info msg="TearDown network for sandbox \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" successfully" Feb 13 19:47:29.208142 containerd[1877]: time="2025-02-13T19:47:29.208071934Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" returns successfully" Feb 13 19:47:29.210461 containerd[1877]: time="2025-02-13T19:47:29.208880585Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:7,}" Feb 13 19:47:29.429868 containerd[1877]: time="2025-02-13T19:47:29.425985552Z" level=error msg="Failed to destroy network for sandbox \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:29.433826 containerd[1877]: time="2025-02-13T19:47:29.433770963Z" level=error msg="encountered an error cleaning up failed sandbox \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:29.434041 containerd[1877]: time="2025-02-13T19:47:29.434013712Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:8,} failed, error" error="failed to setup network for sandbox \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:29.434395 kubelet[2366]: E0213 19:47:29.434359 2366 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:29.434846 kubelet[2366]: E0213 19:47:29.434818 2366 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:29.434978 kubelet[2366]: E0213 19:47:29.434958 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:29.435840 kubelet[2366]: E0213 19:47:29.435108 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9th5q" podUID="b75d0e53-a7c0-48db-98b8-b75d74f4d4d5" Feb 13 19:47:29.471546 containerd[1877]: time="2025-02-13T19:47:29.471418298Z" level=error msg="Failed to destroy network for sandbox \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:29.472958 containerd[1877]: time="2025-02-13T19:47:29.472646308Z" level=error msg="encountered an error cleaning up failed sandbox \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:29.472958 containerd[1877]: time="2025-02-13T19:47:29.472832137Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:7,} failed, error" error="failed to setup network for sandbox \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:29.473784 kubelet[2366]: E0213 19:47:29.473373 2366 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:29.473784 kubelet[2366]: E0213 19:47:29.473439 2366 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:29.473784 kubelet[2366]: E0213 19:47:29.473463 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:29.474579 kubelet[2366]: E0213 19:47:29.474392 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-85f456d6dd-jk2lg_default(230ad7c3-5282-4fe6-ba51-061a5c33f268)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-85f456d6dd-jk2lg_default(230ad7c3-5282-4fe6-ba51-061a5c33f268)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\\\": plugin 
type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-85f456d6dd-jk2lg" podUID="230ad7c3-5282-4fe6-ba51-061a5c33f268" Feb 13 19:47:29.664102 kubelet[2366]: E0213 19:47:29.664059 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:29.880087 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf-shm.mount: Deactivated successfully. Feb 13 19:47:30.202745 kubelet[2366]: I0213 19:47:30.201578 2366 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4" Feb 13 19:47:30.204662 containerd[1877]: time="2025-02-13T19:47:30.204603239Z" level=info msg="StopPodSandbox for \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\"" Feb 13 19:47:30.205179 containerd[1877]: time="2025-02-13T19:47:30.205065097Z" level=info msg="Ensure that sandbox a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4 in task-service has been cleanup successfully" Feb 13 19:47:30.205513 containerd[1877]: time="2025-02-13T19:47:30.205472180Z" level=info msg="TearDown network for sandbox \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\" successfully" Feb 13 19:47:30.205637 containerd[1877]: time="2025-02-13T19:47:30.205619222Z" level=info msg="StopPodSandbox for \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\" returns successfully" Feb 13 19:47:30.214617 containerd[1877]: time="2025-02-13T19:47:30.208887253Z" level=info msg="StopPodSandbox for \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\"" Feb 13 19:47:30.214617 containerd[1877]: time="2025-02-13T19:47:30.209012753Z" level=info msg="TearDown network for sandbox \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\" successfully" Feb 13 19:47:30.214617 containerd[1877]: time="2025-02-13T19:47:30.213490988Z" level=info msg="StopPodSandbox for \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\" returns successfully" Feb 13 19:47:30.214147 systemd[1]: run-netns-cni\x2d24be7ba9\x2de9e1\x2ddec5\x2de70d\x2d7a4451dc5ea0.mount: Deactivated successfully. 
Feb 13 19:47:30.238261 containerd[1877]: time="2025-02-13T19:47:30.237231290Z" level=info msg="StopPodSandbox for \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\"" Feb 13 19:47:30.238261 containerd[1877]: time="2025-02-13T19:47:30.237348735Z" level=info msg="TearDown network for sandbox \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\" successfully" Feb 13 19:47:30.238261 containerd[1877]: time="2025-02-13T19:47:30.237411033Z" level=info msg="StopPodSandbox for \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\" returns successfully" Feb 13 19:47:30.239301 containerd[1877]: time="2025-02-13T19:47:30.239016397Z" level=info msg="StopPodSandbox for \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\"" Feb 13 19:47:30.239301 containerd[1877]: time="2025-02-13T19:47:30.239123509Z" level=info msg="TearDown network for sandbox \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\" successfully" Feb 13 19:47:30.239301 containerd[1877]: time="2025-02-13T19:47:30.239140998Z" level=info msg="StopPodSandbox for \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\" returns successfully" Feb 13 19:47:30.241190 containerd[1877]: time="2025-02-13T19:47:30.240614402Z" level=info msg="StopPodSandbox for \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\"" Feb 13 19:47:30.241190 containerd[1877]: time="2025-02-13T19:47:30.240718251Z" level=info msg="TearDown network for sandbox \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\" successfully" Feb 13 19:47:30.241190 containerd[1877]: time="2025-02-13T19:47:30.240753428Z" level=info msg="StopPodSandbox for \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\" returns successfully" Feb 13 19:47:30.242215 containerd[1877]: time="2025-02-13T19:47:30.241953172Z" level=info msg="StopPodSandbox for \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\"" Feb 13 19:47:30.242363 containerd[1877]: time="2025-02-13T19:47:30.242335590Z" level=info msg="TearDown network for sandbox \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\" successfully" Feb 13 19:47:30.242416 containerd[1877]: time="2025-02-13T19:47:30.242361135Z" level=info msg="StopPodSandbox for \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\" returns successfully" Feb 13 19:47:30.243510 containerd[1877]: time="2025-02-13T19:47:30.242898340Z" level=info msg="StopPodSandbox for \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\"" Feb 13 19:47:30.243510 containerd[1877]: time="2025-02-13T19:47:30.243060551Z" level=info msg="TearDown network for sandbox \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" successfully" Feb 13 19:47:30.243510 containerd[1877]: time="2025-02-13T19:47:30.243078941Z" level=info msg="StopPodSandbox for \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" returns successfully" Feb 13 19:47:30.243950 containerd[1877]: time="2025-02-13T19:47:30.243927052Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\"" Feb 13 19:47:30.244115 containerd[1877]: time="2025-02-13T19:47:30.244097475Z" level=info msg="TearDown network for sandbox \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" successfully" Feb 13 19:47:30.244257 containerd[1877]: time="2025-02-13T19:47:30.244239809Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" 
returns successfully" Feb 13 19:47:30.245225 containerd[1877]: time="2025-02-13T19:47:30.245198133Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:8,}" Feb 13 19:47:30.256471 kubelet[2366]: I0213 19:47:30.256409 2366 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf" Feb 13 19:47:30.258958 containerd[1877]: time="2025-02-13T19:47:30.258918094Z" level=info msg="StopPodSandbox for \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\"" Feb 13 19:47:30.259185 containerd[1877]: time="2025-02-13T19:47:30.259150337Z" level=info msg="Ensure that sandbox 7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf in task-service has been cleanup successfully" Feb 13 19:47:30.260092 containerd[1877]: time="2025-02-13T19:47:30.260055455Z" level=info msg="TearDown network for sandbox \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\" successfully" Feb 13 19:47:30.260092 containerd[1877]: time="2025-02-13T19:47:30.260083384Z" level=info msg="StopPodSandbox for \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\" returns successfully" Feb 13 19:47:30.264804 containerd[1877]: time="2025-02-13T19:47:30.264150109Z" level=info msg="StopPodSandbox for \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\"" Feb 13 19:47:30.264804 containerd[1877]: time="2025-02-13T19:47:30.264336990Z" level=info msg="TearDown network for sandbox \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\" successfully" Feb 13 19:47:30.264804 containerd[1877]: time="2025-02-13T19:47:30.264356537Z" level=info msg="StopPodSandbox for \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\" returns successfully" Feb 13 19:47:30.268685 systemd[1]: run-netns-cni\x2d03b7e4a5\x2dfa29\x2d120a\x2dec57\x2d286cd8307041.mount: Deactivated successfully. 
Feb 13 19:47:30.286413 containerd[1877]: time="2025-02-13T19:47:30.286372976Z" level=info msg="StopPodSandbox for \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\"" Feb 13 19:47:30.296090 containerd[1877]: time="2025-02-13T19:47:30.296045084Z" level=info msg="TearDown network for sandbox \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\" successfully" Feb 13 19:47:30.300917 containerd[1877]: time="2025-02-13T19:47:30.300443147Z" level=info msg="StopPodSandbox for \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\" returns successfully" Feb 13 19:47:30.303921 containerd[1877]: time="2025-02-13T19:47:30.303862216Z" level=info msg="StopPodSandbox for \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\"" Feb 13 19:47:30.304094 containerd[1877]: time="2025-02-13T19:47:30.304000506Z" level=info msg="TearDown network for sandbox \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\" successfully" Feb 13 19:47:30.304094 containerd[1877]: time="2025-02-13T19:47:30.304016491Z" level=info msg="StopPodSandbox for \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\" returns successfully" Feb 13 19:47:30.310636 containerd[1877]: time="2025-02-13T19:47:30.310513074Z" level=info msg="StopPodSandbox for \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\"" Feb 13 19:47:30.310919 containerd[1877]: time="2025-02-13T19:47:30.310642832Z" level=info msg="TearDown network for sandbox \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\" successfully" Feb 13 19:47:30.310919 containerd[1877]: time="2025-02-13T19:47:30.310697550Z" level=info msg="StopPodSandbox for \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\" returns successfully" Feb 13 19:47:30.326784 containerd[1877]: time="2025-02-13T19:47:30.325540585Z" level=info msg="StopPodSandbox for \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\"" Feb 13 19:47:30.326784 containerd[1877]: time="2025-02-13T19:47:30.325660803Z" level=info msg="TearDown network for sandbox \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" successfully" Feb 13 19:47:30.326784 containerd[1877]: time="2025-02-13T19:47:30.325676657Z" level=info msg="StopPodSandbox for \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" returns successfully" Feb 13 19:47:30.333763 containerd[1877]: time="2025-02-13T19:47:30.333456953Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\"" Feb 13 19:47:30.333763 containerd[1877]: time="2025-02-13T19:47:30.333580531Z" level=info msg="TearDown network for sandbox \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" successfully" Feb 13 19:47:30.333763 containerd[1877]: time="2025-02-13T19:47:30.333596824Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" returns successfully" Feb 13 19:47:30.337549 containerd[1877]: time="2025-02-13T19:47:30.337323799Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\"" Feb 13 19:47:30.337549 containerd[1877]: time="2025-02-13T19:47:30.337439945Z" level=info msg="TearDown network for sandbox \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" successfully" Feb 13 19:47:30.337549 containerd[1877]: time="2025-02-13T19:47:30.337456352Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" 
returns successfully" Feb 13 19:47:30.338364 containerd[1877]: time="2025-02-13T19:47:30.338304451Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\"" Feb 13 19:47:30.338595 containerd[1877]: time="2025-02-13T19:47:30.338574213Z" level=info msg="TearDown network for sandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" successfully" Feb 13 19:47:30.338700 containerd[1877]: time="2025-02-13T19:47:30.338683384Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" returns successfully" Feb 13 19:47:30.340015 containerd[1877]: time="2025-02-13T19:47:30.339894928Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:9,}" Feb 13 19:47:30.549025 containerd[1877]: time="2025-02-13T19:47:30.548810504Z" level=error msg="Failed to destroy network for sandbox \"cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:30.550771 containerd[1877]: time="2025-02-13T19:47:30.550004548Z" level=error msg="encountered an error cleaning up failed sandbox \"cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:30.550771 containerd[1877]: time="2025-02-13T19:47:30.550093614Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:8,} failed, error" error="failed to setup network for sandbox \"cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:30.550968 kubelet[2366]: E0213 19:47:30.550528 2366 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:30.550968 kubelet[2366]: E0213 19:47:30.550595 2366 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:30.550968 kubelet[2366]: E0213 19:47:30.550622 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:30.551256 kubelet[2366]: E0213 19:47:30.550679 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-85f456d6dd-jk2lg_default(230ad7c3-5282-4fe6-ba51-061a5c33f268)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-85f456d6dd-jk2lg_default(230ad7c3-5282-4fe6-ba51-061a5c33f268)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-85f456d6dd-jk2lg" podUID="230ad7c3-5282-4fe6-ba51-061a5c33f268" Feb 13 19:47:30.575559 containerd[1877]: time="2025-02-13T19:47:30.574870828Z" level=error msg="Failed to destroy network for sandbox \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:30.575559 containerd[1877]: time="2025-02-13T19:47:30.575359191Z" level=error msg="encountered an error cleaning up failed sandbox \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:30.575559 containerd[1877]: time="2025-02-13T19:47:30.575440005Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:9,} failed, error" error="failed to setup network for sandbox \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:30.575824 kubelet[2366]: E0213 19:47:30.575682 2366 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:30.577360 kubelet[2366]: E0213 19:47:30.576938 2366 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:30.577360 kubelet[2366]: E0213 19:47:30.576988 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:30.577360 kubelet[2366]: E0213 19:47:30.577065 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9th5q" podUID="b75d0e53-a7c0-48db-98b8-b75d74f4d4d5" Feb 13 19:47:30.665028 kubelet[2366]: E0213 19:47:30.664960 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:30.714571 update_engine[1861]: I20250213 19:47:30.714453 1861 update_attempter.cc:509] Updating boot flags... Feb 13 19:47:30.862867 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 38 scanned by (udev-worker) (3488) Feb 13 19:47:30.881060 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353-shm.mount: Deactivated successfully. Feb 13 19:47:31.281773 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 38 scanned by (udev-worker) (3492) Feb 13 19:47:31.288678 kubelet[2366]: I0213 19:47:31.288646 2366 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91" Feb 13 19:47:31.290867 containerd[1877]: time="2025-02-13T19:47:31.289881300Z" level=info msg="StopPodSandbox for \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\"" Feb 13 19:47:31.290867 containerd[1877]: time="2025-02-13T19:47:31.290146138Z" level=info msg="Ensure that sandbox 4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91 in task-service has been cleanup successfully" Feb 13 19:47:31.290867 containerd[1877]: time="2025-02-13T19:47:31.290574091Z" level=info msg="TearDown network for sandbox \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\" successfully" Feb 13 19:47:31.290867 containerd[1877]: time="2025-02-13T19:47:31.290619732Z" level=info msg="StopPodSandbox for \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\" returns successfully" Feb 13 19:47:31.294358 systemd[1]: run-netns-cni\x2d54bf84e4\x2dafd8\x2d76ac\x2d9779\x2d8a56f0c72cf4.mount: Deactivated successfully. 
Feb 13 19:47:31.296966 containerd[1877]: time="2025-02-13T19:47:31.295977893Z" level=info msg="StopPodSandbox for \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\"" Feb 13 19:47:31.296966 containerd[1877]: time="2025-02-13T19:47:31.296110155Z" level=info msg="TearDown network for sandbox \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\" successfully" Feb 13 19:47:31.296966 containerd[1877]: time="2025-02-13T19:47:31.296129610Z" level=info msg="StopPodSandbox for \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\" returns successfully" Feb 13 19:47:31.298923 containerd[1877]: time="2025-02-13T19:47:31.298466383Z" level=info msg="StopPodSandbox for \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\"" Feb 13 19:47:31.298923 containerd[1877]: time="2025-02-13T19:47:31.298581535Z" level=info msg="TearDown network for sandbox \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\" successfully" Feb 13 19:47:31.298923 containerd[1877]: time="2025-02-13T19:47:31.298596716Z" level=info msg="StopPodSandbox for \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\" returns successfully" Feb 13 19:47:31.300183 containerd[1877]: time="2025-02-13T19:47:31.299759082Z" level=info msg="StopPodSandbox for \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\"" Feb 13 19:47:31.300183 containerd[1877]: time="2025-02-13T19:47:31.299859116Z" level=info msg="TearDown network for sandbox \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\" successfully" Feb 13 19:47:31.300183 containerd[1877]: time="2025-02-13T19:47:31.299875206Z" level=info msg="StopPodSandbox for \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\" returns successfully" Feb 13 19:47:31.302429 containerd[1877]: time="2025-02-13T19:47:31.302049544Z" level=info msg="StopPodSandbox for \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\"" Feb 13 19:47:31.302429 containerd[1877]: time="2025-02-13T19:47:31.302150722Z" level=info msg="TearDown network for sandbox \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\" successfully" Feb 13 19:47:31.302429 containerd[1877]: time="2025-02-13T19:47:31.302165175Z" level=info msg="StopPodSandbox for \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\" returns successfully" Feb 13 19:47:31.304205 containerd[1877]: time="2025-02-13T19:47:31.303968313Z" level=info msg="StopPodSandbox for \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\"" Feb 13 19:47:31.304205 containerd[1877]: time="2025-02-13T19:47:31.304064701Z" level=info msg="TearDown network for sandbox \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\" successfully" Feb 13 19:47:31.304205 containerd[1877]: time="2025-02-13T19:47:31.304123916Z" level=info msg="StopPodSandbox for \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\" returns successfully" Feb 13 19:47:31.305457 containerd[1877]: time="2025-02-13T19:47:31.305102662Z" level=info msg="StopPodSandbox for \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\"" Feb 13 19:47:31.305457 containerd[1877]: time="2025-02-13T19:47:31.305216185Z" level=info msg="TearDown network for sandbox \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" successfully" Feb 13 19:47:31.305457 containerd[1877]: time="2025-02-13T19:47:31.305231867Z" level=info msg="StopPodSandbox for \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" 
returns successfully" Feb 13 19:47:31.306908 containerd[1877]: time="2025-02-13T19:47:31.306647855Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\"" Feb 13 19:47:31.307784 containerd[1877]: time="2025-02-13T19:47:31.307461052Z" level=info msg="TearDown network for sandbox \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" successfully" Feb 13 19:47:31.307784 containerd[1877]: time="2025-02-13T19:47:31.307484612Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" returns successfully" Feb 13 19:47:31.309562 containerd[1877]: time="2025-02-13T19:47:31.309305051Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\"" Feb 13 19:47:31.309562 containerd[1877]: time="2025-02-13T19:47:31.309406991Z" level=info msg="TearDown network for sandbox \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" successfully" Feb 13 19:47:31.309562 containerd[1877]: time="2025-02-13T19:47:31.309421218Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" returns successfully" Feb 13 19:47:31.312271 containerd[1877]: time="2025-02-13T19:47:31.311965882Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\"" Feb 13 19:47:31.312647 containerd[1877]: time="2025-02-13T19:47:31.312624367Z" level=info msg="TearDown network for sandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" successfully" Feb 13 19:47:31.313142 containerd[1877]: time="2025-02-13T19:47:31.313120689Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" returns successfully" Feb 13 19:47:31.314106 kubelet[2366]: I0213 19:47:31.313601 2366 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353" Feb 13 19:47:31.316256 containerd[1877]: time="2025-02-13T19:47:31.315428788Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:10,}" Feb 13 19:47:31.319816 containerd[1877]: time="2025-02-13T19:47:31.317886472Z" level=info msg="StopPodSandbox for \"cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353\"" Feb 13 19:47:31.319816 containerd[1877]: time="2025-02-13T19:47:31.318160931Z" level=info msg="Ensure that sandbox cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353 in task-service has been cleanup successfully" Feb 13 19:47:31.321160 systemd[1]: run-netns-cni\x2d20bc7005\x2dc955\x2d5651\x2db0b0\x2ddf940f206652.mount: Deactivated successfully. 
Feb 13 19:47:31.322073 containerd[1877]: time="2025-02-13T19:47:31.322026101Z" level=info msg="TearDown network for sandbox \"cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353\" successfully" Feb 13 19:47:31.322542 containerd[1877]: time="2025-02-13T19:47:31.322514924Z" level=info msg="StopPodSandbox for \"cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353\" returns successfully" Feb 13 19:47:31.327855 containerd[1877]: time="2025-02-13T19:47:31.327822356Z" level=info msg="StopPodSandbox for \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\"" Feb 13 19:47:31.328853 containerd[1877]: time="2025-02-13T19:47:31.328700083Z" level=info msg="TearDown network for sandbox \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\" successfully" Feb 13 19:47:31.331138 containerd[1877]: time="2025-02-13T19:47:31.330650603Z" level=info msg="StopPodSandbox for \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\" returns successfully" Feb 13 19:47:31.333295 containerd[1877]: time="2025-02-13T19:47:31.331637050Z" level=info msg="StopPodSandbox for \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\"" Feb 13 19:47:31.333295 containerd[1877]: time="2025-02-13T19:47:31.331720696Z" level=info msg="TearDown network for sandbox \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\" successfully" Feb 13 19:47:31.333295 containerd[1877]: time="2025-02-13T19:47:31.331745976Z" level=info msg="StopPodSandbox for \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\" returns successfully" Feb 13 19:47:31.334229 containerd[1877]: time="2025-02-13T19:47:31.334204968Z" level=info msg="StopPodSandbox for \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\"" Feb 13 19:47:31.336025 containerd[1877]: time="2025-02-13T19:47:31.335987488Z" level=info msg="TearDown network for sandbox \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\" successfully" Feb 13 19:47:31.336581 containerd[1877]: time="2025-02-13T19:47:31.336557647Z" level=info msg="StopPodSandbox for \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\" returns successfully" Feb 13 19:47:31.339157 containerd[1877]: time="2025-02-13T19:47:31.339128897Z" level=info msg="StopPodSandbox for \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\"" Feb 13 19:47:31.339602 containerd[1877]: time="2025-02-13T19:47:31.339576431Z" level=info msg="TearDown network for sandbox \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\" successfully" Feb 13 19:47:31.340724 containerd[1877]: time="2025-02-13T19:47:31.340632050Z" level=info msg="StopPodSandbox for \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\" returns successfully" Feb 13 19:47:31.341953 containerd[1877]: time="2025-02-13T19:47:31.341927023Z" level=info msg="StopPodSandbox for \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\"" Feb 13 19:47:31.342468 containerd[1877]: time="2025-02-13T19:47:31.342148257Z" level=info msg="TearDown network for sandbox \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\" successfully" Feb 13 19:47:31.342468 containerd[1877]: time="2025-02-13T19:47:31.342170454Z" level=info msg="StopPodSandbox for \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\" returns successfully" Feb 13 19:47:31.344599 containerd[1877]: time="2025-02-13T19:47:31.344569936Z" level=info msg="StopPodSandbox for 
\"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\"" Feb 13 19:47:31.347207 containerd[1877]: time="2025-02-13T19:47:31.345871137Z" level=info msg="TearDown network for sandbox \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\" successfully" Feb 13 19:47:31.347207 containerd[1877]: time="2025-02-13T19:47:31.345899514Z" level=info msg="StopPodSandbox for \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\" returns successfully" Feb 13 19:47:31.348122 containerd[1877]: time="2025-02-13T19:47:31.347892030Z" level=info msg="StopPodSandbox for \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\"" Feb 13 19:47:31.348122 containerd[1877]: time="2025-02-13T19:47:31.348080597Z" level=info msg="TearDown network for sandbox \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" successfully" Feb 13 19:47:31.348122 containerd[1877]: time="2025-02-13T19:47:31.348097302Z" level=info msg="StopPodSandbox for \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" returns successfully" Feb 13 19:47:31.352045 containerd[1877]: time="2025-02-13T19:47:31.348689696Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\"" Feb 13 19:47:31.352045 containerd[1877]: time="2025-02-13T19:47:31.349013893Z" level=info msg="TearDown network for sandbox \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" successfully" Feb 13 19:47:31.352045 containerd[1877]: time="2025-02-13T19:47:31.349032008Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" returns successfully" Feb 13 19:47:31.352045 containerd[1877]: time="2025-02-13T19:47:31.351371160Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:9,}" Feb 13 19:47:31.596760 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 38 scanned by (udev-worker) (3492) Feb 13 19:47:31.658785 containerd[1877]: time="2025-02-13T19:47:31.658701591Z" level=error msg="Failed to destroy network for sandbox \"94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:31.659300 containerd[1877]: time="2025-02-13T19:47:31.659142807Z" level=error msg="encountered an error cleaning up failed sandbox \"94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:31.659300 containerd[1877]: time="2025-02-13T19:47:31.659220252Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:10,} failed, error" error="failed to setup network for sandbox \"94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:31.661902 kubelet[2366]: E0213 19:47:31.659459 2366 remote_runtime.go:193] "RunPodSandbox from runtime service 
failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:31.661902 kubelet[2366]: E0213 19:47:31.659525 2366 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:31.661902 kubelet[2366]: E0213 19:47:31.659553 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:31.662118 kubelet[2366]: E0213 19:47:31.659606 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9th5q" podUID="b75d0e53-a7c0-48db-98b8-b75d74f4d4d5" Feb 13 19:47:31.666859 kubelet[2366]: E0213 19:47:31.666817 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:31.718633 containerd[1877]: time="2025-02-13T19:47:31.718582496Z" level=error msg="Failed to destroy network for sandbox \"c34937fbdde6b5f995539b64825166e229f7a3584c6ef404b84e210fb4727607\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:31.719002 containerd[1877]: time="2025-02-13T19:47:31.718963030Z" level=error msg="encountered an error cleaning up failed sandbox \"c34937fbdde6b5f995539b64825166e229f7a3584c6ef404b84e210fb4727607\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:31.719086 containerd[1877]: time="2025-02-13T19:47:31.719046352Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:9,} failed, error" error="failed to setup network for sandbox \"c34937fbdde6b5f995539b64825166e229f7a3584c6ef404b84e210fb4727607\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:31.719370 kubelet[2366]: E0213 19:47:31.719332 2366 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c34937fbdde6b5f995539b64825166e229f7a3584c6ef404b84e210fb4727607\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:31.719459 kubelet[2366]: E0213 19:47:31.719415 2366 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c34937fbdde6b5f995539b64825166e229f7a3584c6ef404b84e210fb4727607\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:31.719516 kubelet[2366]: E0213 19:47:31.719467 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c34937fbdde6b5f995539b64825166e229f7a3584c6ef404b84e210fb4727607\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-jk2lg" Feb 13 19:47:31.719588 kubelet[2366]: E0213 19:47:31.719559 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-85f456d6dd-jk2lg_default(230ad7c3-5282-4fe6-ba51-061a5c33f268)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-85f456d6dd-jk2lg_default(230ad7c3-5282-4fe6-ba51-061a5c33f268)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c34937fbdde6b5f995539b64825166e229f7a3584c6ef404b84e210fb4727607\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-85f456d6dd-jk2lg" podUID="230ad7c3-5282-4fe6-ba51-061a5c33f268" Feb 13 19:47:31.851922 containerd[1877]: time="2025-02-13T19:47:31.851875032Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:31.852754 containerd[1877]: time="2025-02-13T19:47:31.852689893Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=142742010" Feb 13 19:47:31.855327 containerd[1877]: time="2025-02-13T19:47:31.854485893Z" level=info msg="ImageCreate event name:\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:31.860359 containerd[1877]: time="2025-02-13T19:47:31.859160021Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:31.860359 containerd[1877]: time="2025-02-13T19:47:31.859848081Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id 
\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"142741872\" in 9.885894689s" Feb 13 19:47:31.860359 containerd[1877]: time="2025-02-13T19:47:31.859875701Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Feb 13 19:47:31.868207 containerd[1877]: time="2025-02-13T19:47:31.868166322Z" level=info msg="CreateContainer within sandbox \"c41487c001c0685ee25a0077a5e6dbc4586a54f232c04e500432faa04090c57b\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Feb 13 19:47:31.880250 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a-shm.mount: Deactivated successfully. Feb 13 19:47:31.880348 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3672994524.mount: Deactivated successfully. Feb 13 19:47:31.890969 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2019240190.mount: Deactivated successfully. Feb 13 19:47:31.899384 containerd[1877]: time="2025-02-13T19:47:31.899331979Z" level=info msg="CreateContainer within sandbox \"c41487c001c0685ee25a0077a5e6dbc4586a54f232c04e500432faa04090c57b\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"ed1684de59595384efed381e82eceb35feed579e8897e5fa626b174451b9ec05\"" Feb 13 19:47:31.900178 containerd[1877]: time="2025-02-13T19:47:31.900132604Z" level=info msg="StartContainer for \"ed1684de59595384efed381e82eceb35feed579e8897e5fa626b174451b9ec05\"" Feb 13 19:47:32.020263 systemd[1]: Started cri-containerd-ed1684de59595384efed381e82eceb35feed579e8897e5fa626b174451b9ec05.scope - libcontainer container ed1684de59595384efed381e82eceb35feed579e8897e5fa626b174451b9ec05. Feb 13 19:47:32.118682 containerd[1877]: time="2025-02-13T19:47:32.118520872Z" level=info msg="StartContainer for \"ed1684de59595384efed381e82eceb35feed579e8897e5fa626b174451b9ec05\" returns successfully" Feb 13 19:47:32.302067 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Feb 13 19:47:32.302172 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Feb 13 19:47:32.319761 kubelet[2366]: I0213 19:47:32.319642 2366 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c34937fbdde6b5f995539b64825166e229f7a3584c6ef404b84e210fb4727607" Feb 13 19:47:32.322220 containerd[1877]: time="2025-02-13T19:47:32.321802731Z" level=info msg="StopPodSandbox for \"c34937fbdde6b5f995539b64825166e229f7a3584c6ef404b84e210fb4727607\"" Feb 13 19:47:32.322486 containerd[1877]: time="2025-02-13T19:47:32.322284851Z" level=info msg="Ensure that sandbox c34937fbdde6b5f995539b64825166e229f7a3584c6ef404b84e210fb4727607 in task-service has been cleanup successfully" Feb 13 19:47:32.322983 containerd[1877]: time="2025-02-13T19:47:32.322849906Z" level=info msg="TearDown network for sandbox \"c34937fbdde6b5f995539b64825166e229f7a3584c6ef404b84e210fb4727607\" successfully" Feb 13 19:47:32.322983 containerd[1877]: time="2025-02-13T19:47:32.322872801Z" level=info msg="StopPodSandbox for \"c34937fbdde6b5f995539b64825166e229f7a3584c6ef404b84e210fb4727607\" returns successfully" Feb 13 19:47:32.323418 containerd[1877]: time="2025-02-13T19:47:32.323341576Z" level=info msg="StopPodSandbox for \"cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353\"" Feb 13 19:47:32.323661 containerd[1877]: time="2025-02-13T19:47:32.323564756Z" level=info msg="TearDown network for sandbox \"cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353\" successfully" Feb 13 19:47:32.323661 containerd[1877]: time="2025-02-13T19:47:32.323596136Z" level=info msg="StopPodSandbox for \"cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353\" returns successfully" Feb 13 19:47:32.323975 containerd[1877]: time="2025-02-13T19:47:32.323944986Z" level=info msg="StopPodSandbox for \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\"" Feb 13 19:47:32.324058 containerd[1877]: time="2025-02-13T19:47:32.324040954Z" level=info msg="TearDown network for sandbox \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\" successfully" Feb 13 19:47:32.324150 containerd[1877]: time="2025-02-13T19:47:32.324059131Z" level=info msg="StopPodSandbox for \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\" returns successfully" Feb 13 19:47:32.324651 containerd[1877]: time="2025-02-13T19:47:32.324627183Z" level=info msg="StopPodSandbox for \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\"" Feb 13 19:47:32.326057 containerd[1877]: time="2025-02-13T19:47:32.325969227Z" level=info msg="TearDown network for sandbox \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\" successfully" Feb 13 19:47:32.326057 containerd[1877]: time="2025-02-13T19:47:32.325992932Z" level=info msg="StopPodSandbox for \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\" returns successfully" Feb 13 19:47:32.328590 containerd[1877]: time="2025-02-13T19:47:32.328552139Z" level=info msg="StopPodSandbox for \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\"" Feb 13 19:47:32.328717 containerd[1877]: time="2025-02-13T19:47:32.328656845Z" level=info msg="TearDown network for sandbox \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\" successfully" Feb 13 19:47:32.328717 containerd[1877]: time="2025-02-13T19:47:32.328674545Z" level=info msg="StopPodSandbox for \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\" returns successfully" Feb 13 19:47:32.329476 containerd[1877]: time="2025-02-13T19:47:32.329280644Z" level=info msg="StopPodSandbox for 
\"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\"" Feb 13 19:47:32.329560 containerd[1877]: time="2025-02-13T19:47:32.329540172Z" level=info msg="TearDown network for sandbox \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\" successfully" Feb 13 19:47:32.329645 containerd[1877]: time="2025-02-13T19:47:32.329559932Z" level=info msg="StopPodSandbox for \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\" returns successfully" Feb 13 19:47:32.330252 containerd[1877]: time="2025-02-13T19:47:32.330220014Z" level=info msg="StopPodSandbox for \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\"" Feb 13 19:47:32.330336 containerd[1877]: time="2025-02-13T19:47:32.330305321Z" level=info msg="TearDown network for sandbox \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\" successfully" Feb 13 19:47:32.330336 containerd[1877]: time="2025-02-13T19:47:32.330320571Z" level=info msg="StopPodSandbox for \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\" returns successfully" Feb 13 19:47:32.331353 containerd[1877]: time="2025-02-13T19:47:32.331321024Z" level=info msg="StopPodSandbox for \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\"" Feb 13 19:47:32.331441 containerd[1877]: time="2025-02-13T19:47:32.331407230Z" level=info msg="TearDown network for sandbox \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\" successfully" Feb 13 19:47:32.331441 containerd[1877]: time="2025-02-13T19:47:32.331423201Z" level=info msg="StopPodSandbox for \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\" returns successfully" Feb 13 19:47:32.332270 containerd[1877]: time="2025-02-13T19:47:32.332051648Z" level=info msg="StopPodSandbox for \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\"" Feb 13 19:47:32.332270 containerd[1877]: time="2025-02-13T19:47:32.332139066Z" level=info msg="TearDown network for sandbox \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" successfully" Feb 13 19:47:32.332270 containerd[1877]: time="2025-02-13T19:47:32.332154435Z" level=info msg="StopPodSandbox for \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" returns successfully" Feb 13 19:47:32.333273 containerd[1877]: time="2025-02-13T19:47:32.333211355Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\"" Feb 13 19:47:32.333619 containerd[1877]: time="2025-02-13T19:47:32.333305043Z" level=info msg="TearDown network for sandbox \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" successfully" Feb 13 19:47:32.333619 containerd[1877]: time="2025-02-13T19:47:32.333321319Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" returns successfully" Feb 13 19:47:32.334394 containerd[1877]: time="2025-02-13T19:47:32.334274477Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:10,}" Feb 13 19:47:32.338143 kubelet[2366]: I0213 19:47:32.337975 2366 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a" Feb 13 19:47:32.338939 containerd[1877]: time="2025-02-13T19:47:32.338617452Z" level=info msg="StopPodSandbox for \"94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a\"" Feb 13 19:47:32.339433 containerd[1877]: 
time="2025-02-13T19:47:32.339370890Z" level=info msg="Ensure that sandbox 94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a in task-service has been cleanup successfully" Feb 13 19:47:32.340826 containerd[1877]: time="2025-02-13T19:47:32.340624720Z" level=info msg="TearDown network for sandbox \"94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a\" successfully" Feb 13 19:47:32.340826 containerd[1877]: time="2025-02-13T19:47:32.340645786Z" level=info msg="StopPodSandbox for \"94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a\" returns successfully" Feb 13 19:47:32.341266 containerd[1877]: time="2025-02-13T19:47:32.341007116Z" level=info msg="StopPodSandbox for \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\"" Feb 13 19:47:32.341266 containerd[1877]: time="2025-02-13T19:47:32.341174333Z" level=info msg="TearDown network for sandbox \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\" successfully" Feb 13 19:47:32.341266 containerd[1877]: time="2025-02-13T19:47:32.341190249Z" level=info msg="StopPodSandbox for \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\" returns successfully" Feb 13 19:47:32.341891 containerd[1877]: time="2025-02-13T19:47:32.341518499Z" level=info msg="StopPodSandbox for \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\"" Feb 13 19:47:32.341891 containerd[1877]: time="2025-02-13T19:47:32.341631585Z" level=info msg="TearDown network for sandbox \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\" successfully" Feb 13 19:47:32.341891 containerd[1877]: time="2025-02-13T19:47:32.341647498Z" level=info msg="StopPodSandbox for \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\" returns successfully" Feb 13 19:47:32.342063 containerd[1877]: time="2025-02-13T19:47:32.341934792Z" level=info msg="StopPodSandbox for \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\"" Feb 13 19:47:32.342063 containerd[1877]: time="2025-02-13T19:47:32.342051157Z" level=info msg="TearDown network for sandbox \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\" successfully" Feb 13 19:47:32.342328 containerd[1877]: time="2025-02-13T19:47:32.342066088Z" level=info msg="StopPodSandbox for \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\" returns successfully" Feb 13 19:47:32.343751 containerd[1877]: time="2025-02-13T19:47:32.342825935Z" level=info msg="StopPodSandbox for \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\"" Feb 13 19:47:32.343751 containerd[1877]: time="2025-02-13T19:47:32.343010360Z" level=info msg="TearDown network for sandbox \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\" successfully" Feb 13 19:47:32.343751 containerd[1877]: time="2025-02-13T19:47:32.343026797Z" level=info msg="StopPodSandbox for \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\" returns successfully" Feb 13 19:47:32.343751 containerd[1877]: time="2025-02-13T19:47:32.343389822Z" level=info msg="StopPodSandbox for \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\"" Feb 13 19:47:32.343751 containerd[1877]: time="2025-02-13T19:47:32.343500824Z" level=info msg="TearDown network for sandbox \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\" successfully" Feb 13 19:47:32.343751 containerd[1877]: time="2025-02-13T19:47:32.343676670Z" level=info msg="StopPodSandbox for 
\"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\" returns successfully" Feb 13 19:47:32.344676 containerd[1877]: time="2025-02-13T19:47:32.344127084Z" level=info msg="StopPodSandbox for \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\"" Feb 13 19:47:32.344676 containerd[1877]: time="2025-02-13T19:47:32.344279500Z" level=info msg="TearDown network for sandbox \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\" successfully" Feb 13 19:47:32.344676 containerd[1877]: time="2025-02-13T19:47:32.344296541Z" level=info msg="StopPodSandbox for \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\" returns successfully" Feb 13 19:47:32.344676 containerd[1877]: time="2025-02-13T19:47:32.344661049Z" level=info msg="StopPodSandbox for \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\"" Feb 13 19:47:32.344924 containerd[1877]: time="2025-02-13T19:47:32.344805009Z" level=info msg="TearDown network for sandbox \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" successfully" Feb 13 19:47:32.344924 containerd[1877]: time="2025-02-13T19:47:32.344821362Z" level=info msg="StopPodSandbox for \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" returns successfully" Feb 13 19:47:32.345301 containerd[1877]: time="2025-02-13T19:47:32.345237084Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\"" Feb 13 19:47:32.345367 containerd[1877]: time="2025-02-13T19:47:32.345315634Z" level=info msg="TearDown network for sandbox \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" successfully" Feb 13 19:47:32.345367 containerd[1877]: time="2025-02-13T19:47:32.345329637Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" returns successfully" Feb 13 19:47:32.345704 containerd[1877]: time="2025-02-13T19:47:32.345678921Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\"" Feb 13 19:47:32.345949 containerd[1877]: time="2025-02-13T19:47:32.345928426Z" level=info msg="TearDown network for sandbox \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" successfully" Feb 13 19:47:32.346106 containerd[1877]: time="2025-02-13T19:47:32.345950111Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" returns successfully" Feb 13 19:47:32.346346 containerd[1877]: time="2025-02-13T19:47:32.346299335Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\"" Feb 13 19:47:32.346407 containerd[1877]: time="2025-02-13T19:47:32.346388285Z" level=info msg="TearDown network for sandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" successfully" Feb 13 19:47:32.346455 containerd[1877]: time="2025-02-13T19:47:32.346403809Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" returns successfully" Feb 13 19:47:32.351493 containerd[1877]: time="2025-02-13T19:47:32.349168819Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:11,}" Feb 13 19:47:32.667562 kubelet[2366]: E0213 19:47:32.667514 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:32.822119 kubelet[2366]: I0213 19:47:32.821987 
2366 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-r44cb" podStartSLOduration=4.378486496 podStartE2EDuration="25.821963832s" podCreationTimestamp="2025-02-13 19:47:07 +0000 UTC" firstStartedPulling="2025-02-13 19:47:10.417801424 +0000 UTC m=+4.996782682" lastFinishedPulling="2025-02-13 19:47:31.861278763 +0000 UTC m=+26.440260018" observedRunningTime="2025-02-13 19:47:32.377576566 +0000 UTC m=+26.956557840" watchObservedRunningTime="2025-02-13 19:47:32.821963832 +0000 UTC m=+27.400945107" Feb 13 19:47:32.890172 systemd[1]: run-netns-cni\x2d847595e0\x2d6b70\x2d9ecf\x2d4ba1\x2d691226f2b0ba.mount: Deactivated successfully. Feb 13 19:47:32.890434 systemd[1]: run-netns-cni\x2dad6775f8\x2da75f\x2ddee0\x2d4eb1\x2d941fad4dda8b.mount: Deactivated successfully. Feb 13 19:47:33.060682 containerd[1877]: 2025-02-13 19:47:32.823 [INFO][3908] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="bc4c8296ba99b914694dea17f73aa6d14cfa272f2d6da0d82c3f3cd7994d1285" Feb 13 19:47:33.060682 containerd[1877]: 2025-02-13 19:47:32.824 [INFO][3908] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="bc4c8296ba99b914694dea17f73aa6d14cfa272f2d6da0d82c3f3cd7994d1285" iface="eth0" netns="/var/run/netns/cni-583139aa-15b6-0004-596a-bfee0795fb95" Feb 13 19:47:33.060682 containerd[1877]: 2025-02-13 19:47:32.824 [INFO][3908] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="bc4c8296ba99b914694dea17f73aa6d14cfa272f2d6da0d82c3f3cd7994d1285" iface="eth0" netns="/var/run/netns/cni-583139aa-15b6-0004-596a-bfee0795fb95" Feb 13 19:47:33.060682 containerd[1877]: 2025-02-13 19:47:32.833 [INFO][3908] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="bc4c8296ba99b914694dea17f73aa6d14cfa272f2d6da0d82c3f3cd7994d1285" iface="eth0" netns="/var/run/netns/cni-583139aa-15b6-0004-596a-bfee0795fb95" Feb 13 19:47:33.060682 containerd[1877]: 2025-02-13 19:47:32.833 [INFO][3908] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="bc4c8296ba99b914694dea17f73aa6d14cfa272f2d6da0d82c3f3cd7994d1285" Feb 13 19:47:33.060682 containerd[1877]: 2025-02-13 19:47:32.833 [INFO][3908] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="bc4c8296ba99b914694dea17f73aa6d14cfa272f2d6da0d82c3f3cd7994d1285" Feb 13 19:47:33.060682 containerd[1877]: 2025-02-13 19:47:33.031 [INFO][3926] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="bc4c8296ba99b914694dea17f73aa6d14cfa272f2d6da0d82c3f3cd7994d1285" HandleID="k8s-pod-network.bc4c8296ba99b914694dea17f73aa6d14cfa272f2d6da0d82c3f3cd7994d1285" Workload="172.31.23.250-k8s-csi--node--driver--9th5q-eth0" Feb 13 19:47:33.060682 containerd[1877]: 2025-02-13 19:47:33.032 [INFO][3926] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 19:47:33.060682 containerd[1877]: 2025-02-13 19:47:33.032 [INFO][3926] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 19:47:33.060682 containerd[1877]: 2025-02-13 19:47:33.054 [WARNING][3926] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="bc4c8296ba99b914694dea17f73aa6d14cfa272f2d6da0d82c3f3cd7994d1285" HandleID="k8s-pod-network.bc4c8296ba99b914694dea17f73aa6d14cfa272f2d6da0d82c3f3cd7994d1285" Workload="172.31.23.250-k8s-csi--node--driver--9th5q-eth0" Feb 13 19:47:33.060682 containerd[1877]: 2025-02-13 19:47:33.054 [INFO][3926] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="bc4c8296ba99b914694dea17f73aa6d14cfa272f2d6da0d82c3f3cd7994d1285" HandleID="k8s-pod-network.bc4c8296ba99b914694dea17f73aa6d14cfa272f2d6da0d82c3f3cd7994d1285" Workload="172.31.23.250-k8s-csi--node--driver--9th5q-eth0" Feb 13 19:47:33.060682 containerd[1877]: 2025-02-13 19:47:33.057 [INFO][3926] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 19:47:33.060682 containerd[1877]: 2025-02-13 19:47:33.059 [INFO][3908] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="bc4c8296ba99b914694dea17f73aa6d14cfa272f2d6da0d82c3f3cd7994d1285" Feb 13 19:47:33.068090 systemd[1]: run-netns-cni\x2d583139aa\x2d15b6\x2d0004\x2d596a\x2dbfee0795fb95.mount: Deactivated successfully. Feb 13 19:47:33.069155 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-bc4c8296ba99b914694dea17f73aa6d14cfa272f2d6da0d82c3f3cd7994d1285-shm.mount: Deactivated successfully. Feb 13 19:47:33.069678 containerd[1877]: time="2025-02-13T19:47:33.069528644Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:11,} failed, error" error="failed to setup network for sandbox \"bc4c8296ba99b914694dea17f73aa6d14cfa272f2d6da0d82c3f3cd7994d1285\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:33.072974 kubelet[2366]: E0213 19:47:33.072919 2366 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc4c8296ba99b914694dea17f73aa6d14cfa272f2d6da0d82c3f3cd7994d1285\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:47:33.074629 kubelet[2366]: E0213 19:47:33.073519 2366 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc4c8296ba99b914694dea17f73aa6d14cfa272f2d6da0d82c3f3cd7994d1285\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:33.074629 kubelet[2366]: E0213 19:47:33.073570 2366 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc4c8296ba99b914694dea17f73aa6d14cfa272f2d6da0d82c3f3cd7994d1285\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9th5q" Feb 13 19:47:33.074629 kubelet[2366]: E0213 19:47:33.073628 2366 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"csi-node-driver-9th5q_calico-system(b75d0e53-a7c0-48db-98b8-b75d74f4d4d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bc4c8296ba99b914694dea17f73aa6d14cfa272f2d6da0d82c3f3cd7994d1285\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9th5q" podUID="b75d0e53-a7c0-48db-98b8-b75d74f4d4d5" Feb 13 19:47:33.210846 (udev-worker)[3490]: Network interface NamePolicy= disabled on kernel command line. Feb 13 19:47:33.214702 systemd-networkd[1725]: cali95e4e41534f: Link UP Feb 13 19:47:33.215839 systemd-networkd[1725]: cali95e4e41534f: Gained carrier Feb 13 19:47:33.249498 containerd[1877]: 2025-02-13 19:47:32.491 [INFO][3873] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 19:47:33.249498 containerd[1877]: 2025-02-13 19:47:32.762 [INFO][3873] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {172.31.23.250-k8s-nginx--deployment--85f456d6dd--jk2lg-eth0 nginx-deployment-85f456d6dd- default 230ad7c3-5282-4fe6-ba51-061a5c33f268 1079 0 2025-02-13 19:47:22 +0000 UTC map[app:nginx pod-template-hash:85f456d6dd projectcalico.org/namespace:default projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:default] map[] [] [] []} {k8s 172.31.23.250 nginx-deployment-85f456d6dd-jk2lg eth0 default [] [] [kns.default ksa.default.default] cali95e4e41534f [] []}} ContainerID="6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9" Namespace="default" Pod="nginx-deployment-85f456d6dd-jk2lg" WorkloadEndpoint="172.31.23.250-k8s-nginx--deployment--85f456d6dd--jk2lg-" Feb 13 19:47:33.249498 containerd[1877]: 2025-02-13 19:47:32.762 [INFO][3873] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9" Namespace="default" Pod="nginx-deployment-85f456d6dd-jk2lg" WorkloadEndpoint="172.31.23.250-k8s-nginx--deployment--85f456d6dd--jk2lg-eth0" Feb 13 19:47:33.249498 containerd[1877]: 2025-02-13 19:47:33.022 [INFO][3925] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9" HandleID="k8s-pod-network.6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9" Workload="172.31.23.250-k8s-nginx--deployment--85f456d6dd--jk2lg-eth0" Feb 13 19:47:33.249498 containerd[1877]: 2025-02-13 19:47:33.062 [INFO][3925] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9" HandleID="k8s-pod-network.6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9" Workload="172.31.23.250-k8s-nginx--deployment--85f456d6dd--jk2lg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00010b9c0), Attrs:map[string]string{"namespace":"default", "node":"172.31.23.250", "pod":"nginx-deployment-85f456d6dd-jk2lg", "timestamp":"2025-02-13 19:47:33.02276812 +0000 UTC"}, Hostname:"172.31.23.250", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 19:47:33.249498 containerd[1877]: 2025-02-13 19:47:33.063 [INFO][3925] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Feb 13 19:47:33.249498 containerd[1877]: 2025-02-13 19:47:33.067 [INFO][3925] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 19:47:33.249498 containerd[1877]: 2025-02-13 19:47:33.068 [INFO][3925] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '172.31.23.250' Feb 13 19:47:33.249498 containerd[1877]: 2025-02-13 19:47:33.077 [INFO][3925] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9" host="172.31.23.250" Feb 13 19:47:33.249498 containerd[1877]: 2025-02-13 19:47:33.117 [INFO][3925] ipam/ipam.go 372: Looking up existing affinities for host host="172.31.23.250" Feb 13 19:47:33.249498 containerd[1877]: 2025-02-13 19:47:33.133 [INFO][3925] ipam/ipam.go 489: Trying affinity for 192.168.50.0/26 host="172.31.23.250" Feb 13 19:47:33.249498 containerd[1877]: 2025-02-13 19:47:33.140 [INFO][3925] ipam/ipam.go 155: Attempting to load block cidr=192.168.50.0/26 host="172.31.23.250" Feb 13 19:47:33.249498 containerd[1877]: 2025-02-13 19:47:33.149 [INFO][3925] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.50.0/26 host="172.31.23.250" Feb 13 19:47:33.249498 containerd[1877]: 2025-02-13 19:47:33.149 [INFO][3925] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.50.0/26 handle="k8s-pod-network.6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9" host="172.31.23.250" Feb 13 19:47:33.249498 containerd[1877]: 2025-02-13 19:47:33.161 [INFO][3925] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9 Feb 13 19:47:33.249498 containerd[1877]: 2025-02-13 19:47:33.188 [INFO][3925] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.50.0/26 handle="k8s-pod-network.6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9" host="172.31.23.250" Feb 13 19:47:33.249498 containerd[1877]: 2025-02-13 19:47:33.198 [INFO][3925] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.50.1/26] block=192.168.50.0/26 handle="k8s-pod-network.6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9" host="172.31.23.250" Feb 13 19:47:33.249498 containerd[1877]: 2025-02-13 19:47:33.198 [INFO][3925] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.50.1/26] handle="k8s-pod-network.6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9" host="172.31.23.250" Feb 13 19:47:33.249498 containerd[1877]: 2025-02-13 19:47:33.198 [INFO][3925] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 19:47:33.249498 containerd[1877]: 2025-02-13 19:47:33.198 [INFO][3925] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.50.1/26] IPv6=[] ContainerID="6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9" HandleID="k8s-pod-network.6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9" Workload="172.31.23.250-k8s-nginx--deployment--85f456d6dd--jk2lg-eth0" Feb 13 19:47:33.252642 containerd[1877]: 2025-02-13 19:47:33.200 [INFO][3873] cni-plugin/k8s.go 386: Populated endpoint ContainerID="6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9" Namespace="default" Pod="nginx-deployment-85f456d6dd-jk2lg" WorkloadEndpoint="172.31.23.250-k8s-nginx--deployment--85f456d6dd--jk2lg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"172.31.23.250-k8s-nginx--deployment--85f456d6dd--jk2lg-eth0", GenerateName:"nginx-deployment-85f456d6dd-", Namespace:"default", SelfLink:"", UID:"230ad7c3-5282-4fe6-ba51-061a5c33f268", ResourceVersion:"1079", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 47, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nginx", "pod-template-hash":"85f456d6dd", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"172.31.23.250", ContainerID:"", Pod:"nginx-deployment-85f456d6dd-jk2lg", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.50.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali95e4e41534f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:47:33.252642 containerd[1877]: 2025-02-13 19:47:33.201 [INFO][3873] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.50.1/32] ContainerID="6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9" Namespace="default" Pod="nginx-deployment-85f456d6dd-jk2lg" WorkloadEndpoint="172.31.23.250-k8s-nginx--deployment--85f456d6dd--jk2lg-eth0" Feb 13 19:47:33.252642 containerd[1877]: 2025-02-13 19:47:33.201 [INFO][3873] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali95e4e41534f ContainerID="6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9" Namespace="default" Pod="nginx-deployment-85f456d6dd-jk2lg" WorkloadEndpoint="172.31.23.250-k8s-nginx--deployment--85f456d6dd--jk2lg-eth0" Feb 13 19:47:33.252642 containerd[1877]: 2025-02-13 19:47:33.216 [INFO][3873] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9" Namespace="default" Pod="nginx-deployment-85f456d6dd-jk2lg" WorkloadEndpoint="172.31.23.250-k8s-nginx--deployment--85f456d6dd--jk2lg-eth0" Feb 13 19:47:33.252642 containerd[1877]: 2025-02-13 19:47:33.216 [INFO][3873] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9" Namespace="default" Pod="nginx-deployment-85f456d6dd-jk2lg" WorkloadEndpoint="172.31.23.250-k8s-nginx--deployment--85f456d6dd--jk2lg-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"172.31.23.250-k8s-nginx--deployment--85f456d6dd--jk2lg-eth0", GenerateName:"nginx-deployment-85f456d6dd-", Namespace:"default", SelfLink:"", UID:"230ad7c3-5282-4fe6-ba51-061a5c33f268", ResourceVersion:"1079", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 47, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nginx", "pod-template-hash":"85f456d6dd", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"172.31.23.250", ContainerID:"6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9", Pod:"nginx-deployment-85f456d6dd-jk2lg", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.50.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali95e4e41534f", MAC:"0a:3f:b1:93:1a:0c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:47:33.252642 containerd[1877]: 2025-02-13 19:47:33.247 [INFO][3873] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9" Namespace="default" Pod="nginx-deployment-85f456d6dd-jk2lg" WorkloadEndpoint="172.31.23.250-k8s-nginx--deployment--85f456d6dd--jk2lg-eth0" Feb 13 19:47:33.289232 containerd[1877]: time="2025-02-13T19:47:33.288977430Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:47:33.289232 containerd[1877]: time="2025-02-13T19:47:33.289135858Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:47:33.289232 containerd[1877]: time="2025-02-13T19:47:33.289198689Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:47:33.290019 containerd[1877]: time="2025-02-13T19:47:33.289949559Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:47:33.321962 systemd[1]: Started cri-containerd-6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9.scope - libcontainer container 6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9. 
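The IPAM exchange above confirms the host's affinity for the block 192.168.50.0/26 and claims 192.168.50.1/26 for the nginx pod; later in this log the csi-node-driver pod receives 192.168.50.2/26 from the same block. A toy Python model of that block arithmetic is sketched below; it is a deliberately simplified stand-in, not Calico's actual IPAM algorithm, and the addresses in the comments are the ones visible in the log.

```python
from ipaddress import ip_network

# Toy model of the block allocation seen above: host 172.31.23.250 holds an
# affinity for 192.168.50.0/26 and hands out addresses from it in order.
# Simplified sketch only -- real Calico IPAM tracks allocations in its datastore.
block = ip_network("192.168.50.0/26")  # 64 addresses, .0 through .63
already_assigned = set()


def assign_next():
    # hosts() skips the network (.0) and broadcast (.63) addresses
    for addr in block.hosts():
        if addr not in already_assigned:
            already_assigned.add(addr)
            return addr
    raise RuntimeError(f"block {block} exhausted ({block.num_addresses} addresses)")


print(assign_next())  # 192.168.50.1 -> nginx-deployment-85f456d6dd-jk2lg
print(assign_next())  # 192.168.50.2 -> csi-node-driver-9th5q (assigned later in the log)
```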
Feb 13 19:47:33.343850 containerd[1877]: time="2025-02-13T19:47:33.343813244Z" level=info msg="StopPodSandbox for \"94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a\"" Feb 13 19:47:33.344120 containerd[1877]: time="2025-02-13T19:47:33.344097601Z" level=info msg="TearDown network for sandbox \"94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a\" successfully" Feb 13 19:47:33.344343 containerd[1877]: time="2025-02-13T19:47:33.344206775Z" level=info msg="StopPodSandbox for \"94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a\" returns successfully" Feb 13 19:47:33.346242 containerd[1877]: time="2025-02-13T19:47:33.346218453Z" level=info msg="StopPodSandbox for \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\"" Feb 13 19:47:33.346674 containerd[1877]: time="2025-02-13T19:47:33.346551986Z" level=info msg="TearDown network for sandbox \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\" successfully" Feb 13 19:47:33.346674 containerd[1877]: time="2025-02-13T19:47:33.346574101Z" level=info msg="StopPodSandbox for \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\" returns successfully" Feb 13 19:47:33.349767 containerd[1877]: time="2025-02-13T19:47:33.349390999Z" level=info msg="StopPodSandbox for \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\"" Feb 13 19:47:33.349767 containerd[1877]: time="2025-02-13T19:47:33.349511399Z" level=info msg="TearDown network for sandbox \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\" successfully" Feb 13 19:47:33.349767 containerd[1877]: time="2025-02-13T19:47:33.349529712Z" level=info msg="StopPodSandbox for \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\" returns successfully" Feb 13 19:47:33.350655 containerd[1877]: time="2025-02-13T19:47:33.350632858Z" level=info msg="StopPodSandbox for \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\"" Feb 13 19:47:33.351027 containerd[1877]: time="2025-02-13T19:47:33.350990294Z" level=info msg="TearDown network for sandbox \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\" successfully" Feb 13 19:47:33.351147 containerd[1877]: time="2025-02-13T19:47:33.351130104Z" level=info msg="StopPodSandbox for \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\" returns successfully" Feb 13 19:47:33.352502 containerd[1877]: time="2025-02-13T19:47:33.352182264Z" level=info msg="StopPodSandbox for \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\"" Feb 13 19:47:33.353264 containerd[1877]: time="2025-02-13T19:47:33.353105674Z" level=info msg="TearDown network for sandbox \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\" successfully" Feb 13 19:47:33.353264 containerd[1877]: time="2025-02-13T19:47:33.353151671Z" level=info msg="StopPodSandbox for \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\" returns successfully" Feb 13 19:47:33.354481 containerd[1877]: time="2025-02-13T19:47:33.354362819Z" level=info msg="StopPodSandbox for \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\"" Feb 13 19:47:33.354997 containerd[1877]: time="2025-02-13T19:47:33.354882870Z" level=info msg="TearDown network for sandbox \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\" successfully" Feb 13 19:47:33.355135 containerd[1877]: time="2025-02-13T19:47:33.355100122Z" level=info msg="StopPodSandbox for \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\" 
returns successfully" Feb 13 19:47:33.356622 containerd[1877]: time="2025-02-13T19:47:33.356581521Z" level=info msg="StopPodSandbox for \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\"" Feb 13 19:47:33.357060 containerd[1877]: time="2025-02-13T19:47:33.357011061Z" level=info msg="TearDown network for sandbox \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\" successfully" Feb 13 19:47:33.357060 containerd[1877]: time="2025-02-13T19:47:33.357031874Z" level=info msg="StopPodSandbox for \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\" returns successfully" Feb 13 19:47:33.358004 containerd[1877]: time="2025-02-13T19:47:33.357548691Z" level=info msg="StopPodSandbox for \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\"" Feb 13 19:47:33.358004 containerd[1877]: time="2025-02-13T19:47:33.357708650Z" level=info msg="TearDown network for sandbox \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" successfully" Feb 13 19:47:33.358004 containerd[1877]: time="2025-02-13T19:47:33.357725693Z" level=info msg="StopPodSandbox for \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" returns successfully" Feb 13 19:47:33.358557 containerd[1877]: time="2025-02-13T19:47:33.358517562Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\"" Feb 13 19:47:33.358820 containerd[1877]: time="2025-02-13T19:47:33.358710925Z" level=info msg="TearDown network for sandbox \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" successfully" Feb 13 19:47:33.358924 containerd[1877]: time="2025-02-13T19:47:33.358905653Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" returns successfully" Feb 13 19:47:33.359465 containerd[1877]: time="2025-02-13T19:47:33.359446794Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\"" Feb 13 19:47:33.359684 containerd[1877]: time="2025-02-13T19:47:33.359640431Z" level=info msg="TearDown network for sandbox \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" successfully" Feb 13 19:47:33.359684 containerd[1877]: time="2025-02-13T19:47:33.359659567Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" returns successfully" Feb 13 19:47:33.360754 containerd[1877]: time="2025-02-13T19:47:33.360479787Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\"" Feb 13 19:47:33.360754 containerd[1877]: time="2025-02-13T19:47:33.360568413Z" level=info msg="TearDown network for sandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" successfully" Feb 13 19:47:33.360754 containerd[1877]: time="2025-02-13T19:47:33.360610857Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" returns successfully" Feb 13 19:47:33.361701 containerd[1877]: time="2025-02-13T19:47:33.361672031Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:11,}" Feb 13 19:47:33.455046 containerd[1877]: time="2025-02-13T19:47:33.454998232Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-jk2lg,Uid:230ad7c3-5282-4fe6-ba51-061a5c33f268,Namespace:default,Attempt:10,} returns sandbox id 
\"6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9\"" Feb 13 19:47:33.458669 containerd[1877]: time="2025-02-13T19:47:33.458633689Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\"" Feb 13 19:47:33.623057 systemd-networkd[1725]: cali99fc34dfb35: Link UP Feb 13 19:47:33.623595 systemd-networkd[1725]: cali99fc34dfb35: Gained carrier Feb 13 19:47:33.638688 containerd[1877]: 2025-02-13 19:47:33.466 [INFO][4009] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 19:47:33.638688 containerd[1877]: 2025-02-13 19:47:33.499 [INFO][4009] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {172.31.23.250-k8s-csi--node--driver--9th5q-eth0 csi-node-driver- calico-system b75d0e53-a7c0-48db-98b8-b75d74f4d4d5 1172 0 2025-02-13 19:47:07 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:65bf684474 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s 172.31.23.250 csi-node-driver-9th5q eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali99fc34dfb35 [] []}} ContainerID="8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1" Namespace="calico-system" Pod="csi-node-driver-9th5q" WorkloadEndpoint="172.31.23.250-k8s-csi--node--driver--9th5q-" Feb 13 19:47:33.638688 containerd[1877]: 2025-02-13 19:47:33.500 [INFO][4009] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1" Namespace="calico-system" Pod="csi-node-driver-9th5q" WorkloadEndpoint="172.31.23.250-k8s-csi--node--driver--9th5q-eth0" Feb 13 19:47:33.638688 containerd[1877]: 2025-02-13 19:47:33.550 [INFO][4042] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1" HandleID="k8s-pod-network.8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1" Workload="172.31.23.250-k8s-csi--node--driver--9th5q-eth0" Feb 13 19:47:33.638688 containerd[1877]: 2025-02-13 19:47:33.569 [INFO][4042] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1" HandleID="k8s-pod-network.8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1" Workload="172.31.23.250-k8s-csi--node--driver--9th5q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003197b0), Attrs:map[string]string{"namespace":"calico-system", "node":"172.31.23.250", "pod":"csi-node-driver-9th5q", "timestamp":"2025-02-13 19:47:33.550453456 +0000 UTC"}, Hostname:"172.31.23.250", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 19:47:33.638688 containerd[1877]: 2025-02-13 19:47:33.569 [INFO][4042] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 19:47:33.638688 containerd[1877]: 2025-02-13 19:47:33.569 [INFO][4042] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 19:47:33.638688 containerd[1877]: 2025-02-13 19:47:33.569 [INFO][4042] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '172.31.23.250' Feb 13 19:47:33.638688 containerd[1877]: 2025-02-13 19:47:33.573 [INFO][4042] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1" host="172.31.23.250" Feb 13 19:47:33.638688 containerd[1877]: 2025-02-13 19:47:33.582 [INFO][4042] ipam/ipam.go 372: Looking up existing affinities for host host="172.31.23.250" Feb 13 19:47:33.638688 containerd[1877]: 2025-02-13 19:47:33.593 [INFO][4042] ipam/ipam.go 489: Trying affinity for 192.168.50.0/26 host="172.31.23.250" Feb 13 19:47:33.638688 containerd[1877]: 2025-02-13 19:47:33.597 [INFO][4042] ipam/ipam.go 155: Attempting to load block cidr=192.168.50.0/26 host="172.31.23.250" Feb 13 19:47:33.638688 containerd[1877]: 2025-02-13 19:47:33.600 [INFO][4042] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.50.0/26 host="172.31.23.250" Feb 13 19:47:33.638688 containerd[1877]: 2025-02-13 19:47:33.600 [INFO][4042] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.50.0/26 handle="k8s-pod-network.8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1" host="172.31.23.250" Feb 13 19:47:33.638688 containerd[1877]: 2025-02-13 19:47:33.602 [INFO][4042] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1 Feb 13 19:47:33.638688 containerd[1877]: 2025-02-13 19:47:33.607 [INFO][4042] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.50.0/26 handle="k8s-pod-network.8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1" host="172.31.23.250" Feb 13 19:47:33.638688 containerd[1877]: 2025-02-13 19:47:33.618 [INFO][4042] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.50.2/26] block=192.168.50.0/26 handle="k8s-pod-network.8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1" host="172.31.23.250" Feb 13 19:47:33.638688 containerd[1877]: 2025-02-13 19:47:33.618 [INFO][4042] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.50.2/26] handle="k8s-pod-network.8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1" host="172.31.23.250" Feb 13 19:47:33.638688 containerd[1877]: 2025-02-13 19:47:33.618 [INFO][4042] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 19:47:33.638688 containerd[1877]: 2025-02-13 19:47:33.618 [INFO][4042] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.50.2/26] IPv6=[] ContainerID="8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1" HandleID="k8s-pod-network.8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1" Workload="172.31.23.250-k8s-csi--node--driver--9th5q-eth0" Feb 13 19:47:33.640076 containerd[1877]: 2025-02-13 19:47:33.619 [INFO][4009] cni-plugin/k8s.go 386: Populated endpoint ContainerID="8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1" Namespace="calico-system" Pod="csi-node-driver-9th5q" WorkloadEndpoint="172.31.23.250-k8s-csi--node--driver--9th5q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"172.31.23.250-k8s-csi--node--driver--9th5q-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b75d0e53-a7c0-48db-98b8-b75d74f4d4d5", ResourceVersion:"1172", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 47, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"172.31.23.250", ContainerID:"", Pod:"csi-node-driver-9th5q", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.50.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali99fc34dfb35", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:47:33.640076 containerd[1877]: 2025-02-13 19:47:33.620 [INFO][4009] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.50.2/32] ContainerID="8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1" Namespace="calico-system" Pod="csi-node-driver-9th5q" WorkloadEndpoint="172.31.23.250-k8s-csi--node--driver--9th5q-eth0" Feb 13 19:47:33.640076 containerd[1877]: 2025-02-13 19:47:33.620 [INFO][4009] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali99fc34dfb35 ContainerID="8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1" Namespace="calico-system" Pod="csi-node-driver-9th5q" WorkloadEndpoint="172.31.23.250-k8s-csi--node--driver--9th5q-eth0" Feb 13 19:47:33.640076 containerd[1877]: 2025-02-13 19:47:33.623 [INFO][4009] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1" Namespace="calico-system" Pod="csi-node-driver-9th5q" WorkloadEndpoint="172.31.23.250-k8s-csi--node--driver--9th5q-eth0" Feb 13 19:47:33.640076 containerd[1877]: 2025-02-13 19:47:33.624 [INFO][4009] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1" Namespace="calico-system" Pod="csi-node-driver-9th5q" 
WorkloadEndpoint="172.31.23.250-k8s-csi--node--driver--9th5q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"172.31.23.250-k8s-csi--node--driver--9th5q-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b75d0e53-a7c0-48db-98b8-b75d74f4d4d5", ResourceVersion:"1172", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 47, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"172.31.23.250", ContainerID:"8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1", Pod:"csi-node-driver-9th5q", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.50.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali99fc34dfb35", MAC:"0a:4b:61:5f:32:b2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:47:33.640076 containerd[1877]: 2025-02-13 19:47:33.637 [INFO][4009] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1" Namespace="calico-system" Pod="csi-node-driver-9th5q" WorkloadEndpoint="172.31.23.250-k8s-csi--node--driver--9th5q-eth0" Feb 13 19:47:33.666543 containerd[1877]: time="2025-02-13T19:47:33.665356329Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:47:33.666543 containerd[1877]: time="2025-02-13T19:47:33.665428068Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:47:33.666543 containerd[1877]: time="2025-02-13T19:47:33.665456419Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:47:33.666543 containerd[1877]: time="2025-02-13T19:47:33.665540865Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:47:33.671243 kubelet[2366]: E0213 19:47:33.670796 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:33.716658 systemd[1]: Started cri-containerd-8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1.scope - libcontainer container 8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1. 
Feb 13 19:47:33.764653 containerd[1877]: time="2025-02-13T19:47:33.764605699Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9th5q,Uid:b75d0e53-a7c0-48db-98b8-b75d74f4d4d5,Namespace:calico-system,Attempt:11,} returns sandbox id \"8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1\"" Feb 13 19:47:34.332415 systemd-networkd[1725]: cali95e4e41534f: Gained IPv6LL Feb 13 19:47:34.510759 kernel: bpftool[4204]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Feb 13 19:47:34.673904 kubelet[2366]: E0213 19:47:34.673831 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:34.983638 systemd-networkd[1725]: vxlan.calico: Link UP Feb 13 19:47:34.983649 systemd-networkd[1725]: vxlan.calico: Gained carrier Feb 13 19:47:34.992619 (udev-worker)[3489]: Network interface NamePolicy= disabled on kernel command line. Feb 13 19:47:35.481018 systemd-networkd[1725]: cali99fc34dfb35: Gained IPv6LL Feb 13 19:47:35.677166 kubelet[2366]: E0213 19:47:35.676830 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:36.441552 systemd-networkd[1725]: vxlan.calico: Gained IPv6LL Feb 13 19:47:36.678059 kubelet[2366]: E0213 19:47:36.677982 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:37.130279 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2160209541.mount: Deactivated successfully. Feb 13 19:47:37.678287 kubelet[2366]: E0213 19:47:37.678251 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:38.679901 kubelet[2366]: E0213 19:47:38.679847 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:38.966644 containerd[1877]: time="2025-02-13T19:47:38.966278695Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/nginx:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:38.968716 containerd[1877]: time="2025-02-13T19:47:38.968431528Z" level=info msg="stop pulling image ghcr.io/flatcar/nginx:latest: active requests=0, bytes read=73054493" Feb 13 19:47:38.971442 containerd[1877]: time="2025-02-13T19:47:38.970796785Z" level=info msg="ImageCreate event name:\"sha256:fe94eb5f0c9c8d0ca277aa8cd5940f1faf5970175bf373932babc578545deda8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:38.982392 containerd[1877]: time="2025-02-13T19:47:38.982012547Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/nginx@sha256:d9bc3da999da9f147f1277c7b18292486847e8f39f95fcf81d914d0c22815faf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:38.983224 containerd[1877]: time="2025-02-13T19:47:38.983177574Z" level=info msg="Pulled image \"ghcr.io/flatcar/nginx:latest\" with image id \"sha256:fe94eb5f0c9c8d0ca277aa8cd5940f1faf5970175bf373932babc578545deda8\", repo tag \"ghcr.io/flatcar/nginx:latest\", repo digest \"ghcr.io/flatcar/nginx@sha256:d9bc3da999da9f147f1277c7b18292486847e8f39f95fcf81d914d0c22815faf\", size \"73054371\" in 5.524316114s" Feb 13 19:47:38.983343 containerd[1877]: time="2025-02-13T19:47:38.983227918Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\" returns image reference \"sha256:fe94eb5f0c9c8d0ca277aa8cd5940f1faf5970175bf373932babc578545deda8\"" Feb 13 
19:47:38.985711 containerd[1877]: time="2025-02-13T19:47:38.985479358Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Feb 13 19:47:38.996272 containerd[1877]: time="2025-02-13T19:47:38.996228684Z" level=info msg="CreateContainer within sandbox \"6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9\" for container &ContainerMetadata{Name:nginx,Attempt:0,}" Feb 13 19:47:39.026619 ntpd[1854]: Listen normally on 6 vxlan.calico 192.168.50.0:123 Feb 13 19:47:39.029870 ntpd[1854]: 13 Feb 19:47:39 ntpd[1854]: Listen normally on 6 vxlan.calico 192.168.50.0:123 Feb 13 19:47:39.029870 ntpd[1854]: 13 Feb 19:47:39 ntpd[1854]: Listen normally on 7 cali95e4e41534f [fe80::ecee:eeff:feee:eeee%3]:123 Feb 13 19:47:39.029870 ntpd[1854]: 13 Feb 19:47:39 ntpd[1854]: Listen normally on 8 cali99fc34dfb35 [fe80::ecee:eeff:feee:eeee%4]:123 Feb 13 19:47:39.029870 ntpd[1854]: 13 Feb 19:47:39 ntpd[1854]: Listen normally on 9 vxlan.calico [fe80::64d9:90ff:fed3:2643%5]:123 Feb 13 19:47:39.026704 ntpd[1854]: Listen normally on 7 cali95e4e41534f [fe80::ecee:eeff:feee:eeee%3]:123 Feb 13 19:47:39.026780 ntpd[1854]: Listen normally on 8 cali99fc34dfb35 [fe80::ecee:eeff:feee:eeee%4]:123 Feb 13 19:47:39.026821 ntpd[1854]: Listen normally on 9 vxlan.calico [fe80::64d9:90ff:fed3:2643%5]:123 Feb 13 19:47:39.034378 containerd[1877]: time="2025-02-13T19:47:39.034334868Z" level=info msg="CreateContainer within sandbox \"6709b53659c330e63a83b8d6f23fb0cad25f489620869facc63f1bdfbde606b9\" for &ContainerMetadata{Name:nginx,Attempt:0,} returns container id \"c261f232a4d7755ed4311e811130ddf8fe54e092d37f00e4438c8de14d257326\"" Feb 13 19:47:39.042437 containerd[1877]: time="2025-02-13T19:47:39.042284250Z" level=info msg="StartContainer for \"c261f232a4d7755ed4311e811130ddf8fe54e092d37f00e4438c8de14d257326\"" Feb 13 19:47:39.107979 systemd[1]: Started cri-containerd-c261f232a4d7755ed4311e811130ddf8fe54e092d37f00e4438c8de14d257326.scope - libcontainer container c261f232a4d7755ed4311e811130ddf8fe54e092d37f00e4438c8de14d257326. 
Feb 13 19:47:39.165507 containerd[1877]: time="2025-02-13T19:47:39.164301135Z" level=info msg="StartContainer for \"c261f232a4d7755ed4311e811130ddf8fe54e092d37f00e4438c8de14d257326\" returns successfully" Feb 13 19:47:39.429748 kubelet[2366]: I0213 19:47:39.429644 2366 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="default/nginx-deployment-85f456d6dd-jk2lg" podStartSLOduration=11.902187663 podStartE2EDuration="17.429629903s" podCreationTimestamp="2025-02-13 19:47:22 +0000 UTC" firstStartedPulling="2025-02-13 19:47:33.457771712 +0000 UTC m=+28.036752973" lastFinishedPulling="2025-02-13 19:47:38.985213945 +0000 UTC m=+33.564195213" observedRunningTime="2025-02-13 19:47:39.429466656 +0000 UTC m=+34.008447926" watchObservedRunningTime="2025-02-13 19:47:39.429629903 +0000 UTC m=+34.008611176" Feb 13 19:47:39.681159 kubelet[2366]: E0213 19:47:39.680959 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:40.526258 containerd[1877]: time="2025-02-13T19:47:40.526203771Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:40.528423 containerd[1877]: time="2025-02-13T19:47:40.528345249Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7902632" Feb 13 19:47:40.533701 containerd[1877]: time="2025-02-13T19:47:40.530113072Z" level=info msg="ImageCreate event name:\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:40.536778 containerd[1877]: time="2025-02-13T19:47:40.535638844Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:40.538498 containerd[1877]: time="2025-02-13T19:47:40.537901756Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"9395716\" in 1.552382668s" Feb 13 19:47:40.538498 containerd[1877]: time="2025-02-13T19:47:40.537947916Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Feb 13 19:47:40.541967 containerd[1877]: time="2025-02-13T19:47:40.541931772Z" level=info msg="CreateContainer within sandbox \"8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Feb 13 19:47:40.566795 containerd[1877]: time="2025-02-13T19:47:40.566722450Z" level=info msg="CreateContainer within sandbox \"8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"06863a857122ab2a99e9e35b1095d392dc2ea1ef1cb9f411d1242556c39d9672\"" Feb 13 19:47:40.569086 containerd[1877]: time="2025-02-13T19:47:40.567484218Z" level=info msg="StartContainer for \"06863a857122ab2a99e9e35b1095d392dc2ea1ef1cb9f411d1242556c39d9672\"" Feb 13 19:47:40.656132 systemd[1]: Started cri-containerd-06863a857122ab2a99e9e35b1095d392dc2ea1ef1cb9f411d1242556c39d9672.scope - libcontainer 
container 06863a857122ab2a99e9e35b1095d392dc2ea1ef1cb9f411d1242556c39d9672. Feb 13 19:47:40.688252 kubelet[2366]: E0213 19:47:40.682990 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:40.728212 containerd[1877]: time="2025-02-13T19:47:40.728162696Z" level=info msg="StartContainer for \"06863a857122ab2a99e9e35b1095d392dc2ea1ef1cb9f411d1242556c39d9672\" returns successfully" Feb 13 19:47:40.732413 containerd[1877]: time="2025-02-13T19:47:40.732365915Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Feb 13 19:47:41.687412 kubelet[2366]: E0213 19:47:41.687347 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:42.427780 containerd[1877]: time="2025-02-13T19:47:42.427708111Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:42.430614 containerd[1877]: time="2025-02-13T19:47:42.430551996Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=10501081" Feb 13 19:47:42.444598 containerd[1877]: time="2025-02-13T19:47:42.444555876Z" level=info msg="ImageCreate event name:\"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:42.447682 containerd[1877]: time="2025-02-13T19:47:42.447619722Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:42.448559 containerd[1877]: time="2025-02-13T19:47:42.448520307Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11994117\" in 1.716109364s" Feb 13 19:47:42.448559 containerd[1877]: time="2025-02-13T19:47:42.448555976Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\"" Feb 13 19:47:42.450975 containerd[1877]: time="2025-02-13T19:47:42.450944412Z" level=info msg="CreateContainer within sandbox \"8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Feb 13 19:47:42.467233 containerd[1877]: time="2025-02-13T19:47:42.467191774Z" level=info msg="CreateContainer within sandbox \"8aadc211e3857f5eb8afe05a80712c2b2e56d0250aa433bdbaad5d0b7f87bde1\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"9269c8f7dae315c70977dedb51722a8743fe8b08888b50836f61ce03ea42c4ee\"" Feb 13 19:47:42.467855 containerd[1877]: time="2025-02-13T19:47:42.467795455Z" level=info msg="StartContainer for \"9269c8f7dae315c70977dedb51722a8743fe8b08888b50836f61ce03ea42c4ee\"" Feb 13 19:47:42.524972 systemd[1]: Started cri-containerd-9269c8f7dae315c70977dedb51722a8743fe8b08888b50836f61ce03ea42c4ee.scope - libcontainer container 
9269c8f7dae315c70977dedb51722a8743fe8b08888b50836f61ce03ea42c4ee. Feb 13 19:47:42.563160 containerd[1877]: time="2025-02-13T19:47:42.563104617Z" level=info msg="StartContainer for \"9269c8f7dae315c70977dedb51722a8743fe8b08888b50836f61ce03ea42c4ee\" returns successfully" Feb 13 19:47:42.688419 kubelet[2366]: E0213 19:47:42.688239 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:42.809312 kubelet[2366]: I0213 19:47:42.809270 2366 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Feb 13 19:47:42.809312 kubelet[2366]: I0213 19:47:42.809310 2366 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Feb 13 19:47:43.466107 kubelet[2366]: I0213 19:47:43.466047 2366 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-9th5q" podStartSLOduration=27.782757389 podStartE2EDuration="36.466031209s" podCreationTimestamp="2025-02-13 19:47:07 +0000 UTC" firstStartedPulling="2025-02-13 19:47:33.766250633 +0000 UTC m=+28.345231887" lastFinishedPulling="2025-02-13 19:47:42.449524451 +0000 UTC m=+37.028505707" observedRunningTime="2025-02-13 19:47:43.465479249 +0000 UTC m=+38.044460503" watchObservedRunningTime="2025-02-13 19:47:43.466031209 +0000 UTC m=+38.045012484" Feb 13 19:47:43.689489 kubelet[2366]: E0213 19:47:43.689450 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:44.690609 kubelet[2366]: E0213 19:47:44.690550 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:45.693016 kubelet[2366]: E0213 19:47:45.691463 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:46.637551 kubelet[2366]: E0213 19:47:46.637333 2366 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:46.693313 kubelet[2366]: E0213 19:47:46.693254 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:46.925478 kubelet[2366]: I0213 19:47:46.925236 2366 topology_manager.go:215] "Topology Admit Handler" podUID="73edf7c4-69e3-4fb6-875e-4fcfd4956ebc" podNamespace="default" podName="nfs-server-provisioner-0" Feb 13 19:47:46.932053 systemd[1]: Created slice kubepods-besteffort-pod73edf7c4_69e3_4fb6_875e_4fcfd4956ebc.slice - libcontainer container kubepods-besteffort-pod73edf7c4_69e3_4fb6_875e_4fcfd4956ebc.slice. 
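The pod_startup_latency_tracker record above for csi-node-driver-9th5q reports podStartSLOduration=27.782757389s and podStartE2EDuration=36.466031209s. A small Python sketch (illustrative only, not part of the captured journal) reproducing that arithmetic from the quoted timestamps; the figures are consistent with the SLO duration being the end-to-end duration minus the image-pull window (firstStartedPulling to lastFinishedPulling).

    from datetime import datetime

    def ts(s):
        # kubelet prints "YYYY-MM-DD HH:MM:SS[.nnnnnnnnn] +0000 UTC"; trim to microseconds for strptime
        base = s.split(" +")[0]
        if "." in base:
            head, frac = base.split(".")
            return datetime.strptime(f"{head}.{frac[:6]}", "%Y-%m-%d %H:%M:%S.%f")
        return datetime.strptime(base, "%Y-%m-%d %H:%M:%S")

    created  = ts("2025-02-13 19:47:07 +0000 UTC")             # podCreationTimestamp
    pull_beg = ts("2025-02-13 19:47:33.766250633 +0000 UTC")   # firstStartedPulling
    pull_end = ts("2025-02-13 19:47:42.449524451 +0000 UTC")   # lastFinishedPulling
    running  = ts("2025-02-13 19:47:43.466031209 +0000 UTC")   # observedRunningTime

    e2e = (running - created).total_seconds()                  # ~36.466031 s, matches podStartE2EDuration
    slo = e2e - (pull_end - pull_beg).total_seconds()          # ~27.782757 s, matches podStartSLOduration
    print(round(e2e, 6), round(slo, 6))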
Feb 13 19:47:47.085485 kubelet[2366]: I0213 19:47:47.085438 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx9qz\" (UniqueName: \"kubernetes.io/projected/73edf7c4-69e3-4fb6-875e-4fcfd4956ebc-kube-api-access-gx9qz\") pod \"nfs-server-provisioner-0\" (UID: \"73edf7c4-69e3-4fb6-875e-4fcfd4956ebc\") " pod="default/nfs-server-provisioner-0" Feb 13 19:47:47.085485 kubelet[2366]: I0213 19:47:47.085491 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/73edf7c4-69e3-4fb6-875e-4fcfd4956ebc-data\") pod \"nfs-server-provisioner-0\" (UID: \"73edf7c4-69e3-4fb6-875e-4fcfd4956ebc\") " pod="default/nfs-server-provisioner-0" Feb 13 19:47:47.238580 containerd[1877]: time="2025-02-13T19:47:47.238438305Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nfs-server-provisioner-0,Uid:73edf7c4-69e3-4fb6-875e-4fcfd4956ebc,Namespace:default,Attempt:0,}" Feb 13 19:47:47.448901 systemd-networkd[1725]: cali60e51b789ff: Link UP Feb 13 19:47:47.453591 systemd-networkd[1725]: cali60e51b789ff: Gained carrier Feb 13 19:47:47.461690 (udev-worker)[4487]: Network interface NamePolicy= disabled on kernel command line. Feb 13 19:47:47.491134 containerd[1877]: 2025-02-13 19:47:47.314 [INFO][4469] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {172.31.23.250-k8s-nfs--server--provisioner--0-eth0 nfs-server-provisioner- default 73edf7c4-69e3-4fb6-875e-4fcfd4956ebc 1264 0 2025-02-13 19:47:46 +0000 UTC map[app:nfs-server-provisioner apps.kubernetes.io/pod-index:0 chart:nfs-server-provisioner-1.8.0 controller-revision-hash:nfs-server-provisioner-d5cbb7f57 heritage:Helm projectcalico.org/namespace:default projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:nfs-server-provisioner release:nfs-server-provisioner statefulset.kubernetes.io/pod-name:nfs-server-provisioner-0] map[] [] [] []} {k8s 172.31.23.250 nfs-server-provisioner-0 eth0 nfs-server-provisioner [] [] [kns.default ksa.default.nfs-server-provisioner] cali60e51b789ff [{nfs TCP 2049 0 } {nfs-udp UDP 2049 0 } {nlockmgr TCP 32803 0 } {nlockmgr-udp UDP 32803 0 } {mountd TCP 20048 0 } {mountd-udp UDP 20048 0 } {rquotad TCP 875 0 } {rquotad-udp UDP 875 0 } {rpcbind TCP 111 0 } {rpcbind-udp UDP 111 0 } {statd TCP 662 0 } {statd-udp UDP 662 0 }] []}} ContainerID="b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="172.31.23.250-k8s-nfs--server--provisioner--0-" Feb 13 19:47:47.491134 containerd[1877]: 2025-02-13 19:47:47.314 [INFO][4469] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="172.31.23.250-k8s-nfs--server--provisioner--0-eth0" Feb 13 19:47:47.491134 containerd[1877]: 2025-02-13 19:47:47.366 [INFO][4480] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b" HandleID="k8s-pod-network.b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b" Workload="172.31.23.250-k8s-nfs--server--provisioner--0-eth0" Feb 13 19:47:47.491134 containerd[1877]: 2025-02-13 19:47:47.383 [INFO][4480] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b" HandleID="k8s-pod-network.b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b" Workload="172.31.23.250-k8s-nfs--server--provisioner--0-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000332e70), Attrs:map[string]string{"namespace":"default", "node":"172.31.23.250", "pod":"nfs-server-provisioner-0", "timestamp":"2025-02-13 19:47:47.36650707 +0000 UTC"}, Hostname:"172.31.23.250", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 19:47:47.491134 containerd[1877]: 2025-02-13 19:47:47.384 [INFO][4480] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 19:47:47.491134 containerd[1877]: 2025-02-13 19:47:47.384 [INFO][4480] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 19:47:47.491134 containerd[1877]: 2025-02-13 19:47:47.384 [INFO][4480] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '172.31.23.250' Feb 13 19:47:47.491134 containerd[1877]: 2025-02-13 19:47:47.389 [INFO][4480] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b" host="172.31.23.250" Feb 13 19:47:47.491134 containerd[1877]: 2025-02-13 19:47:47.396 [INFO][4480] ipam/ipam.go 372: Looking up existing affinities for host host="172.31.23.250" Feb 13 19:47:47.491134 containerd[1877]: 2025-02-13 19:47:47.406 [INFO][4480] ipam/ipam.go 489: Trying affinity for 192.168.50.0/26 host="172.31.23.250" Feb 13 19:47:47.491134 containerd[1877]: 2025-02-13 19:47:47.409 [INFO][4480] ipam/ipam.go 155: Attempting to load block cidr=192.168.50.0/26 host="172.31.23.250" Feb 13 19:47:47.491134 containerd[1877]: 2025-02-13 19:47:47.415 [INFO][4480] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.50.0/26 host="172.31.23.250" Feb 13 19:47:47.491134 containerd[1877]: 2025-02-13 19:47:47.415 [INFO][4480] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.50.0/26 handle="k8s-pod-network.b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b" host="172.31.23.250" Feb 13 19:47:47.491134 containerd[1877]: 2025-02-13 19:47:47.418 [INFO][4480] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b Feb 13 19:47:47.491134 containerd[1877]: 2025-02-13 19:47:47.426 [INFO][4480] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.50.0/26 handle="k8s-pod-network.b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b" host="172.31.23.250" Feb 13 19:47:47.491134 containerd[1877]: 2025-02-13 19:47:47.440 [INFO][4480] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.50.3/26] block=192.168.50.0/26 handle="k8s-pod-network.b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b" host="172.31.23.250" Feb 13 19:47:47.491134 containerd[1877]: 2025-02-13 19:47:47.440 [INFO][4480] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.50.3/26] handle="k8s-pod-network.b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b" host="172.31.23.250" Feb 13 19:47:47.491134 containerd[1877]: 2025-02-13 19:47:47.440 [INFO][4480] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 19:47:47.491134 containerd[1877]: 2025-02-13 19:47:47.441 [INFO][4480] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.50.3/26] IPv6=[] ContainerID="b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b" HandleID="k8s-pod-network.b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b" Workload="172.31.23.250-k8s-nfs--server--provisioner--0-eth0" Feb 13 19:47:47.493623 containerd[1877]: 2025-02-13 19:47:47.443 [INFO][4469] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="172.31.23.250-k8s-nfs--server--provisioner--0-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"172.31.23.250-k8s-nfs--server--provisioner--0-eth0", GenerateName:"nfs-server-provisioner-", Namespace:"default", SelfLink:"", UID:"73edf7c4-69e3-4fb6-875e-4fcfd4956ebc", ResourceVersion:"1264", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 47, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nfs-server-provisioner", "apps.kubernetes.io/pod-index":"0", "chart":"nfs-server-provisioner-1.8.0", "controller-revision-hash":"nfs-server-provisioner-d5cbb7f57", "heritage":"Helm", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"nfs-server-provisioner", "release":"nfs-server-provisioner", "statefulset.kubernetes.io/pod-name":"nfs-server-provisioner-0"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"172.31.23.250", ContainerID:"", Pod:"nfs-server-provisioner-0", Endpoint:"eth0", ServiceAccountName:"nfs-server-provisioner", IPNetworks:[]string{"192.168.50.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.nfs-server-provisioner"}, InterfaceName:"cali60e51b789ff", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"nfs", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nfs-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x6f, HostPort:0x0, 
HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x296, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x296, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:47:47.493623 containerd[1877]: 2025-02-13 19:47:47.443 [INFO][4469] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.50.3/32] ContainerID="b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="172.31.23.250-k8s-nfs--server--provisioner--0-eth0" Feb 13 19:47:47.493623 containerd[1877]: 2025-02-13 19:47:47.443 [INFO][4469] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali60e51b789ff ContainerID="b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="172.31.23.250-k8s-nfs--server--provisioner--0-eth0" Feb 13 19:47:47.493623 containerd[1877]: 2025-02-13 19:47:47.454 [INFO][4469] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="172.31.23.250-k8s-nfs--server--provisioner--0-eth0" Feb 13 19:47:47.494057 containerd[1877]: 2025-02-13 19:47:47.456 [INFO][4469] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="172.31.23.250-k8s-nfs--server--provisioner--0-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"172.31.23.250-k8s-nfs--server--provisioner--0-eth0", GenerateName:"nfs-server-provisioner-", Namespace:"default", SelfLink:"", UID:"73edf7c4-69e3-4fb6-875e-4fcfd4956ebc", ResourceVersion:"1264", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 47, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nfs-server-provisioner", "apps.kubernetes.io/pod-index":"0", "chart":"nfs-server-provisioner-1.8.0", "controller-revision-hash":"nfs-server-provisioner-d5cbb7f57", "heritage":"Helm", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"nfs-server-provisioner", "release":"nfs-server-provisioner", "statefulset.kubernetes.io/pod-name":"nfs-server-provisioner-0"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"172.31.23.250", ContainerID:"b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b", Pod:"nfs-server-provisioner-0", Endpoint:"eth0", ServiceAccountName:"nfs-server-provisioner", IPNetworks:[]string{"192.168.50.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.nfs-server-provisioner"}, InterfaceName:"cali60e51b789ff", MAC:"6a:54:eb:5f:48:c8", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"nfs", 
Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nfs-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x296, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x296, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:47:47.494057 containerd[1877]: 2025-02-13 19:47:47.485 [INFO][4469] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="172.31.23.250-k8s-nfs--server--provisioner--0-eth0" Feb 13 19:47:47.526875 containerd[1877]: time="2025-02-13T19:47:47.526596805Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:47:47.526875 containerd[1877]: time="2025-02-13T19:47:47.526663618Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:47:47.526875 containerd[1877]: time="2025-02-13T19:47:47.526686050Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:47:47.527639 containerd[1877]: time="2025-02-13T19:47:47.527540812Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:47:47.564984 systemd[1]: Started cri-containerd-b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b.scope - libcontainer container b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b. 
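In the WorkloadEndpoint dump above the nfs-server-provisioner-0 ports are printed in hex (Port:0x801, 0x8023, ...), while the earlier cni-plugin record lists the same ports in decimal (2049, 32803, ...). A quick Python sketch (illustrative only, not part of the captured journal) mapping one back to the other:

    # Hex port values from the WorkloadEndpointPort dump above; printing them decodes back to
    # the decimal ports listed in the earlier cni-plugin/plugin.go record.
    ports_hex = {
        "nfs": 0x801, "nlockmgr": 0x8023, "mountd": 0x4e50,
        "rquotad": 0x36b, "rpcbind": 0x6f, "statd": 0x296,
    }
    for name, port in ports_hex.items():
        print(f"{name:9s} {port}")   # nfs 2049, nlockmgr 32803, mountd 20048, rquotad 875, rpcbind 111, statd 662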
Feb 13 19:47:47.638471 containerd[1877]: time="2025-02-13T19:47:47.638429056Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nfs-server-provisioner-0,Uid:73edf7c4-69e3-4fb6-875e-4fcfd4956ebc,Namespace:default,Attempt:0,} returns sandbox id \"b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b\"" Feb 13 19:47:47.641146 containerd[1877]: time="2025-02-13T19:47:47.641069856Z" level=info msg="PullImage \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\"" Feb 13 19:47:47.694389 kubelet[2366]: E0213 19:47:47.694115 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:48.698682 kubelet[2366]: E0213 19:47:48.694623 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:48.987375 systemd-networkd[1725]: cali60e51b789ff: Gained IPv6LL Feb 13 19:47:49.696452 kubelet[2366]: E0213 19:47:49.696414 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:50.697608 kubelet[2366]: E0213 19:47:50.696831 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:50.801970 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2366782361.mount: Deactivated successfully. Feb 13 19:47:51.027989 ntpd[1854]: Listen normally on 10 cali60e51b789ff [fe80::ecee:eeff:feee:eeee%8]:123 Feb 13 19:47:51.031904 ntpd[1854]: 13 Feb 19:47:51 ntpd[1854]: Listen normally on 10 cali60e51b789ff [fe80::ecee:eeff:feee:eeee%8]:123 Feb 13 19:47:51.697648 kubelet[2366]: E0213 19:47:51.697599 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:52.699362 kubelet[2366]: E0213 19:47:52.699318 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:53.705178 kubelet[2366]: E0213 19:47:53.705127 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:54.130171 containerd[1877]: time="2025-02-13T19:47:54.130114923Z" level=info msg="ImageCreate event name:\"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:54.132152 containerd[1877]: time="2025-02-13T19:47:54.131955657Z" level=info msg="stop pulling image registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8: active requests=0, bytes read=91039406" Feb 13 19:47:54.135754 containerd[1877]: time="2025-02-13T19:47:54.134145401Z" level=info msg="ImageCreate event name:\"sha256:fd0b16f70b66b72bcb2f91d556fa33eba02729c44ffc5f2c16130e7f9fbed3c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:54.138279 containerd[1877]: time="2025-02-13T19:47:54.138209979Z" level=info msg="ImageCreate event name:\"registry.k8s.io/sig-storage/nfs-provisioner@sha256:c825f3d5e28bde099bd7a3daace28772d412c9157ad47fa752a9ad0baafc118d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:47:54.139463 containerd[1877]: time="2025-02-13T19:47:54.139419576Z" level=info msg="Pulled image \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\" with image id \"sha256:fd0b16f70b66b72bcb2f91d556fa33eba02729c44ffc5f2c16130e7f9fbed3c4\", repo tag \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\", repo digest 
\"registry.k8s.io/sig-storage/nfs-provisioner@sha256:c825f3d5e28bde099bd7a3daace28772d412c9157ad47fa752a9ad0baafc118d\", size \"91036984\" in 6.498308293s" Feb 13 19:47:54.139720 containerd[1877]: time="2025-02-13T19:47:54.139609639Z" level=info msg="PullImage \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\" returns image reference \"sha256:fd0b16f70b66b72bcb2f91d556fa33eba02729c44ffc5f2c16130e7f9fbed3c4\"" Feb 13 19:47:54.143033 containerd[1877]: time="2025-02-13T19:47:54.142992615Z" level=info msg="CreateContainer within sandbox \"b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b\" for container &ContainerMetadata{Name:nfs-server-provisioner,Attempt:0,}" Feb 13 19:47:54.170847 containerd[1877]: time="2025-02-13T19:47:54.170801741Z" level=info msg="CreateContainer within sandbox \"b06d7ae69b96ce6adf0b53c4b205705b81eaeda24a56ac0c156bfff110a8588b\" for &ContainerMetadata{Name:nfs-server-provisioner,Attempt:0,} returns container id \"0834b2703d8bafa452622d807399cfaa53fa1db07008e2b0cdbe57d3945c070f\"" Feb 13 19:47:54.171721 containerd[1877]: time="2025-02-13T19:47:54.171688122Z" level=info msg="StartContainer for \"0834b2703d8bafa452622d807399cfaa53fa1db07008e2b0cdbe57d3945c070f\"" Feb 13 19:47:54.219001 systemd[1]: Started cri-containerd-0834b2703d8bafa452622d807399cfaa53fa1db07008e2b0cdbe57d3945c070f.scope - libcontainer container 0834b2703d8bafa452622d807399cfaa53fa1db07008e2b0cdbe57d3945c070f. Feb 13 19:47:54.269891 containerd[1877]: time="2025-02-13T19:47:54.269813068Z" level=info msg="StartContainer for \"0834b2703d8bafa452622d807399cfaa53fa1db07008e2b0cdbe57d3945c070f\" returns successfully" Feb 13 19:47:54.706021 kubelet[2366]: E0213 19:47:54.705974 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:55.706906 kubelet[2366]: E0213 19:47:55.706855 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:56.708044 kubelet[2366]: E0213 19:47:56.707989 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:57.709165 kubelet[2366]: E0213 19:47:57.709110 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:58.710211 kubelet[2366]: E0213 19:47:58.710152 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:47:59.710391 kubelet[2366]: E0213 19:47:59.710344 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:00.710774 kubelet[2366]: E0213 19:48:00.710712 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:01.710927 kubelet[2366]: E0213 19:48:01.710862 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:02.616677 kubelet[2366]: I0213 19:48:02.616186 2366 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="default/nfs-server-provisioner-0" podStartSLOduration=10.115710446 podStartE2EDuration="16.616164981s" podCreationTimestamp="2025-02-13 19:47:46 +0000 UTC" firstStartedPulling="2025-02-13 19:47:47.640372544 +0000 UTC m=+42.219353808" lastFinishedPulling="2025-02-13 19:47:54.140827088 +0000 UTC m=+48.719808343" 
observedRunningTime="2025-02-13 19:47:54.591918835 +0000 UTC m=+49.170900113" watchObservedRunningTime="2025-02-13 19:48:02.616164981 +0000 UTC m=+57.195146256" Feb 13 19:48:02.711153 kubelet[2366]: E0213 19:48:02.711092 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:03.712020 kubelet[2366]: E0213 19:48:03.711965 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:04.712181 kubelet[2366]: E0213 19:48:04.712127 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:05.712884 kubelet[2366]: E0213 19:48:05.712833 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:06.637351 kubelet[2366]: E0213 19:48:06.637294 2366 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:06.681502 containerd[1877]: time="2025-02-13T19:48:06.681209160Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\"" Feb 13 19:48:06.683671 containerd[1877]: time="2025-02-13T19:48:06.681589983Z" level=info msg="TearDown network for sandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" successfully" Feb 13 19:48:06.683671 containerd[1877]: time="2025-02-13T19:48:06.681612451Z" level=info msg="StopPodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" returns successfully" Feb 13 19:48:06.707190 containerd[1877]: time="2025-02-13T19:48:06.707130930Z" level=info msg="RemovePodSandbox for \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\"" Feb 13 19:48:06.717267 kubelet[2366]: E0213 19:48:06.713325 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:06.732523 containerd[1877]: time="2025-02-13T19:48:06.732463218Z" level=info msg="Forcibly stopping sandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\"" Feb 13 19:48:06.756709 containerd[1877]: time="2025-02-13T19:48:06.732616551Z" level=info msg="TearDown network for sandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" successfully" Feb 13 19:48:06.794393 containerd[1877]: time="2025-02-13T19:48:06.792608631Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:48:06.794393 containerd[1877]: time="2025-02-13T19:48:06.792703726Z" level=info msg="RemovePodSandbox \"6ce7fc52f60ac56dbdbd764d0f5f45e2faf49b6597cc5ad88bee180c5a5fa081\" returns successfully" Feb 13 19:48:06.794393 containerd[1877]: time="2025-02-13T19:48:06.794205490Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\"" Feb 13 19:48:06.829559 containerd[1877]: time="2025-02-13T19:48:06.829509710Z" level=info msg="TearDown network for sandbox \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" successfully" Feb 13 19:48:06.829776 containerd[1877]: time="2025-02-13T19:48:06.829751137Z" level=info msg="StopPodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" returns successfully" Feb 13 19:48:06.830413 containerd[1877]: time="2025-02-13T19:48:06.830389351Z" level=info msg="RemovePodSandbox for \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\"" Feb 13 19:48:06.830563 containerd[1877]: time="2025-02-13T19:48:06.830545922Z" level=info msg="Forcibly stopping sandbox \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\"" Feb 13 19:48:06.832326 containerd[1877]: time="2025-02-13T19:48:06.830763198Z" level=info msg="TearDown network for sandbox \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" successfully" Feb 13 19:48:06.845682 containerd[1877]: time="2025-02-13T19:48:06.845633801Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:48:06.845952 containerd[1877]: time="2025-02-13T19:48:06.845927649Z" level=info msg="RemovePodSandbox \"7a558864b67f4e2921c793eaa024b5be2fcf44ac06083ee0d9790aa6b00562e5\" returns successfully" Feb 13 19:48:06.846740 containerd[1877]: time="2025-02-13T19:48:06.846700563Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\"" Feb 13 19:48:06.847149 containerd[1877]: time="2025-02-13T19:48:06.847118822Z" level=info msg="TearDown network for sandbox \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" successfully" Feb 13 19:48:06.847256 containerd[1877]: time="2025-02-13T19:48:06.847239029Z" level=info msg="StopPodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" returns successfully" Feb 13 19:48:06.850942 containerd[1877]: time="2025-02-13T19:48:06.849689022Z" level=info msg="RemovePodSandbox for \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\"" Feb 13 19:48:06.850942 containerd[1877]: time="2025-02-13T19:48:06.850764306Z" level=info msg="Forcibly stopping sandbox \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\"" Feb 13 19:48:06.852121 containerd[1877]: time="2025-02-13T19:48:06.851812577Z" level=info msg="TearDown network for sandbox \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" successfully" Feb 13 19:48:06.871642 containerd[1877]: time="2025-02-13T19:48:06.869048346Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:48:06.871642 containerd[1877]: time="2025-02-13T19:48:06.870423593Z" level=info msg="RemovePodSandbox \"bcb8e494832ef0b4fb7e485dca26257d5ff509e23398a80e8269f70cd3c96f43\" returns successfully" Feb 13 19:48:06.872959 containerd[1877]: time="2025-02-13T19:48:06.872831509Z" level=info msg="StopPodSandbox for \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\"" Feb 13 19:48:06.872959 containerd[1877]: time="2025-02-13T19:48:06.872954347Z" level=info msg="TearDown network for sandbox \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" successfully" Feb 13 19:48:06.873049 containerd[1877]: time="2025-02-13T19:48:06.872968964Z" level=info msg="StopPodSandbox for \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" returns successfully" Feb 13 19:48:06.875588 containerd[1877]: time="2025-02-13T19:48:06.875543129Z" level=info msg="RemovePodSandbox for \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\"" Feb 13 19:48:06.876325 containerd[1877]: time="2025-02-13T19:48:06.875594623Z" level=info msg="Forcibly stopping sandbox \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\"" Feb 13 19:48:06.876325 containerd[1877]: time="2025-02-13T19:48:06.875700959Z" level=info msg="TearDown network for sandbox \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" successfully" Feb 13 19:48:06.888778 containerd[1877]: time="2025-02-13T19:48:06.887559628Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:48:06.888778 containerd[1877]: time="2025-02-13T19:48:06.887775655Z" level=info msg="RemovePodSandbox \"fe77a00557fc71c26c626c9ecd74eda1dbf7540e884a814146dcbdf12dc17083\" returns successfully" Feb 13 19:48:06.888778 containerd[1877]: time="2025-02-13T19:48:06.888336745Z" level=info msg="StopPodSandbox for \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\"" Feb 13 19:48:06.888778 containerd[1877]: time="2025-02-13T19:48:06.888454256Z" level=info msg="TearDown network for sandbox \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\" successfully" Feb 13 19:48:06.888778 containerd[1877]: time="2025-02-13T19:48:06.888470376Z" level=info msg="StopPodSandbox for \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\" returns successfully" Feb 13 19:48:06.889392 containerd[1877]: time="2025-02-13T19:48:06.889360515Z" level=info msg="RemovePodSandbox for \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\"" Feb 13 19:48:06.891004 containerd[1877]: time="2025-02-13T19:48:06.889395925Z" level=info msg="Forcibly stopping sandbox \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\"" Feb 13 19:48:06.891004 containerd[1877]: time="2025-02-13T19:48:06.889511986Z" level=info msg="TearDown network for sandbox \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\" successfully" Feb 13 19:48:06.901695 containerd[1877]: time="2025-02-13T19:48:06.901506967Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:48:06.901695 containerd[1877]: time="2025-02-13T19:48:06.901585038Z" level=info msg="RemovePodSandbox \"32797d359eb2148bc2816dab6739c4415f1a8b57232199d2af90f47ceb3a8f84\" returns successfully" Feb 13 19:48:06.902196 containerd[1877]: time="2025-02-13T19:48:06.902166825Z" level=info msg="StopPodSandbox for \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\"" Feb 13 19:48:06.902695 containerd[1877]: time="2025-02-13T19:48:06.902665478Z" level=info msg="TearDown network for sandbox \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\" successfully" Feb 13 19:48:06.902695 containerd[1877]: time="2025-02-13T19:48:06.902689611Z" level=info msg="StopPodSandbox for \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\" returns successfully" Feb 13 19:48:06.903297 containerd[1877]: time="2025-02-13T19:48:06.903271802Z" level=info msg="RemovePodSandbox for \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\"" Feb 13 19:48:06.903588 containerd[1877]: time="2025-02-13T19:48:06.903558519Z" level=info msg="Forcibly stopping sandbox \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\"" Feb 13 19:48:06.903746 containerd[1877]: time="2025-02-13T19:48:06.903674214Z" level=info msg="TearDown network for sandbox \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\" successfully" Feb 13 19:48:06.913072 containerd[1877]: time="2025-02-13T19:48:06.913002873Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:48:06.913465 containerd[1877]: time="2025-02-13T19:48:06.913080599Z" level=info msg="RemovePodSandbox \"bbc74383157e386b7ea9fbd1dda2c8f29cdb5ec2be5c9c49d7a68e3640017ec6\" returns successfully" Feb 13 19:48:06.914418 containerd[1877]: time="2025-02-13T19:48:06.914376751Z" level=info msg="StopPodSandbox for \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\"" Feb 13 19:48:06.914525 containerd[1877]: time="2025-02-13T19:48:06.914500739Z" level=info msg="TearDown network for sandbox \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\" successfully" Feb 13 19:48:06.914573 containerd[1877]: time="2025-02-13T19:48:06.914525732Z" level=info msg="StopPodSandbox for \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\" returns successfully" Feb 13 19:48:06.915090 containerd[1877]: time="2025-02-13T19:48:06.915060361Z" level=info msg="RemovePodSandbox for \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\"" Feb 13 19:48:06.915291 containerd[1877]: time="2025-02-13T19:48:06.915092978Z" level=info msg="Forcibly stopping sandbox \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\"" Feb 13 19:48:06.915364 containerd[1877]: time="2025-02-13T19:48:06.915306469Z" level=info msg="TearDown network for sandbox \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\" successfully" Feb 13 19:48:06.925136 containerd[1877]: time="2025-02-13T19:48:06.925085894Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:48:06.926152 containerd[1877]: time="2025-02-13T19:48:06.925156327Z" level=info msg="RemovePodSandbox \"3a557374fcb5ef873f3fc644321c443ec1e7bba5c02159b02a54a928c9cbb207\" returns successfully" Feb 13 19:48:06.926440 containerd[1877]: time="2025-02-13T19:48:06.926407679Z" level=info msg="StopPodSandbox for \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\"" Feb 13 19:48:06.926559 containerd[1877]: time="2025-02-13T19:48:06.926531959Z" level=info msg="TearDown network for sandbox \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\" successfully" Feb 13 19:48:06.926620 containerd[1877]: time="2025-02-13T19:48:06.926557972Z" level=info msg="StopPodSandbox for \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\" returns successfully" Feb 13 19:48:06.926957 containerd[1877]: time="2025-02-13T19:48:06.926932599Z" level=info msg="RemovePodSandbox for \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\"" Feb 13 19:48:06.927036 containerd[1877]: time="2025-02-13T19:48:06.926964061Z" level=info msg="Forcibly stopping sandbox \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\"" Feb 13 19:48:06.927117 containerd[1877]: time="2025-02-13T19:48:06.927048630Z" level=info msg="TearDown network for sandbox \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\" successfully" Feb 13 19:48:06.932455 containerd[1877]: time="2025-02-13T19:48:06.932409847Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:48:06.933059 containerd[1877]: time="2025-02-13T19:48:06.932719425Z" level=info msg="RemovePodSandbox \"be973e49840963cade01318bf33d1753220969b13a72d902bfd2e2178521e1f9\" returns successfully" Feb 13 19:48:06.934763 containerd[1877]: time="2025-02-13T19:48:06.934566765Z" level=info msg="StopPodSandbox for \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\"" Feb 13 19:48:06.934763 containerd[1877]: time="2025-02-13T19:48:06.934682155Z" level=info msg="TearDown network for sandbox \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\" successfully" Feb 13 19:48:06.934763 containerd[1877]: time="2025-02-13T19:48:06.934696512Z" level=info msg="StopPodSandbox for \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\" returns successfully" Feb 13 19:48:06.936000 containerd[1877]: time="2025-02-13T19:48:06.935946819Z" level=info msg="RemovePodSandbox for \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\"" Feb 13 19:48:06.936707 containerd[1877]: time="2025-02-13T19:48:06.936502828Z" level=info msg="Forcibly stopping sandbox \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\"" Feb 13 19:48:06.936707 containerd[1877]: time="2025-02-13T19:48:06.936602197Z" level=info msg="TearDown network for sandbox \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\" successfully" Feb 13 19:48:06.943013 containerd[1877]: time="2025-02-13T19:48:06.942816328Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:48:06.943013 containerd[1877]: time="2025-02-13T19:48:06.942884723Z" level=info msg="RemovePodSandbox \"7fbcc48efb9c7c76fe1bf5faa7e725fff231ea221009eb8bc9686c5a707fffcf\" returns successfully" Feb 13 19:48:06.944065 containerd[1877]: time="2025-02-13T19:48:06.943548819Z" level=info msg="StopPodSandbox for \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\"" Feb 13 19:48:06.944065 containerd[1877]: time="2025-02-13T19:48:06.943671377Z" level=info msg="TearDown network for sandbox \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\" successfully" Feb 13 19:48:06.944065 containerd[1877]: time="2025-02-13T19:48:06.943685416Z" level=info msg="StopPodSandbox for \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\" returns successfully" Feb 13 19:48:06.944501 containerd[1877]: time="2025-02-13T19:48:06.944473839Z" level=info msg="RemovePodSandbox for \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\"" Feb 13 19:48:06.944581 containerd[1877]: time="2025-02-13T19:48:06.944501729Z" level=info msg="Forcibly stopping sandbox \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\"" Feb 13 19:48:06.944638 containerd[1877]: time="2025-02-13T19:48:06.944583322Z" level=info msg="TearDown network for sandbox \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\" successfully" Feb 13 19:48:06.950126 containerd[1877]: time="2025-02-13T19:48:06.950028530Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:48:06.950126 containerd[1877]: time="2025-02-13T19:48:06.950101335Z" level=info msg="RemovePodSandbox \"4bd85ba9f16de414d8b1c090f2887346a5c2d7067f52d39b050a54634f79bc91\" returns successfully" Feb 13 19:48:06.950940 containerd[1877]: time="2025-02-13T19:48:06.950682166Z" level=info msg="StopPodSandbox for \"94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a\"" Feb 13 19:48:06.950940 containerd[1877]: time="2025-02-13T19:48:06.950807203Z" level=info msg="TearDown network for sandbox \"94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a\" successfully" Feb 13 19:48:06.950940 containerd[1877]: time="2025-02-13T19:48:06.950823653Z" level=info msg="StopPodSandbox for \"94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a\" returns successfully" Feb 13 19:48:06.951717 containerd[1877]: time="2025-02-13T19:48:06.951418373Z" level=info msg="RemovePodSandbox for \"94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a\"" Feb 13 19:48:06.951717 containerd[1877]: time="2025-02-13T19:48:06.951451986Z" level=info msg="Forcibly stopping sandbox \"94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a\"" Feb 13 19:48:06.951717 containerd[1877]: time="2025-02-13T19:48:06.951542347Z" level=info msg="TearDown network for sandbox \"94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a\" successfully" Feb 13 19:48:06.957528 containerd[1877]: time="2025-02-13T19:48:06.957483066Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:48:06.957658 containerd[1877]: time="2025-02-13T19:48:06.957549849Z" level=info msg="RemovePodSandbox \"94eaf45d0e3b999bc99555823bb71045240bd5fad45edb09e946e7af762bc91a\" returns successfully" Feb 13 19:48:06.958051 containerd[1877]: time="2025-02-13T19:48:06.958022284Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\"" Feb 13 19:48:06.958159 containerd[1877]: time="2025-02-13T19:48:06.958133248Z" level=info msg="TearDown network for sandbox \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" successfully" Feb 13 19:48:06.958159 containerd[1877]: time="2025-02-13T19:48:06.958154095Z" level=info msg="StopPodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" returns successfully" Feb 13 19:48:06.958486 containerd[1877]: time="2025-02-13T19:48:06.958455627Z" level=info msg="RemovePodSandbox for \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\"" Feb 13 19:48:06.958486 containerd[1877]: time="2025-02-13T19:48:06.958483772Z" level=info msg="Forcibly stopping sandbox \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\"" Feb 13 19:48:06.958610 containerd[1877]: time="2025-02-13T19:48:06.958562811Z" level=info msg="TearDown network for sandbox \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" successfully" Feb 13 19:48:06.972588 containerd[1877]: time="2025-02-13T19:48:06.972396830Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:48:06.972783 containerd[1877]: time="2025-02-13T19:48:06.972619728Z" level=info msg="RemovePodSandbox \"ef95b9fd1ccdd2300ed18b140e3d84327f3377c18352b80ca4e1ceee69bf7579\" returns successfully" Feb 13 19:48:06.973430 containerd[1877]: time="2025-02-13T19:48:06.973402536Z" level=info msg="StopPodSandbox for \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\"" Feb 13 19:48:06.973650 containerd[1877]: time="2025-02-13T19:48:06.973626928Z" level=info msg="TearDown network for sandbox \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" successfully" Feb 13 19:48:06.973650 containerd[1877]: time="2025-02-13T19:48:06.973645664Z" level=info msg="StopPodSandbox for \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" returns successfully" Feb 13 19:48:06.978646 containerd[1877]: time="2025-02-13T19:48:06.978460467Z" level=info msg="RemovePodSandbox for \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\"" Feb 13 19:48:06.978813 containerd[1877]: time="2025-02-13T19:48:06.978651672Z" level=info msg="Forcibly stopping sandbox \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\"" Feb 13 19:48:06.980420 containerd[1877]: time="2025-02-13T19:48:06.979933680Z" level=info msg="TearDown network for sandbox \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" successfully" Feb 13 19:48:06.991117 containerd[1877]: time="2025-02-13T19:48:06.990895695Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:48:06.991257 containerd[1877]: time="2025-02-13T19:48:06.991136270Z" level=info msg="RemovePodSandbox \"045b9feb132dd02d74fae5e6ae6a3f9fe4f4713ad4f4284ed7636b2cfca18bd6\" returns successfully" Feb 13 19:48:06.992767 containerd[1877]: time="2025-02-13T19:48:06.992715501Z" level=info msg="StopPodSandbox for \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\"" Feb 13 19:48:06.992900 containerd[1877]: time="2025-02-13T19:48:06.992860769Z" level=info msg="TearDown network for sandbox \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\" successfully" Feb 13 19:48:06.992900 containerd[1877]: time="2025-02-13T19:48:06.992876884Z" level=info msg="StopPodSandbox for \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\" returns successfully" Feb 13 19:48:06.993229 containerd[1877]: time="2025-02-13T19:48:06.993202498Z" level=info msg="RemovePodSandbox for \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\"" Feb 13 19:48:06.993324 containerd[1877]: time="2025-02-13T19:48:06.993229863Z" level=info msg="Forcibly stopping sandbox \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\"" Feb 13 19:48:06.993368 containerd[1877]: time="2025-02-13T19:48:06.993307370Z" level=info msg="TearDown network for sandbox \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\" successfully" Feb 13 19:48:07.003244 containerd[1877]: time="2025-02-13T19:48:07.003004378Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:48:07.003244 containerd[1877]: time="2025-02-13T19:48:07.003079660Z" level=info msg="RemovePodSandbox \"e3c582d55e0fd96b2f63a79496f5971e990228e222769ecca61c9e5ddc6e2e74\" returns successfully" Feb 13 19:48:07.003758 containerd[1877]: time="2025-02-13T19:48:07.003704101Z" level=info msg="StopPodSandbox for \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\"" Feb 13 19:48:07.003886 containerd[1877]: time="2025-02-13T19:48:07.003841471Z" level=info msg="TearDown network for sandbox \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\" successfully" Feb 13 19:48:07.003886 containerd[1877]: time="2025-02-13T19:48:07.003857626Z" level=info msg="StopPodSandbox for \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\" returns successfully" Feb 13 19:48:07.004214 containerd[1877]: time="2025-02-13T19:48:07.004180387Z" level=info msg="RemovePodSandbox for \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\"" Feb 13 19:48:07.004214 containerd[1877]: time="2025-02-13T19:48:07.004212342Z" level=info msg="Forcibly stopping sandbox \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\"" Feb 13 19:48:07.004355 containerd[1877]: time="2025-02-13T19:48:07.004289164Z" level=info msg="TearDown network for sandbox \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\" successfully" Feb 13 19:48:07.051375 containerd[1877]: time="2025-02-13T19:48:07.050117302Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:48:07.051375 containerd[1877]: time="2025-02-13T19:48:07.050201072Z" level=info msg="RemovePodSandbox \"a1eb32ce97c9e2f5b1a7a7cfbf0aa044c2f1ed83c62a96653af4b7231493fda2\" returns successfully" Feb 13 19:48:07.071794 containerd[1877]: time="2025-02-13T19:48:07.068870927Z" level=info msg="StopPodSandbox for \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\"" Feb 13 19:48:07.071794 containerd[1877]: time="2025-02-13T19:48:07.069016663Z" level=info msg="TearDown network for sandbox \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\" successfully" Feb 13 19:48:07.071794 containerd[1877]: time="2025-02-13T19:48:07.069031132Z" level=info msg="StopPodSandbox for \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\" returns successfully" Feb 13 19:48:07.079363 containerd[1877]: time="2025-02-13T19:48:07.079171737Z" level=info msg="RemovePodSandbox for \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\"" Feb 13 19:48:07.079363 containerd[1877]: time="2025-02-13T19:48:07.079292323Z" level=info msg="Forcibly stopping sandbox \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\"" Feb 13 19:48:07.079546 containerd[1877]: time="2025-02-13T19:48:07.079398227Z" level=info msg="TearDown network for sandbox \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\" successfully" Feb 13 19:48:07.105320 containerd[1877]: time="2025-02-13T19:48:07.105262883Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:48:07.105500 containerd[1877]: time="2025-02-13T19:48:07.105335184Z" level=info msg="RemovePodSandbox \"f2c41f21f315805fde1900241cdc11ff82ae1d863052df816fe680af9ed75ff5\" returns successfully" Feb 13 19:48:07.106028 containerd[1877]: time="2025-02-13T19:48:07.105997306Z" level=info msg="StopPodSandbox for \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\"" Feb 13 19:48:07.106136 containerd[1877]: time="2025-02-13T19:48:07.106112490Z" level=info msg="TearDown network for sandbox \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\" successfully" Feb 13 19:48:07.106195 containerd[1877]: time="2025-02-13T19:48:07.106133604Z" level=info msg="StopPodSandbox for \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\" returns successfully" Feb 13 19:48:07.106810 containerd[1877]: time="2025-02-13T19:48:07.106773013Z" level=info msg="RemovePodSandbox for \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\"" Feb 13 19:48:07.106810 containerd[1877]: time="2025-02-13T19:48:07.106807341Z" level=info msg="Forcibly stopping sandbox \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\"" Feb 13 19:48:07.106953 containerd[1877]: time="2025-02-13T19:48:07.106888737Z" level=info msg="TearDown network for sandbox \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\" successfully" Feb 13 19:48:07.127864 containerd[1877]: time="2025-02-13T19:48:07.127508490Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:48:07.127864 containerd[1877]: time="2025-02-13T19:48:07.127589805Z" level=info msg="RemovePodSandbox \"4ce8e7fcbd881f6e6745ed5c8170300db646f2d5792a5176b5d0ad7549ac4896\" returns successfully" Feb 13 19:48:07.128643 containerd[1877]: time="2025-02-13T19:48:07.128406385Z" level=info msg="StopPodSandbox for \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\"" Feb 13 19:48:07.128643 containerd[1877]: time="2025-02-13T19:48:07.128547739Z" level=info msg="TearDown network for sandbox \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\" successfully" Feb 13 19:48:07.128643 containerd[1877]: time="2025-02-13T19:48:07.128566759Z" level=info msg="StopPodSandbox for \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\" returns successfully" Feb 13 19:48:07.129216 containerd[1877]: time="2025-02-13T19:48:07.129184665Z" level=info msg="RemovePodSandbox for \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\"" Feb 13 19:48:07.129216 containerd[1877]: time="2025-02-13T19:48:07.129213889Z" level=info msg="Forcibly stopping sandbox \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\"" Feb 13 19:48:07.129524 containerd[1877]: time="2025-02-13T19:48:07.129453766Z" level=info msg="TearDown network for sandbox \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\" successfully" Feb 13 19:48:07.132937 containerd[1877]: time="2025-02-13T19:48:07.132894940Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:48:07.133095 containerd[1877]: time="2025-02-13T19:48:07.132953425Z" level=info msg="RemovePodSandbox \"d595cabba2616a4acee68cd79ba9e7fb37f9fc8a42f5c91bafdfe67bc9e5eb97\" returns successfully" Feb 13 19:48:07.133448 containerd[1877]: time="2025-02-13T19:48:07.133423197Z" level=info msg="StopPodSandbox for \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\"" Feb 13 19:48:07.133625 containerd[1877]: time="2025-02-13T19:48:07.133598488Z" level=info msg="TearDown network for sandbox \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\" successfully" Feb 13 19:48:07.133625 containerd[1877]: time="2025-02-13T19:48:07.133618001Z" level=info msg="StopPodSandbox for \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\" returns successfully" Feb 13 19:48:07.135049 containerd[1877]: time="2025-02-13T19:48:07.133969866Z" level=info msg="RemovePodSandbox for \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\"" Feb 13 19:48:07.135049 containerd[1877]: time="2025-02-13T19:48:07.133999660Z" level=info msg="Forcibly stopping sandbox \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\"" Feb 13 19:48:07.135049 containerd[1877]: time="2025-02-13T19:48:07.134062587Z" level=info msg="TearDown network for sandbox \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\" successfully" Feb 13 19:48:07.137619 containerd[1877]: time="2025-02-13T19:48:07.137577823Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:48:07.137723 containerd[1877]: time="2025-02-13T19:48:07.137631804Z" level=info msg="RemovePodSandbox \"a55130d9e1fd1c0a1e9a9b77b2ba185e1221f25b5e62d6ad1f0428c9664931e4\" returns successfully" Feb 13 19:48:07.138200 containerd[1877]: time="2025-02-13T19:48:07.138105364Z" level=info msg="StopPodSandbox for \"cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353\"" Feb 13 19:48:07.138287 containerd[1877]: time="2025-02-13T19:48:07.138248119Z" level=info msg="TearDown network for sandbox \"cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353\" successfully" Feb 13 19:48:07.138287 containerd[1877]: time="2025-02-13T19:48:07.138265638Z" level=info msg="StopPodSandbox for \"cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353\" returns successfully" Feb 13 19:48:07.139719 containerd[1877]: time="2025-02-13T19:48:07.138659256Z" level=info msg="RemovePodSandbox for \"cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353\"" Feb 13 19:48:07.139719 containerd[1877]: time="2025-02-13T19:48:07.138680249Z" level=info msg="Forcibly stopping sandbox \"cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353\"" Feb 13 19:48:07.139719 containerd[1877]: time="2025-02-13T19:48:07.138760691Z" level=info msg="TearDown network for sandbox \"cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353\" successfully" Feb 13 19:48:07.141862 containerd[1877]: time="2025-02-13T19:48:07.141828563Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:48:07.141946 containerd[1877]: time="2025-02-13T19:48:07.141887037Z" level=info msg="RemovePodSandbox \"cbc1044b410e1f845bda92a1adbca074bab9fd6c0396bd4e69c58790871d3353\" returns successfully" Feb 13 19:48:07.143805 containerd[1877]: time="2025-02-13T19:48:07.143776506Z" level=info msg="StopPodSandbox for \"c34937fbdde6b5f995539b64825166e229f7a3584c6ef404b84e210fb4727607\"" Feb 13 19:48:07.143914 containerd[1877]: time="2025-02-13T19:48:07.143893280Z" level=info msg="TearDown network for sandbox \"c34937fbdde6b5f995539b64825166e229f7a3584c6ef404b84e210fb4727607\" successfully" Feb 13 19:48:07.143976 containerd[1877]: time="2025-02-13T19:48:07.143911332Z" level=info msg="StopPodSandbox for \"c34937fbdde6b5f995539b64825166e229f7a3584c6ef404b84e210fb4727607\" returns successfully" Feb 13 19:48:07.144255 containerd[1877]: time="2025-02-13T19:48:07.144230634Z" level=info msg="RemovePodSandbox for \"c34937fbdde6b5f995539b64825166e229f7a3584c6ef404b84e210fb4727607\"" Feb 13 19:48:07.144531 containerd[1877]: time="2025-02-13T19:48:07.144257293Z" level=info msg="Forcibly stopping sandbox \"c34937fbdde6b5f995539b64825166e229f7a3584c6ef404b84e210fb4727607\"" Feb 13 19:48:07.144591 containerd[1877]: time="2025-02-13T19:48:07.144516451Z" level=info msg="TearDown network for sandbox \"c34937fbdde6b5f995539b64825166e229f7a3584c6ef404b84e210fb4727607\" successfully" Feb 13 19:48:07.147583 containerd[1877]: time="2025-02-13T19:48:07.147542302Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c34937fbdde6b5f995539b64825166e229f7a3584c6ef404b84e210fb4727607\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:48:07.147694 containerd[1877]: time="2025-02-13T19:48:07.147599861Z" level=info msg="RemovePodSandbox \"c34937fbdde6b5f995539b64825166e229f7a3584c6ef404b84e210fb4727607\" returns successfully" Feb 13 19:48:07.714670 kubelet[2366]: E0213 19:48:07.714557 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:08.715119 kubelet[2366]: E0213 19:48:08.715074 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:09.716229 kubelet[2366]: E0213 19:48:09.716180 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:10.716636 kubelet[2366]: E0213 19:48:10.716526 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:11.717172 kubelet[2366]: E0213 19:48:11.717116 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:12.718475 kubelet[2366]: E0213 19:48:12.718258 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:13.718969 kubelet[2366]: E0213 19:48:13.718907 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:14.719953 kubelet[2366]: E0213 19:48:14.719896 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:15.720539 kubelet[2366]: E0213 19:48:15.720482 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:16.721230 kubelet[2366]: E0213 19:48:16.721147 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:17.722222 kubelet[2366]: E0213 19:48:17.722157 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:18.723022 kubelet[2366]: E0213 19:48:18.722963 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:18.938114 kubelet[2366]: I0213 19:48:18.937537 2366 topology_manager.go:215] "Topology Admit Handler" podUID="381137dd-005d-4012-a065-19e1428a6f75" podNamespace="default" podName="test-pod-1" Feb 13 19:48:18.972240 systemd[1]: Created slice kubepods-besteffort-pod381137dd_005d_4012_a065_19e1428a6f75.slice - libcontainer container kubepods-besteffort-pod381137dd_005d_4012_a065_19e1428a6f75.slice. 
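The long run of StopPodSandbox / TearDown / RemovePodSandbox triplets above is consistent with the kubelet's periodic garbage collection of sandboxes left over from pods that no longer exist; the "an error occurred when try to find sandbox: not found" warning after each removal is benign, emitted only because containerd can no longer look the sandbox up when it publishes the deletion event. A rough sketch of the same stop-then-remove sequence issued directly against containerd's CRI socket (the socket path, filter, and timeout are assumptions, and this is not the kubelet's own GC code):

```go
package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// Socket path is an assumption for a stock containerd install.
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	rt := runtimeapi.NewRuntimeServiceClient(conn)

	ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
	defer cancel()

	// List sandboxes that are no longer ready, then stop and remove each one,
	// mirroring the StopPodSandbox/RemovePodSandbox pairs in the log above.
	resp, err := rt.ListPodSandbox(ctx, &runtimeapi.ListPodSandboxRequest{
		Filter: &runtimeapi.PodSandboxFilter{
			State: &runtimeapi.PodSandboxStateValue{State: runtimeapi.PodSandboxState_SANDBOX_NOTREADY},
		},
	})
	if err != nil {
		log.Fatal(err)
	}
	for _, sb := range resp.Items {
		if _, err := rt.StopPodSandbox(ctx, &runtimeapi.StopPodSandboxRequest{PodSandboxId: sb.Id}); err != nil {
			log.Printf("stop %s: %v", sb.Id, err)
			continue
		}
		if _, err := rt.RemovePodSandbox(ctx, &runtimeapi.RemovePodSandboxRequest{PodSandboxId: sb.Id}); err != nil {
			log.Printf("remove %s: %v", sb.Id, err)
			continue
		}
		fmt.Println("removed sandbox", sb.Id)
	}
}
```

The same cleanup can be done by hand with crictl (`crictl stopp` followed by `crictl rmp` on a sandbox ID).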
Feb 13 19:48:19.027144 kubelet[2366]: I0213 19:48:19.026819 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49bnt\" (UniqueName: \"kubernetes.io/projected/381137dd-005d-4012-a065-19e1428a6f75-kube-api-access-49bnt\") pod \"test-pod-1\" (UID: \"381137dd-005d-4012-a065-19e1428a6f75\") " pod="default/test-pod-1" Feb 13 19:48:19.027144 kubelet[2366]: I0213 19:48:19.026880 2366 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d147dfe7-8d4d-48bd-9223-4074573058cc\" (UniqueName: \"kubernetes.io/nfs/381137dd-005d-4012-a065-19e1428a6f75-pvc-d147dfe7-8d4d-48bd-9223-4074573058cc\") pod \"test-pod-1\" (UID: \"381137dd-005d-4012-a065-19e1428a6f75\") " pod="default/test-pod-1" Feb 13 19:48:19.223771 kernel: FS-Cache: Loaded Feb 13 19:48:19.399172 kernel: RPC: Registered named UNIX socket transport module. Feb 13 19:48:19.399303 kernel: RPC: Registered udp transport module. Feb 13 19:48:19.399331 kernel: RPC: Registered tcp transport module. Feb 13 19:48:19.399358 kernel: RPC: Registered tcp-with-tls transport module. Feb 13 19:48:19.399936 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module. Feb 13 19:48:19.725516 kubelet[2366]: E0213 19:48:19.723305 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:19.920336 kernel: NFS: Registering the id_resolver key type Feb 13 19:48:19.920470 kernel: Key type id_resolver registered Feb 13 19:48:19.920500 kernel: Key type id_legacy registered Feb 13 19:48:20.077053 nfsidmap[4702]: nss_getpwnam: name 'root@nfs-server-provisioner.default.svc.cluster.local' does not map into domain 'us-west-2.compute.internal' Feb 13 19:48:20.083095 nfsidmap[4703]: nss_name_to_gid: name 'root@nfs-server-provisioner.default.svc.cluster.local' does not map into domain 'us-west-2.compute.internal' Feb 13 19:48:20.185177 containerd[1877]: time="2025-02-13T19:48:20.185127775Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:test-pod-1,Uid:381137dd-005d-4012-a065-19e1428a6f75,Namespace:default,Attempt:0,}" Feb 13 19:48:20.440357 (udev-worker)[4699]: Network interface NamePolicy= disabled on kernel command line. 
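To mount the NFS-backed PersistentVolume for test-pod-1, the kernel first loads FS-Cache and the SunRPC/NFS transport modules and registers the id_resolver key types; nfsidmap then warns that root@nfs-server-provisioner.default.svc.cluster.local does not map into the local idmapping domain us-west-2.compute.internal. That typically just means NFSv4 owner names from that server may show up as nobody/nogroup; it does not block the mount. A small illustration of the domain comparison behind the warning (the fallback name and exact matching rules are assumptions; this mirrors the reported behaviour, not libnfsidmap's code):

```go
package main

import (
	"fmt"
	"strings"
)

// mapOwner illustrates the NFSv4 idmapping decision seen in the log: an
// owner string "user@domain" only maps to a local account when its domain
// matches the local idmapping domain; otherwise it falls back to "nobody".
func mapOwner(owner, localDomain string) string {
	name, domain, ok := strings.Cut(owner, "@")
	if !ok || !strings.EqualFold(domain, localDomain) {
		return "nobody"
	}
	return name
}

func main() {
	fmt.Println(mapOwner("root@nfs-server-provisioner.default.svc.cluster.local", "us-west-2.compute.internal")) // nobody
	fmt.Println(mapOwner("root@us-west-2.compute.internal", "us-west-2.compute.internal"))                       // root
}
```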
Feb 13 19:48:20.441827 systemd-networkd[1725]: cali5ec59c6bf6e: Link UP Feb 13 19:48:20.444449 systemd-networkd[1725]: cali5ec59c6bf6e: Gained carrier Feb 13 19:48:20.460680 containerd[1877]: 2025-02-13 19:48:20.311 [INFO][4705] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {172.31.23.250-k8s-test--pod--1-eth0 default 381137dd-005d-4012-a065-19e1428a6f75 1367 0 2025-02-13 19:47:48 +0000 UTC map[projectcalico.org/namespace:default projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:default] map[] [] [] []} {k8s 172.31.23.250 test-pod-1 eth0 default [] [] [kns.default ksa.default.default] cali5ec59c6bf6e [] []}} ContainerID="2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="172.31.23.250-k8s-test--pod--1-" Feb 13 19:48:20.460680 containerd[1877]: 2025-02-13 19:48:20.312 [INFO][4705] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="172.31.23.250-k8s-test--pod--1-eth0" Feb 13 19:48:20.460680 containerd[1877]: 2025-02-13 19:48:20.358 [INFO][4715] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c" HandleID="k8s-pod-network.2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c" Workload="172.31.23.250-k8s-test--pod--1-eth0" Feb 13 19:48:20.460680 containerd[1877]: 2025-02-13 19:48:20.372 [INFO][4715] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c" HandleID="k8s-pod-network.2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c" Workload="172.31.23.250-k8s-test--pod--1-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000293e10), Attrs:map[string]string{"namespace":"default", "node":"172.31.23.250", "pod":"test-pod-1", "timestamp":"2025-02-13 19:48:20.358496823 +0000 UTC"}, Hostname:"172.31.23.250", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 19:48:20.460680 containerd[1877]: 2025-02-13 19:48:20.372 [INFO][4715] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 19:48:20.460680 containerd[1877]: 2025-02-13 19:48:20.372 [INFO][4715] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 19:48:20.460680 containerd[1877]: 2025-02-13 19:48:20.372 [INFO][4715] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '172.31.23.250' Feb 13 19:48:20.460680 containerd[1877]: 2025-02-13 19:48:20.376 [INFO][4715] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c" host="172.31.23.250" Feb 13 19:48:20.460680 containerd[1877]: 2025-02-13 19:48:20.394 [INFO][4715] ipam/ipam.go 372: Looking up existing affinities for host host="172.31.23.250" Feb 13 19:48:20.460680 containerd[1877]: 2025-02-13 19:48:20.404 [INFO][4715] ipam/ipam.go 489: Trying affinity for 192.168.50.0/26 host="172.31.23.250" Feb 13 19:48:20.460680 containerd[1877]: 2025-02-13 19:48:20.410 [INFO][4715] ipam/ipam.go 155: Attempting to load block cidr=192.168.50.0/26 host="172.31.23.250" Feb 13 19:48:20.460680 containerd[1877]: 2025-02-13 19:48:20.414 [INFO][4715] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.50.0/26 host="172.31.23.250" Feb 13 19:48:20.460680 containerd[1877]: 2025-02-13 19:48:20.414 [INFO][4715] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.50.0/26 handle="k8s-pod-network.2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c" host="172.31.23.250" Feb 13 19:48:20.460680 containerd[1877]: 2025-02-13 19:48:20.418 [INFO][4715] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c Feb 13 19:48:20.460680 containerd[1877]: 2025-02-13 19:48:20.425 [INFO][4715] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.50.0/26 handle="k8s-pod-network.2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c" host="172.31.23.250" Feb 13 19:48:20.460680 containerd[1877]: 2025-02-13 19:48:20.436 [INFO][4715] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.50.4/26] block=192.168.50.0/26 handle="k8s-pod-network.2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c" host="172.31.23.250" Feb 13 19:48:20.460680 containerd[1877]: 2025-02-13 19:48:20.436 [INFO][4715] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.50.4/26] handle="k8s-pod-network.2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c" host="172.31.23.250" Feb 13 19:48:20.460680 containerd[1877]: 2025-02-13 19:48:20.436 [INFO][4715] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
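The ipam.go / ipam_plugin.go lines show Calico's standard allocation path for the new endpoint: take the host-wide IPAM lock, look up block affinities for node 172.31.23.250, confirm the affinity for block 192.168.50.0/26, claim one address (192.168.50.4) from it, write the block back, and release the lock. A quick standard-library check that the claimed address really sits inside that /26 (a block of 64 addresses):

```go
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	block := netip.MustParsePrefix("192.168.50.0/26") // block with affinity to this node, per the log
	addr := netip.MustParseAddr("192.168.50.4")       // address Calico claimed for test-pod-1

	fmt.Println(block.Contains(addr))     // true
	fmt.Println(1 << (32 - block.Bits())) // 64 addresses in a /26
}
```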
Feb 13 19:48:20.460680 containerd[1877]: 2025-02-13 19:48:20.436 [INFO][4715] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.50.4/26] IPv6=[] ContainerID="2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c" HandleID="k8s-pod-network.2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c" Workload="172.31.23.250-k8s-test--pod--1-eth0" Feb 13 19:48:20.460680 containerd[1877]: 2025-02-13 19:48:20.438 [INFO][4705] cni-plugin/k8s.go 386: Populated endpoint ContainerID="2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="172.31.23.250-k8s-test--pod--1-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"172.31.23.250-k8s-test--pod--1-eth0", GenerateName:"", Namespace:"default", SelfLink:"", UID:"381137dd-005d-4012-a065-19e1428a6f75", ResourceVersion:"1367", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 47, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"172.31.23.250", ContainerID:"", Pod:"test-pod-1", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.50.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali5ec59c6bf6e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:48:20.462557 containerd[1877]: 2025-02-13 19:48:20.438 [INFO][4705] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.50.4/32] ContainerID="2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="172.31.23.250-k8s-test--pod--1-eth0" Feb 13 19:48:20.462557 containerd[1877]: 2025-02-13 19:48:20.438 [INFO][4705] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5ec59c6bf6e ContainerID="2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="172.31.23.250-k8s-test--pod--1-eth0" Feb 13 19:48:20.462557 containerd[1877]: 2025-02-13 19:48:20.445 [INFO][4705] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="172.31.23.250-k8s-test--pod--1-eth0" Feb 13 19:48:20.462557 containerd[1877]: 2025-02-13 19:48:20.445 [INFO][4705] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="172.31.23.250-k8s-test--pod--1-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"172.31.23.250-k8s-test--pod--1-eth0", GenerateName:"", Namespace:"default", SelfLink:"", UID:"381137dd-005d-4012-a065-19e1428a6f75", ResourceVersion:"1367", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 47, 48, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"172.31.23.250", ContainerID:"2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c", Pod:"test-pod-1", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.50.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali5ec59c6bf6e", MAC:"b2:fd:4f:00:ff:6b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:48:20.462557 containerd[1877]: 2025-02-13 19:48:20.458 [INFO][4705] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="172.31.23.250-k8s-test--pod--1-eth0" Feb 13 19:48:20.510950 containerd[1877]: time="2025-02-13T19:48:20.510789855Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:48:20.510950 containerd[1877]: time="2025-02-13T19:48:20.510879550Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:48:20.510950 containerd[1877]: time="2025-02-13T19:48:20.510905685Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:48:20.511519 containerd[1877]: time="2025-02-13T19:48:20.511023902Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:48:20.550959 systemd[1]: Started cri-containerd-2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c.scope - libcontainer container 2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c. 
Feb 13 19:48:20.606919 containerd[1877]: time="2025-02-13T19:48:20.606858840Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:test-pod-1,Uid:381137dd-005d-4012-a065-19e1428a6f75,Namespace:default,Attempt:0,} returns sandbox id \"2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c\"" Feb 13 19:48:20.612573 containerd[1877]: time="2025-02-13T19:48:20.612537958Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\"" Feb 13 19:48:20.724215 kubelet[2366]: E0213 19:48:20.724093 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:21.011920 containerd[1877]: time="2025-02-13T19:48:21.011700510Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/nginx:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:48:21.014542 containerd[1877]: time="2025-02-13T19:48:21.013115132Z" level=info msg="stop pulling image ghcr.io/flatcar/nginx:latest: active requests=0, bytes read=61" Feb 13 19:48:21.022357 containerd[1877]: time="2025-02-13T19:48:21.022300332Z" level=info msg="Pulled image \"ghcr.io/flatcar/nginx:latest\" with image id \"sha256:fe94eb5f0c9c8d0ca277aa8cd5940f1faf5970175bf373932babc578545deda8\", repo tag \"ghcr.io/flatcar/nginx:latest\", repo digest \"ghcr.io/flatcar/nginx@sha256:d9bc3da999da9f147f1277c7b18292486847e8f39f95fcf81d914d0c22815faf\", size \"73054371\" in 409.528894ms" Feb 13 19:48:21.022357 containerd[1877]: time="2025-02-13T19:48:21.022349867Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\" returns image reference \"sha256:fe94eb5f0c9c8d0ca277aa8cd5940f1faf5970175bf373932babc578545deda8\"" Feb 13 19:48:21.028361 containerd[1877]: time="2025-02-13T19:48:21.028309884Z" level=info msg="CreateContainer within sandbox \"2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c\" for container &ContainerMetadata{Name:test,Attempt:0,}" Feb 13 19:48:21.069699 containerd[1877]: time="2025-02-13T19:48:21.069403352Z" level=info msg="CreateContainer within sandbox \"2af05702cd3f61437e2ebaff485f1760d5f6b217dd9d485ff2c3ef9b9c0bdc4c\" for &ContainerMetadata{Name:test,Attempt:0,} returns container id \"bea5e33217c3fc8fd8a52aba97414710aebd22034e4dae271a46ce8c7c1471e7\"" Feb 13 19:48:21.076641 containerd[1877]: time="2025-02-13T19:48:21.073499746Z" level=info msg="StartContainer for \"bea5e33217c3fc8fd8a52aba97414710aebd22034e4dae271a46ce8c7c1471e7\"" Feb 13 19:48:21.127007 systemd[1]: Started cri-containerd-bea5e33217c3fc8fd8a52aba97414710aebd22034e4dae271a46ce8c7c1471e7.scope - libcontainer container bea5e33217c3fc8fd8a52aba97414710aebd22034e4dae271a46ce8c7c1471e7. 
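The nginx image "pull" completes in roughly 410 ms with only 61 bytes read because the content is already in containerd's store; the registry round-trip only refreshes the manifest (hence the ImageUpdate rather than a fresh download) before the test container is created and started in the new sandbox. The same pull step expressed against the CRI image service looks roughly like this (socket path, timeout, and error handling are assumptions; the image name is the one from the log):

```go
package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	img := runtimeapi.NewImageServiceClient(conn)
	ctx, cancel := context.WithTimeout(context.Background(), 2*time.Minute)
	defer cancel()

	start := time.Now()
	resp, err := img.PullImage(ctx, &runtimeapi.PullImageRequest{
		Image: &runtimeapi.ImageSpec{Image: "ghcr.io/flatcar/nginx:latest"},
	})
	if err != nil {
		log.Fatal(err)
	}
	// For an image that is already present locally this returns quickly,
	// much like the ~410 ms pull recorded above.
	fmt.Printf("pulled %s in %s\n", resp.ImageRef, time.Since(start))
}
```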
Feb 13 19:48:21.167435 containerd[1877]: time="2025-02-13T19:48:21.167260442Z" level=info msg="StartContainer for \"bea5e33217c3fc8fd8a52aba97414710aebd22034e4dae271a46ce8c7c1471e7\" returns successfully" Feb 13 19:48:21.675716 kubelet[2366]: I0213 19:48:21.675643 2366 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="default/test-pod-1" podStartSLOduration=33.263702973 podStartE2EDuration="33.675613744s" podCreationTimestamp="2025-02-13 19:47:48 +0000 UTC" firstStartedPulling="2025-02-13 19:48:20.6121365 +0000 UTC m=+75.191117763" lastFinishedPulling="2025-02-13 19:48:21.024047269 +0000 UTC m=+75.603028534" observedRunningTime="2025-02-13 19:48:21.675434413 +0000 UTC m=+76.254415693" watchObservedRunningTime="2025-02-13 19:48:21.675613744 +0000 UTC m=+76.254595018" Feb 13 19:48:21.724589 kubelet[2366]: E0213 19:48:21.724514 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:22.008971 systemd-networkd[1725]: cali5ec59c6bf6e: Gained IPv6LL Feb 13 19:48:22.724947 kubelet[2366]: E0213 19:48:22.724892 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:23.726134 kubelet[2366]: E0213 19:48:23.726073 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:24.025273 ntpd[1854]: Listen normally on 11 cali5ec59c6bf6e [fe80::ecee:eeff:feee:eeee%9]:123 Feb 13 19:48:24.025806 ntpd[1854]: 13 Feb 19:48:24 ntpd[1854]: Listen normally on 11 cali5ec59c6bf6e [fe80::ecee:eeff:feee:eeee%9]:123 Feb 13 19:48:24.726705 kubelet[2366]: E0213 19:48:24.726655 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:25.727085 kubelet[2366]: E0213 19:48:25.726965 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:26.636636 kubelet[2366]: E0213 19:48:26.636576 2366 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:26.728346 kubelet[2366]: E0213 19:48:26.728190 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:27.729125 kubelet[2366]: E0213 19:48:27.729008 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:28.729308 kubelet[2366]: E0213 19:48:28.729252 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:29.730394 kubelet[2366]: E0213 19:48:29.730349 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:30.731307 kubelet[2366]: E0213 19:48:30.731245 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:31.731789 kubelet[2366]: E0213 19:48:31.731721 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:32.732006 kubelet[2366]: E0213 19:48:32.731940 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:33.732716 kubelet[2366]: E0213 19:48:33.732659 2366 file_linux.go:61] 
"Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:34.733070 kubelet[2366]: E0213 19:48:34.733017 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:35.733755 kubelet[2366]: E0213 19:48:35.733695 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:36.734289 kubelet[2366]: E0213 19:48:36.734115 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:37.735399 kubelet[2366]: E0213 19:48:37.735342 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:38.693060 kubelet[2366]: E0213 19:48:38.692986 2366 controller.go:195] "Failed to update lease" err="Put \"https://172.31.21.115:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/172.31.23.250?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 13 19:48:38.736467 kubelet[2366]: E0213 19:48:38.736314 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:39.737022 kubelet[2366]: E0213 19:48:39.736958 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:40.737963 kubelet[2366]: E0213 19:48:40.737907 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:41.738810 kubelet[2366]: E0213 19:48:41.738757 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:42.739381 kubelet[2366]: E0213 19:48:42.739321 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:43.740408 kubelet[2366]: E0213 19:48:43.740355 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:44.741248 kubelet[2366]: E0213 19:48:44.741194 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:45.742195 kubelet[2366]: E0213 19:48:45.742139 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:46.637157 kubelet[2366]: E0213 19:48:46.637100 2366 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:46.742460 kubelet[2366]: E0213 19:48:46.742306 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:47.743225 kubelet[2366]: E0213 19:48:47.743168 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:48.694223 kubelet[2366]: E0213 19:48:48.694170 2366 controller.go:195] "Failed to update lease" err="Put \"https://172.31.21.115:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/172.31.23.250?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 13 19:48:48.744260 kubelet[2366]: E0213 19:48:48.744213 2366 file_linux.go:61] "Unable to read config path" 
err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:49.745220 kubelet[2366]: E0213 19:48:49.745158 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:50.745877 kubelet[2366]: E0213 19:48:50.745652 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:51.747064 kubelet[2366]: E0213 19:48:51.747023 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:52.747816 kubelet[2366]: E0213 19:48:52.747759 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:53.749017 kubelet[2366]: E0213 19:48:53.748959 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:54.750114 kubelet[2366]: E0213 19:48:54.750055 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:55.752927 kubelet[2366]: E0213 19:48:55.752869 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:56.753456 kubelet[2366]: E0213 19:48:56.753397 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:57.754544 kubelet[2366]: E0213 19:48:57.754484 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:58.695480 kubelet[2366]: E0213 19:48:58.694933 2366 controller.go:195] "Failed to update lease" err="Put \"https://172.31.21.115:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/172.31.23.250?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 13 19:48:58.755439 kubelet[2366]: E0213 19:48:58.755382 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:48:59.755672 kubelet[2366]: E0213 19:48:59.755604 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:49:00.756497 kubelet[2366]: E0213 19:49:00.756442 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:49:01.757211 kubelet[2366]: E0213 19:49:01.757154 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 19:49:02.065817 systemd[1]: run-containerd-runc-k8s.io-ed1684de59595384efed381e82eceb35feed579e8897e5fa626b174451b9ec05-runc.0znNrH.mount: Deactivated successfully. 
Feb 13 19:49:02.757649 kubelet[2366]: E0213 19:49:02.757591 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:03.758608 kubelet[2366]: E0213 19:49:03.758411 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:04.759246 kubelet[2366]: E0213 19:49:04.759199 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:05.760117 kubelet[2366]: E0213 19:49:05.760058 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:06.636683 kubelet[2366]: E0213 19:49:06.636623 2366 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:06.760955 kubelet[2366]: E0213 19:49:06.760913 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:07.761477 kubelet[2366]: E0213 19:49:07.761416 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:08.696613 kubelet[2366]: E0213 19:49:08.696504 2366 controller.go:195] "Failed to update lease" err="Put \"https://172.31.21.115:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/172.31.23.250?timeout=10s\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 13 19:49:08.762015 kubelet[2366]: E0213 19:49:08.761957 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:09.762968 kubelet[2366]: E0213 19:49:09.762833 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:10.763872 kubelet[2366]: E0213 19:49:10.763830 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:11.195012 kubelet[2366]: E0213 19:49:11.194967 2366 controller.go:195] "Failed to update lease" err="Put \"https://172.31.21.115:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/172.31.23.250?timeout=10s\": unexpected EOF"
Feb 13 19:49:11.195012 kubelet[2366]: I0213 19:49:11.195011 2366 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Feb 13 19:49:11.203055 kubelet[2366]: E0213 19:49:11.196184 2366 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.31.21.115:6443/api/v1/namespaces/calico-system/events\": unexpected EOF" event=<
Feb 13 19:49:11.203055 kubelet[2366]: &Event{ObjectMeta:{calico-node-r44cb.1823dc55d1a92b6e calico-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-system,Name:calico-node-r44cb,UID:c05d6fd2-5985-4e1b-a8a3-69c183de4523,APIVersion:v1,ResourceVersion:933,FieldPath:spec.containers{calico-node},},Reason:Unhealthy,Message:Readiness probe failed: 2025-02-13 19:49:02.214 [INFO][312] node/health.go 202: Number of node(s) with BGP peering established = 0
Feb 13 19:49:11.203055 kubelet[2366]: calico/node is not ready: BIRD is not ready: BGP not established with 172.31.21.115
Feb 13 19:49:11.203055 kubelet[2366]: ,Source:EventSource{Component:kubelet,Host:172.31.23.250,},FirstTimestamp:2025-02-13 19:49:02.24299915 +0000 UTC m=+116.821980425,LastTimestamp:2025-02-13 19:49:02.24299915 +0000 UTC m=+116.821980425,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:172.31.23.250,}
Feb 13 19:49:11.203055 kubelet[2366]: >
Feb 13 19:49:11.764621 kubelet[2366]: E0213 19:49:11.764575 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:12.204534 kubelet[2366]: E0213 19:49:12.203718 2366 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.21.115:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/172.31.23.250?timeout=10s\": dial tcp 172.31.21.115:6443: connect: connection refused - error from a previous attempt: read tcp 172.31.23.250:39614->172.31.21.115:6443: read: connection reset by peer" interval="200ms"
Feb 13 19:49:12.765359 kubelet[2366]: E0213 19:49:12.765274 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:13.765657 kubelet[2366]: E0213 19:49:13.765604 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:14.766346 kubelet[2366]: E0213 19:49:14.766287 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:15.767313 kubelet[2366]: E0213 19:49:15.767254 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:16.767702 kubelet[2366]: E0213 19:49:16.767643 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:17.768289 kubelet[2366]: E0213 19:49:17.768232 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:18.769045 kubelet[2366]: E0213 19:49:18.768979 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:19.769530 kubelet[2366]: E0213 19:49:19.769473 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:20.770744 kubelet[2366]: E0213 19:49:20.770675 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:21.771913 kubelet[2366]: E0213 19:49:21.771853 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:22.405051 kubelet[2366]: E0213 19:49:22.404992 2366 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.21.115:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/172.31.23.250?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="400ms"
Feb 13 19:49:22.773177 kubelet[2366]: E0213 19:49:22.772958 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:23.774318 kubelet[2366]: E0213 19:49:23.774050 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:24.775596 kubelet[2366]: E0213 19:49:24.775425 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:25.776132 kubelet[2366]: E0213 19:49:25.776076 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:26.637341 kubelet[2366]: E0213 19:49:26.637287 2366 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:26.776595 kubelet[2366]: E0213 19:49:26.776500 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:27.777676 kubelet[2366]: E0213 19:49:27.777562 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:28.778372 kubelet[2366]: E0213 19:49:28.778306 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:29.779567 kubelet[2366]: E0213 19:49:29.779501 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:30.072915 kubelet[2366]: E0213 19:49:30.072768 2366 kubelet_node_status.go:544] "Error updating node status, will retry" err="error getting node \"172.31.23.250\": Get \"https://172.31.21.115:6443/api/v1/nodes/172.31.23.250?resourceVersion=0&timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 13 19:49:30.779750 kubelet[2366]: E0213 19:49:30.779696 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 19:49:31.780089 kubelet[2366]: E0213 19:49:31.780028 2366 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"