Feb 13 20:11:28.057962 kernel: Linux version 6.6.74-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.43 p3) 2.43.1) #1 SMP PREEMPT_DYNAMIC Thu Feb 13 17:40:15 -00 2025 Feb 13 20:11:28.058021 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=f28373bbaddf11103b551b595069cf5faacb27d62f1aab4f9911393ba418b416 Feb 13 20:11:28.058037 kernel: BIOS-provided physical RAM map: Feb 13 20:11:28.058054 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable Feb 13 20:11:28.058065 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved Feb 13 20:11:28.058076 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved Feb 13 20:11:28.058088 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable Feb 13 20:11:28.058099 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved Feb 13 20:11:28.058110 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved Feb 13 20:11:28.058120 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved Feb 13 20:11:28.058131 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Feb 13 20:11:28.058142 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved Feb 13 20:11:28.058157 kernel: NX (Execute Disable) protection: active Feb 13 20:11:28.058169 kernel: APIC: Static calls initialized Feb 13 20:11:28.058182 kernel: SMBIOS 2.8 present. Feb 13 20:11:28.058194 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.13.0-2.module_el8.5.0+2608+72063365 04/01/2014 Feb 13 20:11:28.058206 kernel: Hypervisor detected: KVM Feb 13 20:11:28.058222 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Feb 13 20:11:28.058234 kernel: kvm-clock: using sched offset of 4525683552 cycles Feb 13 20:11:28.058246 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Feb 13 20:11:28.058259 kernel: tsc: Detected 2499.998 MHz processor Feb 13 20:11:28.058271 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Feb 13 20:11:28.058283 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Feb 13 20:11:28.058294 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000 Feb 13 20:11:28.058306 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs Feb 13 20:11:28.058318 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Feb 13 20:11:28.058334 kernel: Using GB pages for direct mapping Feb 13 20:11:28.058346 kernel: ACPI: Early table checksum verification disabled Feb 13 20:11:28.058358 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS ) Feb 13 20:11:28.058370 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Feb 13 20:11:28.058382 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Feb 13 20:11:28.058394 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001) Feb 13 20:11:28.058406 kernel: ACPI: FACS 0x000000007FFDFD40 000040 Feb 13 20:11:28.058417 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001) Feb 13 20:11:28.058429 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS BXPC 
00000001 BXPC 00000001) Feb 13 20:11:28.058445 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Feb 13 20:11:28.058457 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Feb 13 20:11:28.058470 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480] Feb 13 20:11:28.058481 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c] Feb 13 20:11:28.058493 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f] Feb 13 20:11:28.058511 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570] Feb 13 20:11:28.058524 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740] Feb 13 20:11:28.058541 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c] Feb 13 20:11:28.058553 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4] Feb 13 20:11:28.058566 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0 Feb 13 20:11:28.058585 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0 Feb 13 20:11:28.058597 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0 Feb 13 20:11:28.058609 kernel: SRAT: PXM 0 -> APIC 0x03 -> Node 0 Feb 13 20:11:28.058622 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0 Feb 13 20:11:28.058634 kernel: SRAT: PXM 0 -> APIC 0x05 -> Node 0 Feb 13 20:11:28.058651 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0 Feb 13 20:11:28.058663 kernel: SRAT: PXM 0 -> APIC 0x07 -> Node 0 Feb 13 20:11:28.058675 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0 Feb 13 20:11:28.058687 kernel: SRAT: PXM 0 -> APIC 0x09 -> Node 0 Feb 13 20:11:28.058700 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0 Feb 13 20:11:28.058712 kernel: SRAT: PXM 0 -> APIC 0x0b -> Node 0 Feb 13 20:11:28.058724 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0 Feb 13 20:11:28.058736 kernel: SRAT: PXM 0 -> APIC 0x0d -> Node 0 Feb 13 20:11:28.058749 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0 Feb 13 20:11:28.058765 kernel: SRAT: PXM 0 -> APIC 0x0f -> Node 0 Feb 13 20:11:28.058777 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] Feb 13 20:11:28.058790 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] Feb 13 20:11:28.058802 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug Feb 13 20:11:28.058815 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00000000-0x7ffdbfff] Feb 13 20:11:28.058827 kernel: NODE_DATA(0) allocated [mem 0x7ffd6000-0x7ffdbfff] Feb 13 20:11:28.058839 kernel: Zone ranges: Feb 13 20:11:28.058852 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Feb 13 20:11:28.058864 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff] Feb 13 20:11:28.058876 kernel: Normal empty Feb 13 20:11:28.059940 kernel: Movable zone start for each node Feb 13 20:11:28.059957 kernel: Early memory node ranges Feb 13 20:11:28.059969 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] Feb 13 20:11:28.059982 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff] Feb 13 20:11:28.059994 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff] Feb 13 20:11:28.060007 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Feb 13 20:11:28.060019 kernel: On node 0, zone DMA: 97 pages in unavailable ranges Feb 13 20:11:28.060032 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges Feb 13 20:11:28.060044 kernel: ACPI: PM-Timer IO Port: 0x608 Feb 13 20:11:28.060065 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Feb 13 20:11:28.060078 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, 
GSI 0-23 Feb 13 20:11:28.060090 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Feb 13 20:11:28.060103 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Feb 13 20:11:28.060115 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Feb 13 20:11:28.060128 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Feb 13 20:11:28.060140 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Feb 13 20:11:28.060153 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Feb 13 20:11:28.060165 kernel: TSC deadline timer available Feb 13 20:11:28.060182 kernel: smpboot: Allowing 16 CPUs, 14 hotplug CPUs Feb 13 20:11:28.060195 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Feb 13 20:11:28.060207 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices Feb 13 20:11:28.060220 kernel: Booting paravirtualized kernel on KVM Feb 13 20:11:28.060232 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Feb 13 20:11:28.060249 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1 Feb 13 20:11:28.060262 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144 Feb 13 20:11:28.060274 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152 Feb 13 20:11:28.060286 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15 Feb 13 20:11:28.060303 kernel: kvm-guest: PV spinlocks enabled Feb 13 20:11:28.060316 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Feb 13 20:11:28.060330 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=f28373bbaddf11103b551b595069cf5faacb27d62f1aab4f9911393ba418b416 Feb 13 20:11:28.060343 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Feb 13 20:11:28.060356 kernel: random: crng init done Feb 13 20:11:28.060374 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Feb 13 20:11:28.060387 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Feb 13 20:11:28.060399 kernel: Fallback order for Node 0: 0 Feb 13 20:11:28.060416 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515804 Feb 13 20:11:28.060429 kernel: Policy zone: DMA32 Feb 13 20:11:28.060442 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Feb 13 20:11:28.060454 kernel: software IO TLB: area num 16. Feb 13 20:11:28.060467 kernel: Memory: 1899480K/2096616K available (14336K kernel code, 2301K rwdata, 22852K rodata, 43476K init, 1596K bss, 196876K reserved, 0K cma-reserved) Feb 13 20:11:28.060479 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1 Feb 13 20:11:28.060492 kernel: Kernel/User page tables isolation: enabled Feb 13 20:11:28.060504 kernel: ftrace: allocating 37893 entries in 149 pages Feb 13 20:11:28.060517 kernel: ftrace: allocated 149 pages with 4 groups Feb 13 20:11:28.060534 kernel: Dynamic Preempt: voluntary Feb 13 20:11:28.060546 kernel: rcu: Preemptible hierarchical RCU implementation. Feb 13 20:11:28.060564 kernel: rcu: RCU event tracing is enabled. 
Feb 13 20:11:28.060577 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16. Feb 13 20:11:28.060590 kernel: Trampoline variant of Tasks RCU enabled. Feb 13 20:11:28.060615 kernel: Rude variant of Tasks RCU enabled. Feb 13 20:11:28.060632 kernel: Tracing variant of Tasks RCU enabled. Feb 13 20:11:28.060645 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Feb 13 20:11:28.060659 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16 Feb 13 20:11:28.060672 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16 Feb 13 20:11:28.060685 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Feb 13 20:11:28.060698 kernel: Console: colour VGA+ 80x25 Feb 13 20:11:28.060715 kernel: printk: console [tty0] enabled Feb 13 20:11:28.060729 kernel: printk: console [ttyS0] enabled Feb 13 20:11:28.060741 kernel: ACPI: Core revision 20230628 Feb 13 20:11:28.060754 kernel: APIC: Switch to symmetric I/O mode setup Feb 13 20:11:28.060767 kernel: x2apic enabled Feb 13 20:11:28.060785 kernel: APIC: Switched APIC routing to: physical x2apic Feb 13 20:11:28.060798 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns Feb 13 20:11:28.060811 kernel: Calibrating delay loop (skipped) preset value.. 4999.99 BogoMIPS (lpj=2499998) Feb 13 20:11:28.060825 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Feb 13 20:11:28.060838 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 Feb 13 20:11:28.060851 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 Feb 13 20:11:28.060863 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Feb 13 20:11:28.060876 kernel: Spectre V2 : Mitigation: Retpolines Feb 13 20:11:28.060889 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch Feb 13 20:11:28.061986 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT Feb 13 20:11:28.062000 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls Feb 13 20:11:28.062013 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Feb 13 20:11:28.062026 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Feb 13 20:11:28.062039 kernel: MDS: Mitigation: Clear CPU buffers Feb 13 20:11:28.062052 kernel: MMIO Stale Data: Unknown: No mitigations Feb 13 20:11:28.062065 kernel: SRBDS: Unknown: Dependent on hypervisor status Feb 13 20:11:28.062077 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Feb 13 20:11:28.062091 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Feb 13 20:11:28.062103 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Feb 13 20:11:28.062116 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Feb 13 20:11:28.062134 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format. Feb 13 20:11:28.062148 kernel: Freeing SMP alternatives memory: 32K Feb 13 20:11:28.062161 kernel: pid_max: default: 32768 minimum: 301 Feb 13 20:11:28.062174 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Feb 13 20:11:28.062187 kernel: landlock: Up and running. Feb 13 20:11:28.062200 kernel: SELinux: Initializing. 
Feb 13 20:11:28.062213 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Feb 13 20:11:28.062226 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Feb 13 20:11:28.062239 kernel: smpboot: CPU0: Intel Xeon E3-12xx v2 (Ivy Bridge, IBRS) (family: 0x6, model: 0x3a, stepping: 0x9) Feb 13 20:11:28.062257 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Feb 13 20:11:28.062270 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Feb 13 20:11:28.062288 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Feb 13 20:11:28.062301 kernel: Performance Events: unsupported p6 CPU model 58 no PMU driver, software events only. Feb 13 20:11:28.062314 kernel: signal: max sigframe size: 1776 Feb 13 20:11:28.062328 kernel: rcu: Hierarchical SRCU implementation. Feb 13 20:11:28.062341 kernel: rcu: Max phase no-delay instances is 400. Feb 13 20:11:28.062354 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Feb 13 20:11:28.062367 kernel: smp: Bringing up secondary CPUs ... Feb 13 20:11:28.062382 kernel: smpboot: x86: Booting SMP configuration: Feb 13 20:11:28.062395 kernel: .... node #0, CPUs: #1 Feb 13 20:11:28.062413 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 Feb 13 20:11:28.062426 kernel: smp: Brought up 1 node, 2 CPUs Feb 13 20:11:28.062439 kernel: smpboot: Max logical packages: 16 Feb 13 20:11:28.062452 kernel: smpboot: Total of 2 processors activated (9999.99 BogoMIPS) Feb 13 20:11:28.062465 kernel: devtmpfs: initialized Feb 13 20:11:28.062478 kernel: x86/mm: Memory block size: 128MB Feb 13 20:11:28.062492 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Feb 13 20:11:28.062505 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear) Feb 13 20:11:28.062518 kernel: pinctrl core: initialized pinctrl subsystem Feb 13 20:11:28.062535 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Feb 13 20:11:28.062549 kernel: audit: initializing netlink subsys (disabled) Feb 13 20:11:28.062562 kernel: audit: type=2000 audit(1739477486.784:1): state=initialized audit_enabled=0 res=1 Feb 13 20:11:28.062575 kernel: thermal_sys: Registered thermal governor 'step_wise' Feb 13 20:11:28.062588 kernel: thermal_sys: Registered thermal governor 'user_space' Feb 13 20:11:28.062601 kernel: cpuidle: using governor menu Feb 13 20:11:28.062615 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Feb 13 20:11:28.062628 kernel: dca service started, version 1.12.1 Feb 13 20:11:28.062641 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000) Feb 13 20:11:28.062659 kernel: PCI: MMCONFIG at [mem 0xb0000000-0xbfffffff] reserved as E820 entry Feb 13 20:11:28.062672 kernel: PCI: Using configuration type 1 for base access Feb 13 20:11:28.062701 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Feb 13 20:11:28.062715 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Feb 13 20:11:28.062735 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Feb 13 20:11:28.062749 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Feb 13 20:11:28.062762 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Feb 13 20:11:28.062775 kernel: ACPI: Added _OSI(Module Device) Feb 13 20:11:28.062788 kernel: ACPI: Added _OSI(Processor Device) Feb 13 20:11:28.062811 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Feb 13 20:11:28.062824 kernel: ACPI: Added _OSI(Processor Aggregator Device) Feb 13 20:11:28.062838 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Feb 13 20:11:28.062851 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Feb 13 20:11:28.062873 kernel: ACPI: Interpreter enabled Feb 13 20:11:28.062886 kernel: ACPI: PM: (supports S0 S5) Feb 13 20:11:28.063947 kernel: ACPI: Using IOAPIC for interrupt routing Feb 13 20:11:28.063962 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Feb 13 20:11:28.063976 kernel: PCI: Using E820 reservations for host bridge windows Feb 13 20:11:28.063996 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Feb 13 20:11:28.064010 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Feb 13 20:11:28.064307 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Feb 13 20:11:28.064498 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Feb 13 20:11:28.064674 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Feb 13 20:11:28.064694 kernel: PCI host bridge to bus 0000:00 Feb 13 20:11:28.065935 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Feb 13 20:11:28.066127 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Feb 13 20:11:28.066293 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Feb 13 20:11:28.066454 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window] Feb 13 20:11:28.066613 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Feb 13 20:11:28.066783 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window] Feb 13 20:11:28.070093 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Feb 13 20:11:28.070344 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 Feb 13 20:11:28.070557 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000 Feb 13 20:11:28.070746 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfa000000-0xfbffffff pref] Feb 13 20:11:28.071005 kernel: pci 0000:00:01.0: reg 0x14: [mem 0xfea50000-0xfea50fff] Feb 13 20:11:28.071179 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfea40000-0xfea4ffff pref] Feb 13 20:11:28.071349 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Feb 13 20:11:28.071542 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 Feb 13 20:11:28.071719 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfea51000-0xfea51fff] Feb 13 20:11:28.071933 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 Feb 13 20:11:28.072123 kernel: pci 0000:00:02.1: reg 0x10: [mem 0xfea52000-0xfea52fff] Feb 13 20:11:28.072327 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 Feb 13 20:11:28.072511 kernel: pci 0000:00:02.2: reg 0x10: [mem 0xfea53000-0xfea53fff] Feb 13 20:11:28.072703 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 Feb 13 
20:11:28.074167 kernel: pci 0000:00:02.3: reg 0x10: [mem 0xfea54000-0xfea54fff] Feb 13 20:11:28.074374 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 Feb 13 20:11:28.074550 kernel: pci 0000:00:02.4: reg 0x10: [mem 0xfea55000-0xfea55fff] Feb 13 20:11:28.074766 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 Feb 13 20:11:28.075985 kernel: pci 0000:00:02.5: reg 0x10: [mem 0xfea56000-0xfea56fff] Feb 13 20:11:28.076193 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 Feb 13 20:11:28.076376 kernel: pci 0000:00:02.6: reg 0x10: [mem 0xfea57000-0xfea57fff] Feb 13 20:11:28.076557 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 Feb 13 20:11:28.076727 kernel: pci 0000:00:02.7: reg 0x10: [mem 0xfea58000-0xfea58fff] Feb 13 20:11:28.078235 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 Feb 13 20:11:28.078427 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc0c0-0xc0df] Feb 13 20:11:28.078603 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfea59000-0xfea59fff] Feb 13 20:11:28.078775 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfd000000-0xfd003fff 64bit pref] Feb 13 20:11:28.078987 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfea00000-0xfea3ffff pref] Feb 13 20:11:28.079170 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 Feb 13 20:11:28.079384 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f] Feb 13 20:11:28.079558 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfea5a000-0xfea5afff] Feb 13 20:11:28.079729 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfd004000-0xfd007fff 64bit pref] Feb 13 20:11:28.080077 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 Feb 13 20:11:28.080256 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Feb 13 20:11:28.080445 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 Feb 13 20:11:28.080613 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc0e0-0xc0ff] Feb 13 20:11:28.080779 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfea5b000-0xfea5bfff] Feb 13 20:11:28.080994 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 Feb 13 20:11:28.081171 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f] Feb 13 20:11:28.081364 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 Feb 13 20:11:28.081553 kernel: pci 0000:01:00.0: reg 0x10: [mem 0xfda00000-0xfda000ff 64bit] Feb 13 20:11:28.081731 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Feb 13 20:11:28.081929 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff] Feb 13 20:11:28.082105 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] Feb 13 20:11:28.082321 kernel: pci_bus 0000:02: extended config space not accessible Feb 13 20:11:28.082543 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000 Feb 13 20:11:28.082747 kernel: pci 0000:02:01.0: reg 0x10: [mem 0xfd800000-0xfd80000f] Feb 13 20:11:28.082957 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Feb 13 20:11:28.083136 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff] Feb 13 20:11:28.083332 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330 Feb 13 20:11:28.083509 kernel: pci 0000:03:00.0: reg 0x10: [mem 0xfe800000-0xfe803fff 64bit] Feb 13 20:11:28.083702 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Feb 13 20:11:28.083873 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff] Feb 13 20:11:28.084084 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref] Feb 13 20:11:28.084275 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00 Feb 13 
20:11:28.084457 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xfca00000-0xfca03fff 64bit pref] Feb 13 20:11:28.084632 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Feb 13 20:11:28.084803 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff] Feb 13 20:11:28.085003 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref] Feb 13 20:11:28.085179 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Feb 13 20:11:28.085350 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff] Feb 13 20:11:28.085530 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref] Feb 13 20:11:28.085708 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Feb 13 20:11:28.085882 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff] Feb 13 20:11:28.086104 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref] Feb 13 20:11:28.086280 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Feb 13 20:11:28.086451 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff] Feb 13 20:11:28.086622 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref] Feb 13 20:11:28.086795 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Feb 13 20:11:28.087087 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff] Feb 13 20:11:28.087369 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref] Feb 13 20:11:28.087586 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Feb 13 20:11:28.087768 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff] Feb 13 20:11:28.090038 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref] Feb 13 20:11:28.090064 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Feb 13 20:11:28.090079 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Feb 13 20:11:28.090093 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Feb 13 20:11:28.090115 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Feb 13 20:11:28.090128 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Feb 13 20:11:28.090142 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Feb 13 20:11:28.090155 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Feb 13 20:11:28.090168 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Feb 13 20:11:28.090182 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Feb 13 20:11:28.090195 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Feb 13 20:11:28.090208 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Feb 13 20:11:28.090221 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Feb 13 20:11:28.090239 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Feb 13 20:11:28.090253 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Feb 13 20:11:28.090266 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Feb 13 20:11:28.090279 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Feb 13 20:11:28.090293 kernel: iommu: Default domain type: Translated Feb 13 20:11:28.090306 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Feb 13 20:11:28.090319 kernel: PCI: Using ACPI for IRQ routing Feb 13 20:11:28.090332 kernel: PCI: pci_cache_line_size set to 64 bytes Feb 13 20:11:28.090355 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] Feb 13 20:11:28.090372 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff] Feb 13 20:11:28.090565 kernel: pci 0000:00:01.0: vgaarb: setting as boot 
VGA device Feb 13 20:11:28.090737 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Feb 13 20:11:28.090933 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Feb 13 20:11:28.090955 kernel: vgaarb: loaded Feb 13 20:11:28.090969 kernel: clocksource: Switched to clocksource kvm-clock Feb 13 20:11:28.090982 kernel: VFS: Disk quotas dquot_6.6.0 Feb 13 20:11:28.090995 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Feb 13 20:11:28.091016 kernel: pnp: PnP ACPI init Feb 13 20:11:28.091193 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved Feb 13 20:11:28.091214 kernel: pnp: PnP ACPI: found 5 devices Feb 13 20:11:28.091228 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Feb 13 20:11:28.091242 kernel: NET: Registered PF_INET protocol family Feb 13 20:11:28.091255 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Feb 13 20:11:28.091269 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Feb 13 20:11:28.091282 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Feb 13 20:11:28.091295 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Feb 13 20:11:28.091315 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Feb 13 20:11:28.091329 kernel: TCP: Hash tables configured (established 16384 bind 16384) Feb 13 20:11:28.091342 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Feb 13 20:11:28.091356 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Feb 13 20:11:28.091369 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Feb 13 20:11:28.091382 kernel: NET: Registered PF_XDP protocol family Feb 13 20:11:28.091551 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01-02] add_size 1000 Feb 13 20:11:28.091721 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Feb 13 20:11:28.093945 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Feb 13 20:11:28.094148 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Feb 13 20:11:28.094341 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Feb 13 20:11:28.094518 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Feb 13 20:11:28.094693 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Feb 13 20:11:28.094865 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Feb 13 20:11:28.097093 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff] Feb 13 20:11:28.097279 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff] Feb 13 20:11:28.097456 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff] Feb 13 20:11:28.097632 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff] Feb 13 20:11:28.097806 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff] Feb 13 20:11:28.098028 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff] Feb 13 20:11:28.098204 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff] Feb 13 20:11:28.098386 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff] Feb 13 20:11:28.098595 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Feb 13 20:11:28.098781 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff] Feb 13 
20:11:28.098992 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Feb 13 20:11:28.099161 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff] Feb 13 20:11:28.099328 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff] Feb 13 20:11:28.099495 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] Feb 13 20:11:28.099662 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Feb 13 20:11:28.099830 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff] Feb 13 20:11:28.100625 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff] Feb 13 20:11:28.100802 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref] Feb 13 20:11:28.101062 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Feb 13 20:11:28.101234 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff] Feb 13 20:11:28.101401 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff] Feb 13 20:11:28.101577 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref] Feb 13 20:11:28.101752 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Feb 13 20:11:28.101947 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff] Feb 13 20:11:28.102117 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff] Feb 13 20:11:28.102285 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref] Feb 13 20:11:28.102456 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Feb 13 20:11:28.102624 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff] Feb 13 20:11:28.102792 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff] Feb 13 20:11:28.103021 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref] Feb 13 20:11:28.103191 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Feb 13 20:11:28.103367 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff] Feb 13 20:11:28.103544 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff] Feb 13 20:11:28.103724 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref] Feb 13 20:11:28.103891 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Feb 13 20:11:28.104116 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff] Feb 13 20:11:28.104292 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff] Feb 13 20:11:28.104459 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref] Feb 13 20:11:28.104625 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Feb 13 20:11:28.104791 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff] Feb 13 20:11:28.104986 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff] Feb 13 20:11:28.105154 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref] Feb 13 20:11:28.105315 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Feb 13 20:11:28.105468 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Feb 13 20:11:28.105628 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Feb 13 20:11:28.105782 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window] Feb 13 20:11:28.105984 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window] Feb 13 20:11:28.106139 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window] Feb 13 20:11:28.106311 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff] Feb 13 20:11:28.106470 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff] Feb 13 20:11:28.106627 kernel: pci_bus 0000:01: resource 2 [mem 
0xfce00000-0xfcffffff 64bit pref] Feb 13 20:11:28.106811 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff] Feb 13 20:11:28.107018 kernel: pci_bus 0000:03: resource 0 [io 0x2000-0x2fff] Feb 13 20:11:28.107181 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff] Feb 13 20:11:28.107340 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref] Feb 13 20:11:28.107524 kernel: pci_bus 0000:04: resource 0 [io 0x3000-0x3fff] Feb 13 20:11:28.107684 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff] Feb 13 20:11:28.107845 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref] Feb 13 20:11:28.108083 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff] Feb 13 20:11:28.108243 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff] Feb 13 20:11:28.108401 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref] Feb 13 20:11:28.108568 kernel: pci_bus 0000:06: resource 0 [io 0x5000-0x5fff] Feb 13 20:11:28.108726 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff] Feb 13 20:11:28.108937 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref] Feb 13 20:11:28.109115 kernel: pci_bus 0000:07: resource 0 [io 0x6000-0x6fff] Feb 13 20:11:28.109283 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff] Feb 13 20:11:28.109440 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref] Feb 13 20:11:28.109608 kernel: pci_bus 0000:08: resource 0 [io 0x7000-0x7fff] Feb 13 20:11:28.109788 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff] Feb 13 20:11:28.109994 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref] Feb 13 20:11:28.110166 kernel: pci_bus 0000:09: resource 0 [io 0x8000-0x8fff] Feb 13 20:11:28.110327 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff] Feb 13 20:11:28.110492 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref] Feb 13 20:11:28.110514 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Feb 13 20:11:28.110528 kernel: PCI: CLS 0 bytes, default 64 Feb 13 20:11:28.110542 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Feb 13 20:11:28.110556 kernel: software IO TLB: mapped [mem 0x0000000073000000-0x0000000077000000] (64MB) Feb 13 20:11:28.110570 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Feb 13 20:11:28.110584 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns Feb 13 20:11:28.110598 kernel: Initialise system trusted keyrings Feb 13 20:11:28.110618 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Feb 13 20:11:28.110632 kernel: Key type asymmetric registered Feb 13 20:11:28.110646 kernel: Asymmetric key parser 'x509' registered Feb 13 20:11:28.110659 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Feb 13 20:11:28.110673 kernel: io scheduler mq-deadline registered Feb 13 20:11:28.110687 kernel: io scheduler kyber registered Feb 13 20:11:28.110701 kernel: io scheduler bfq registered Feb 13 20:11:28.110866 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Feb 13 20:11:28.111087 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Feb 13 20:11:28.111265 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Feb 13 20:11:28.111437 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Feb 13 20:11:28.111603 kernel: pcieport 
0000:00:02.1: AER: enabled with IRQ 25 Feb 13 20:11:28.111769 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Feb 13 20:11:28.111991 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Feb 13 20:11:28.112162 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Feb 13 20:11:28.112344 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Feb 13 20:11:28.112512 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Feb 13 20:11:28.112678 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Feb 13 20:11:28.112845 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Feb 13 20:11:28.113051 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Feb 13 20:11:28.113219 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Feb 13 20:11:28.113395 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Feb 13 20:11:28.113561 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Feb 13 20:11:28.113730 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Feb 13 20:11:28.113931 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Feb 13 20:11:28.114107 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Feb 13 20:11:28.114276 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Feb 13 20:11:28.114461 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Feb 13 20:11:28.114632 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Feb 13 20:11:28.114800 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Feb 13 20:11:28.115025 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Feb 13 20:11:28.115047 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Feb 13 20:11:28.115063 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Feb 13 20:11:28.115084 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Feb 13 20:11:28.115098 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Feb 13 20:11:28.115112 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Feb 13 20:11:28.115126 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Feb 13 20:11:28.115140 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Feb 13 20:11:28.115154 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Feb 13 20:11:28.115336 kernel: rtc_cmos 00:03: RTC can wake from S4 Feb 13 20:11:28.115358 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Feb 13 20:11:28.115517 kernel: rtc_cmos 00:03: registered as rtc0 Feb 13 20:11:28.115682 kernel: rtc_cmos 00:03: setting system clock to 2025-02-13T20:11:27 UTC (1739477487) Feb 13 20:11:28.115839 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Feb 13 20:11:28.115859 kernel: intel_pstate: CPU model not supported Feb 13 20:11:28.115873 kernel: NET: Registered PF_INET6 protocol family Feb 13 20:11:28.115887 kernel: Segment Routing with IPv6 Feb 13 20:11:28.115975 kernel: In-situ OAM (IOAM) with IPv6 Feb 13 
20:11:28.115990 kernel: NET: Registered PF_PACKET protocol family Feb 13 20:11:28.116004 kernel: Key type dns_resolver registered Feb 13 20:11:28.116024 kernel: IPI shorthand broadcast: enabled Feb 13 20:11:28.116039 kernel: sched_clock: Marking stable (1281044518, 241462016)->(1650069607, -127563073) Feb 13 20:11:28.116052 kernel: registered taskstats version 1 Feb 13 20:11:28.116066 kernel: Loading compiled-in X.509 certificates Feb 13 20:11:28.116080 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.74-flatcar: 6c364ddae48101e091a28279a8d953535f596d53' Feb 13 20:11:28.116094 kernel: Key type .fscrypt registered Feb 13 20:11:28.116107 kernel: Key type fscrypt-provisioning registered Feb 13 20:11:28.116121 kernel: ima: No TPM chip found, activating TPM-bypass! Feb 13 20:11:28.116135 kernel: ima: Allocated hash algorithm: sha1 Feb 13 20:11:28.116154 kernel: ima: No architecture policies found Feb 13 20:11:28.116167 kernel: clk: Disabling unused clocks Feb 13 20:11:28.116182 kernel: Freeing unused kernel image (initmem) memory: 43476K Feb 13 20:11:28.116196 kernel: Write protecting the kernel read-only data: 38912k Feb 13 20:11:28.116218 kernel: Freeing unused kernel image (rodata/data gap) memory: 1724K Feb 13 20:11:28.116232 kernel: Run /init as init process Feb 13 20:11:28.116245 kernel: with arguments: Feb 13 20:11:28.116259 kernel: /init Feb 13 20:11:28.116280 kernel: with environment: Feb 13 20:11:28.116298 kernel: HOME=/ Feb 13 20:11:28.116311 kernel: TERM=linux Feb 13 20:11:28.116333 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Feb 13 20:11:28.116357 systemd[1]: Successfully made /usr/ read-only. Feb 13 20:11:28.116377 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Feb 13 20:11:28.116396 systemd[1]: Detected virtualization kvm. Feb 13 20:11:28.116410 systemd[1]: Detected architecture x86-64. Feb 13 20:11:28.116429 systemd[1]: Running in initrd. Feb 13 20:11:28.116444 systemd[1]: No hostname configured, using default hostname. Feb 13 20:11:28.116460 systemd[1]: Hostname set to . Feb 13 20:11:28.116474 systemd[1]: Initializing machine ID from VM UUID. Feb 13 20:11:28.116489 systemd[1]: Queued start job for default target initrd.target. Feb 13 20:11:28.116511 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Feb 13 20:11:28.116526 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Feb 13 20:11:28.116541 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Feb 13 20:11:28.116561 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Feb 13 20:11:28.116577 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Feb 13 20:11:28.116592 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Feb 13 20:11:28.116608 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Feb 13 20:11:28.116624 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... 
Feb 13 20:11:28.116639 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Feb 13 20:11:28.116653 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Feb 13 20:11:28.116672 systemd[1]: Reached target paths.target - Path Units. Feb 13 20:11:28.116687 systemd[1]: Reached target slices.target - Slice Units. Feb 13 20:11:28.116702 systemd[1]: Reached target swap.target - Swaps. Feb 13 20:11:28.116717 systemd[1]: Reached target timers.target - Timer Units. Feb 13 20:11:28.116731 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Feb 13 20:11:28.116746 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Feb 13 20:11:28.116761 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Feb 13 20:11:28.116776 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Feb 13 20:11:28.116790 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Feb 13 20:11:28.116810 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Feb 13 20:11:28.116825 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Feb 13 20:11:28.116840 systemd[1]: Reached target sockets.target - Socket Units. Feb 13 20:11:28.116855 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Feb 13 20:11:28.116870 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Feb 13 20:11:28.116885 systemd[1]: Finished network-cleanup.service - Network Cleanup. Feb 13 20:11:28.116923 systemd[1]: Starting systemd-fsck-usr.service... Feb 13 20:11:28.116941 systemd[1]: Starting systemd-journald.service - Journal Service... Feb 13 20:11:28.116962 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Feb 13 20:11:28.116977 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 20:11:28.116996 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Feb 13 20:11:28.117011 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Feb 13 20:11:28.117077 systemd-journald[202]: Collecting audit messages is disabled. Feb 13 20:11:28.117118 systemd[1]: Finished systemd-fsck-usr.service. Feb 13 20:11:28.117134 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Feb 13 20:11:28.117149 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Feb 13 20:11:28.117163 kernel: Bridge firewalling registered Feb 13 20:11:28.117184 systemd-journald[202]: Journal started Feb 13 20:11:28.117232 systemd-journald[202]: Runtime Journal (/run/log/journal/c37954bb756c4470afe9a6bf496f3c9d) is 4.7M, max 37.9M, 33.2M free. Feb 13 20:11:28.041285 systemd-modules-load[203]: Inserted module 'overlay' Feb 13 20:11:28.093960 systemd-modules-load[203]: Inserted module 'br_netfilter' Feb 13 20:11:28.162964 systemd[1]: Started systemd-journald.service - Journal Service. Feb 13 20:11:28.163939 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Feb 13 20:11:28.165130 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 20:11:28.166642 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Feb 13 20:11:28.182180 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
Feb 13 20:11:28.185976 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Feb 13 20:11:28.187509 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Feb 13 20:11:28.202135 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Feb 13 20:11:28.212031 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Feb 13 20:11:28.221352 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Feb 13 20:11:28.224350 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Feb 13 20:11:28.226478 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Feb 13 20:11:28.228235 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Feb 13 20:11:28.241163 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Feb 13 20:11:28.246136 dracut-cmdline[233]: dracut-dracut-053 Feb 13 20:11:28.251034 dracut-cmdline[233]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=f28373bbaddf11103b551b595069cf5faacb27d62f1aab4f9911393ba418b416 Feb 13 20:11:28.292946 systemd-resolved[242]: Positive Trust Anchors: Feb 13 20:11:28.292986 systemd-resolved[242]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Feb 13 20:11:28.293031 systemd-resolved[242]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Feb 13 20:11:28.298306 systemd-resolved[242]: Defaulting to hostname 'linux'. Feb 13 20:11:28.300452 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Feb 13 20:11:28.302478 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Feb 13 20:11:28.370938 kernel: SCSI subsystem initialized Feb 13 20:11:28.382944 kernel: Loading iSCSI transport class v2.0-870. Feb 13 20:11:28.396968 kernel: iscsi: registered transport (tcp) Feb 13 20:11:28.424280 kernel: iscsi: registered transport (qla4xxx) Feb 13 20:11:28.424416 kernel: QLogic iSCSI HBA Driver Feb 13 20:11:28.481803 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Feb 13 20:11:28.493255 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Feb 13 20:11:28.525500 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Feb 13 20:11:28.525640 kernel: device-mapper: uevent: version 1.0.3 Feb 13 20:11:28.529006 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Feb 13 20:11:28.578975 kernel: raid6: sse2x4 gen() 12966 MB/s Feb 13 20:11:28.596946 kernel: raid6: sse2x2 gen() 9291 MB/s Feb 13 20:11:28.615872 kernel: raid6: sse2x1 gen() 9525 MB/s Feb 13 20:11:28.615977 kernel: raid6: using algorithm sse2x4 gen() 12966 MB/s Feb 13 20:11:28.634830 kernel: raid6: .... xor() 7447 MB/s, rmw enabled Feb 13 20:11:28.634983 kernel: raid6: using ssse3x2 recovery algorithm Feb 13 20:11:28.660948 kernel: xor: automatically using best checksumming function avx Feb 13 20:11:28.835945 kernel: Btrfs loaded, zoned=no, fsverity=no Feb 13 20:11:28.851381 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Feb 13 20:11:28.857265 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Feb 13 20:11:28.891959 systemd-udevd[422]: Using default interface naming scheme 'v255'. Feb 13 20:11:28.900841 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Feb 13 20:11:28.910521 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Feb 13 20:11:28.933757 dracut-pre-trigger[433]: rd.md=0: removing MD RAID activation Feb 13 20:11:28.980222 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Feb 13 20:11:28.988199 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Feb 13 20:11:29.119573 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Feb 13 20:11:29.129105 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Feb 13 20:11:29.148083 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Feb 13 20:11:29.155554 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Feb 13 20:11:29.158155 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Feb 13 20:11:29.160406 systemd[1]: Reached target remote-fs.target - Remote File Systems. Feb 13 20:11:29.170178 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Feb 13 20:11:29.188101 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Feb 13 20:11:29.255348 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues Feb 13 20:11:29.298988 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB) Feb 13 20:11:29.299206 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Feb 13 20:11:29.299229 kernel: GPT:17805311 != 125829119 Feb 13 20:11:29.299258 kernel: GPT:Alternate GPT header not at the end of the disk. Feb 13 20:11:29.299277 kernel: GPT:17805311 != 125829119 Feb 13 20:11:29.299294 kernel: GPT: Use GNU Parted to correct GPT errors. Feb 13 20:11:29.299312 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Feb 13 20:11:29.299329 kernel: cryptd: max_cpu_qlen set to 1000 Feb 13 20:11:29.315934 kernel: ACPI: bus type USB registered Feb 13 20:11:29.317115 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Feb 13 20:11:29.317301 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Feb 13 20:11:29.325403 kernel: usbcore: registered new interface driver usbfs Feb 13 20:11:29.325437 kernel: usbcore: registered new interface driver hub Feb 13 20:11:29.325457 kernel: usbcore: registered new device driver usb Feb 13 20:11:29.325184 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Feb 13 20:11:29.331038 kernel: AVX version of gcm_enc/dec engaged. Feb 13 20:11:29.331068 kernel: AES CTR mode by8 optimization enabled Feb 13 20:11:29.328859 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Feb 13 20:11:29.329121 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 20:11:29.330410 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 20:11:29.341342 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 20:11:29.344673 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Feb 13 20:11:29.373937 kernel: libata version 3.00 loaded. Feb 13 20:11:29.418980 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by (udev-worker) (482) Feb 13 20:11:29.467918 kernel: BTRFS: device fsid 60f89c25-9096-4268-99ca-ef7992742f2b devid 1 transid 39 /dev/vda3 scanned by (udev-worker) (483) Feb 13 20:11:29.469134 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Feb 13 20:11:29.472242 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 20:11:29.476118 kernel: ahci 0000:00:1f.2: version 3.0 Feb 13 20:11:29.544831 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Feb 13 20:11:29.544863 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Feb 13 20:11:29.545276 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1 Feb 13 20:11:29.545490 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode Feb 13 20:11:29.545749 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Feb 13 20:11:29.546000 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Feb 13 20:11:29.546203 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Feb 13 20:11:29.546408 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2 Feb 13 20:11:29.546643 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed Feb 13 20:11:29.546864 kernel: hub 1-0:1.0: USB hub found Feb 13 20:11:29.547156 kernel: scsi host0: ahci Feb 13 20:11:29.547372 kernel: hub 1-0:1.0: 4 ports detected Feb 13 20:11:29.547592 kernel: scsi host1: ahci Feb 13 20:11:29.547812 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
Feb 13 20:11:29.548138 kernel: scsi host2: ahci Feb 13 20:11:29.548362 kernel: hub 2-0:1.0: USB hub found Feb 13 20:11:29.548594 kernel: hub 2-0:1.0: 4 ports detected Feb 13 20:11:29.548819 kernel: scsi host3: ahci Feb 13 20:11:29.549064 kernel: scsi host4: ahci Feb 13 20:11:29.549265 kernel: scsi host5: ahci Feb 13 20:11:29.549486 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 38 Feb 13 20:11:29.549509 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 38 Feb 13 20:11:29.549528 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 38 Feb 13 20:11:29.549546 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 38 Feb 13 20:11:29.549565 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 38 Feb 13 20:11:29.549583 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 38 Feb 13 20:11:29.516001 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Feb 13 20:11:29.531276 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Feb 13 20:11:29.541793 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Feb 13 20:11:29.547918 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Feb 13 20:11:29.556099 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Feb 13 20:11:29.560075 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Feb 13 20:11:29.565162 disk-uuid[568]: Primary Header is updated. Feb 13 20:11:29.565162 disk-uuid[568]: Secondary Entries is updated. Feb 13 20:11:29.565162 disk-uuid[568]: Secondary Header is updated. Feb 13 20:11:29.572943 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Feb 13 20:11:29.584963 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Feb 13 20:11:29.584774 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Feb 13 20:11:29.765298 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Feb 13 20:11:29.848928 kernel: ata4: SATA link down (SStatus 0 SControl 300) Feb 13 20:11:29.857027 kernel: ata3: SATA link down (SStatus 0 SControl 300) Feb 13 20:11:29.857078 kernel: ata2: SATA link down (SStatus 0 SControl 300) Feb 13 20:11:29.859788 kernel: ata5: SATA link down (SStatus 0 SControl 300) Feb 13 20:11:29.861672 kernel: ata6: SATA link down (SStatus 0 SControl 300) Feb 13 20:11:29.863400 kernel: ata1: SATA link down (SStatus 0 SControl 300) Feb 13 20:11:29.907095 kernel: hid: raw HID events driver (C) Jiri Kosina Feb 13 20:11:29.912990 kernel: usbcore: registered new interface driver usbhid Feb 13 20:11:29.913034 kernel: usbhid: USB HID core driver Feb 13 20:11:29.921513 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2 Feb 13 20:11:29.921566 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0 Feb 13 20:11:30.584850 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Feb 13 20:11:30.586156 disk-uuid[569]: The operation has completed successfully. Feb 13 20:11:30.664245 systemd[1]: disk-uuid.service: Deactivated successfully. 
Feb 13 20:11:30.664441 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Feb 13 20:11:30.706118 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Feb 13 20:11:30.710913 sh[588]: Success Feb 13 20:11:30.728929 kernel: device-mapper: verity: sha256 using implementation "sha256-avx" Feb 13 20:11:30.802452 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Feb 13 20:11:30.803597 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Feb 13 20:11:30.811071 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Feb 13 20:11:30.835935 kernel: BTRFS info (device dm-0): first mount of filesystem 60f89c25-9096-4268-99ca-ef7992742f2b Feb 13 20:11:30.835997 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Feb 13 20:11:30.836018 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Feb 13 20:11:30.836038 kernel: BTRFS info (device dm-0): disabling log replay at mount time Feb 13 20:11:30.837936 kernel: BTRFS info (device dm-0): using free space tree Feb 13 20:11:30.847492 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Feb 13 20:11:30.849003 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Feb 13 20:11:30.861191 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Feb 13 20:11:30.865069 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Feb 13 20:11:30.883719 kernel: BTRFS info (device vda6): first mount of filesystem 9d862461-eab1-477f-8790-b61f63b2958e Feb 13 20:11:30.883774 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Feb 13 20:11:30.886474 kernel: BTRFS info (device vda6): using free space tree Feb 13 20:11:30.892957 kernel: BTRFS info (device vda6): auto enabling async discard Feb 13 20:11:30.910933 kernel: BTRFS info (device vda6): last unmount of filesystem 9d862461-eab1-477f-8790-b61f63b2958e Feb 13 20:11:30.911048 systemd[1]: mnt-oem.mount: Deactivated successfully. Feb 13 20:11:30.920490 systemd[1]: Finished ignition-setup.service - Ignition (setup). Feb 13 20:11:30.929096 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Feb 13 20:11:31.036528 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Feb 13 20:11:31.050249 systemd[1]: Starting systemd-networkd.service - Network Configuration... Feb 13 20:11:31.067726 ignition[691]: Ignition 2.20.0 Feb 13 20:11:31.068883 ignition[691]: Stage: fetch-offline Feb 13 20:11:31.069608 ignition[691]: no configs at "/usr/lib/ignition/base.d" Feb 13 20:11:31.069628 ignition[691]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Feb 13 20:11:31.069811 ignition[691]: parsed url from cmdline: "" Feb 13 20:11:31.069818 ignition[691]: no config URL provided Feb 13 20:11:31.069828 ignition[691]: reading system config file "/usr/lib/ignition/user.ign" Feb 13 20:11:31.069845 ignition[691]: no config at "/usr/lib/ignition/user.ign" Feb 13 20:11:31.074290 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
Feb 13 20:11:31.069855 ignition[691]: failed to fetch config: resource requires networking Feb 13 20:11:31.071081 ignition[691]: Ignition finished successfully Feb 13 20:11:31.087979 systemd-networkd[777]: lo: Link UP Feb 13 20:11:31.087996 systemd-networkd[777]: lo: Gained carrier Feb 13 20:11:31.090572 systemd-networkd[777]: Enumeration completed Feb 13 20:11:31.090723 systemd[1]: Started systemd-networkd.service - Network Configuration. Feb 13 20:11:31.091714 systemd-networkd[777]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Feb 13 20:11:31.091721 systemd-networkd[777]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Feb 13 20:11:31.093028 systemd[1]: Reached target network.target - Network. Feb 13 20:11:31.093851 systemd-networkd[777]: eth0: Link UP Feb 13 20:11:31.093857 systemd-networkd[777]: eth0: Gained carrier Feb 13 20:11:31.093882 systemd-networkd[777]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Feb 13 20:11:31.105016 systemd-networkd[777]: eth0: DHCPv4 address 10.244.13.70/30, gateway 10.244.13.69 acquired from 10.244.13.69 Feb 13 20:11:31.107634 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Feb 13 20:11:31.136208 ignition[782]: Ignition 2.20.0 Feb 13 20:11:31.137356 ignition[782]: Stage: fetch Feb 13 20:11:31.138271 ignition[782]: no configs at "/usr/lib/ignition/base.d" Feb 13 20:11:31.139058 ignition[782]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Feb 13 20:11:31.140105 ignition[782]: parsed url from cmdline: "" Feb 13 20:11:31.140113 ignition[782]: no config URL provided Feb 13 20:11:31.140124 ignition[782]: reading system config file "/usr/lib/ignition/user.ign" Feb 13 20:11:31.140141 ignition[782]: no config at "/usr/lib/ignition/user.ign" Feb 13 20:11:31.140290 ignition[782]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Feb 13 20:11:31.140318 ignition[782]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Feb 13 20:11:31.141725 ignition[782]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Feb 13 20:11:31.163713 ignition[782]: GET result: OK Feb 13 20:11:31.164129 ignition[782]: parsing config with SHA512: b0046892ca61836975f82bcd67b1898c83da98872ef1ed0440ec425f8b0ced9ab02d5de43dc7766688f04f0370d5e391f9102ce6b0d0b87b3b804446a3bb213e Feb 13 20:11:31.167959 unknown[782]: fetched base config from "system" Feb 13 20:11:31.167975 unknown[782]: fetched base config from "system" Feb 13 20:11:31.168262 ignition[782]: fetch: fetch complete Feb 13 20:11:31.167984 unknown[782]: fetched user config from "openstack" Feb 13 20:11:31.168271 ignition[782]: fetch: fetch passed Feb 13 20:11:31.170530 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Feb 13 20:11:31.168335 ignition[782]: Ignition finished successfully Feb 13 20:11:31.192143 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Feb 13 20:11:31.211180 ignition[789]: Ignition 2.20.0 Feb 13 20:11:31.211199 ignition[789]: Stage: kargs Feb 13 20:11:31.211521 ignition[789]: no configs at "/usr/lib/ignition/base.d" Feb 13 20:11:31.211542 ignition[789]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Feb 13 20:11:31.215554 ignition[789]: kargs: kargs passed Feb 13 20:11:31.215627 ignition[789]: Ignition finished successfully Feb 13 20:11:31.216712 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). 
Feb 13 20:11:31.222098 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Feb 13 20:11:31.248549 ignition[795]: Ignition 2.20.0 Feb 13 20:11:31.248583 ignition[795]: Stage: disks Feb 13 20:11:31.248812 ignition[795]: no configs at "/usr/lib/ignition/base.d" Feb 13 20:11:31.251800 systemd[1]: Finished ignition-disks.service - Ignition (disks). Feb 13 20:11:31.248832 ignition[795]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Feb 13 20:11:31.253659 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Feb 13 20:11:31.249754 ignition[795]: disks: disks passed Feb 13 20:11:31.254514 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Feb 13 20:11:31.249827 ignition[795]: Ignition finished successfully Feb 13 20:11:31.256123 systemd[1]: Reached target local-fs.target - Local File Systems. Feb 13 20:11:31.257745 systemd[1]: Reached target sysinit.target - System Initialization. Feb 13 20:11:31.259079 systemd[1]: Reached target basic.target - Basic System. Feb 13 20:11:31.277153 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Feb 13 20:11:31.298287 systemd-fsck[803]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Feb 13 20:11:31.301726 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Feb 13 20:11:31.833039 systemd[1]: Mounting sysroot.mount - /sysroot... Feb 13 20:11:31.942935 kernel: EXT4-fs (vda9): mounted filesystem 157595f2-1515-4117-a2d1-73fe2ed647fc r/w with ordered data mode. Quota mode: none. Feb 13 20:11:31.944055 systemd[1]: Mounted sysroot.mount - /sysroot. Feb 13 20:11:31.945426 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Feb 13 20:11:31.953015 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Feb 13 20:11:31.957007 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Feb 13 20:11:31.958159 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Feb 13 20:11:31.960363 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Feb 13 20:11:31.961226 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Feb 13 20:11:31.961277 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Feb 13 20:11:31.983950 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by mount (811) Feb 13 20:11:31.984082 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Feb 13 20:11:31.986837 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Feb 13 20:11:32.001924 kernel: BTRFS info (device vda6): first mount of filesystem 9d862461-eab1-477f-8790-b61f63b2958e Feb 13 20:11:32.001995 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Feb 13 20:11:32.002016 kernel: BTRFS info (device vda6): using free space tree Feb 13 20:11:32.011915 kernel: BTRFS info (device vda6): auto enabling async discard Feb 13 20:11:32.015518 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Feb 13 20:11:32.094241 initrd-setup-root[840]: cut: /sysroot/etc/passwd: No such file or directory Feb 13 20:11:32.102169 initrd-setup-root[847]: cut: /sysroot/etc/group: No such file or directory Feb 13 20:11:32.111352 initrd-setup-root[854]: cut: /sysroot/etc/shadow: No such file or directory Feb 13 20:11:32.115695 initrd-setup-root[861]: cut: /sysroot/etc/gshadow: No such file or directory Feb 13 20:11:32.225333 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Feb 13 20:11:32.232049 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Feb 13 20:11:32.242139 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Feb 13 20:11:32.254020 kernel: BTRFS info (device vda6): last unmount of filesystem 9d862461-eab1-477f-8790-b61f63b2958e Feb 13 20:11:32.287387 ignition[928]: INFO : Ignition 2.20.0 Feb 13 20:11:32.287387 ignition[928]: INFO : Stage: mount Feb 13 20:11:32.290779 ignition[928]: INFO : no configs at "/usr/lib/ignition/base.d" Feb 13 20:11:32.290779 ignition[928]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Feb 13 20:11:32.290779 ignition[928]: INFO : mount: mount passed Feb 13 20:11:32.290779 ignition[928]: INFO : Ignition finished successfully Feb 13 20:11:32.290502 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Feb 13 20:11:32.292079 systemd[1]: Finished ignition-mount.service - Ignition (mount). Feb 13 20:11:32.681168 systemd-networkd[777]: eth0: Gained IPv6LL Feb 13 20:11:32.828618 systemd[1]: sysroot-oem.mount: Deactivated successfully. Feb 13 20:11:33.710098 systemd-networkd[777]: eth0: Ignoring DHCPv6 address 2a02:1348:17d:351:24:19ff:fef4:d46/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17d:351:24:19ff:fef4:d46/64 assigned by NDisc. Feb 13 20:11:33.710112 systemd-networkd[777]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Feb 13 20:11:39.130448 coreos-metadata[813]: Feb 13 20:11:39.130 WARN failed to locate config-drive, using the metadata service API instead Feb 13 20:11:39.153437 coreos-metadata[813]: Feb 13 20:11:39.153 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Feb 13 20:11:39.168484 coreos-metadata[813]: Feb 13 20:11:39.168 INFO Fetch successful Feb 13 20:11:39.169429 coreos-metadata[813]: Feb 13 20:11:39.168 INFO wrote hostname srv-cfrgh.gb1.brightbox.com to /sysroot/etc/hostname Feb 13 20:11:39.172705 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Feb 13 20:11:39.172918 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Feb 13 20:11:39.181061 systemd[1]: Starting ignition-files.service - Ignition (files)... Feb 13 20:11:39.203252 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Feb 13 20:11:39.222963 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (946) Feb 13 20:11:39.231337 kernel: BTRFS info (device vda6): first mount of filesystem 9d862461-eab1-477f-8790-b61f63b2958e Feb 13 20:11:39.231398 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Feb 13 20:11:39.233225 kernel: BTRFS info (device vda6): using free space tree Feb 13 20:11:39.238962 kernel: BTRFS info (device vda6): auto enabling async discard Feb 13 20:11:39.241756 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Feb 13 20:11:39.271880 ignition[964]: INFO : Ignition 2.20.0 Feb 13 20:11:39.273107 ignition[964]: INFO : Stage: files Feb 13 20:11:39.273107 ignition[964]: INFO : no configs at "/usr/lib/ignition/base.d" Feb 13 20:11:39.273107 ignition[964]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Feb 13 20:11:39.275864 ignition[964]: DEBUG : files: compiled without relabeling support, skipping Feb 13 20:11:39.275864 ignition[964]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Feb 13 20:11:39.275864 ignition[964]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Feb 13 20:11:39.279610 ignition[964]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Feb 13 20:11:39.280822 ignition[964]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Feb 13 20:11:39.282244 unknown[964]: wrote ssh authorized keys file for user: core Feb 13 20:11:39.283270 ignition[964]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Feb 13 20:11:39.284483 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/home/core/install.sh" Feb 13 20:11:39.285705 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/home/core/install.sh" Feb 13 20:11:39.285705 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/etc/flatcar/update.conf" Feb 13 20:11:39.285705 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/etc/flatcar/update.conf" Feb 13 20:11:39.285705 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" Feb 13 20:11:39.285705 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" Feb 13 20:11:39.285705 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" Feb 13 20:11:39.285705 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-x86-64.raw: attempt #1 Feb 13 20:11:39.853226 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET result: OK Feb 13 20:11:41.025870 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" Feb 13 20:11:41.029701 ignition[964]: INFO : files: createResultFile: createFiles: op(7): [started] writing file "/sysroot/etc/.ignition-result.json" Feb 13 20:11:41.029701 ignition[964]: INFO : files: createResultFile: createFiles: op(7): [finished] writing file "/sysroot/etc/.ignition-result.json" Feb 13 20:11:41.029701 ignition[964]: INFO : files: files passed Feb 13 20:11:41.029701 ignition[964]: INFO : Ignition finished successfully Feb 13 20:11:41.029951 systemd[1]: Finished ignition-files.service - Ignition (files). Feb 13 20:11:41.046370 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... 
Feb 13 20:11:41.050328 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Feb 13 20:11:41.051742 systemd[1]: ignition-quench.service: Deactivated successfully. Feb 13 20:11:41.051928 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Feb 13 20:11:41.081622 initrd-setup-root-after-ignition[993]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Feb 13 20:11:41.081622 initrd-setup-root-after-ignition[993]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Feb 13 20:11:41.086370 initrd-setup-root-after-ignition[997]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Feb 13 20:11:41.089373 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Feb 13 20:11:41.092024 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Feb 13 20:11:41.097176 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Feb 13 20:11:41.138214 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Feb 13 20:11:41.138422 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Feb 13 20:11:41.140582 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Feb 13 20:11:41.142567 systemd[1]: Reached target initrd.target - Initrd Default Target. Feb 13 20:11:41.143556 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Feb 13 20:11:41.152329 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Feb 13 20:11:41.172700 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Feb 13 20:11:41.180134 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Feb 13 20:11:41.193906 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Feb 13 20:11:41.194812 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Feb 13 20:11:41.196526 systemd[1]: Stopped target timers.target - Timer Units. Feb 13 20:11:41.198060 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Feb 13 20:11:41.198239 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Feb 13 20:11:41.200015 systemd[1]: Stopped target initrd.target - Initrd Default Target. Feb 13 20:11:41.200981 systemd[1]: Stopped target basic.target - Basic System. Feb 13 20:11:41.202553 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Feb 13 20:11:41.203980 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Feb 13 20:11:41.205319 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Feb 13 20:11:41.207968 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Feb 13 20:11:41.209511 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Feb 13 20:11:41.211016 systemd[1]: Stopped target sysinit.target - System Initialization. Feb 13 20:11:41.212598 systemd[1]: Stopped target local-fs.target - Local File Systems. Feb 13 20:11:41.214088 systemd[1]: Stopped target swap.target - Swaps. Feb 13 20:11:41.215294 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Feb 13 20:11:41.215464 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Feb 13 20:11:41.217211 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. 
Feb 13 20:11:41.218121 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Feb 13 20:11:41.219696 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Feb 13 20:11:41.219917 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Feb 13 20:11:41.221210 systemd[1]: dracut-initqueue.service: Deactivated successfully. Feb 13 20:11:41.221392 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Feb 13 20:11:41.223245 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Feb 13 20:11:41.223420 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Feb 13 20:11:41.225320 systemd[1]: ignition-files.service: Deactivated successfully. Feb 13 20:11:41.225494 systemd[1]: Stopped ignition-files.service - Ignition (files). Feb 13 20:11:41.239832 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Feb 13 20:11:41.240837 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Feb 13 20:11:41.241281 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Feb 13 20:11:41.244225 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Feb 13 20:11:41.246331 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Feb 13 20:11:41.247082 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Feb 13 20:11:41.249470 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Feb 13 20:11:41.250056 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Feb 13 20:11:41.264373 systemd[1]: initrd-cleanup.service: Deactivated successfully. Feb 13 20:11:41.264539 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Feb 13 20:11:41.270641 ignition[1017]: INFO : Ignition 2.20.0 Feb 13 20:11:41.270641 ignition[1017]: INFO : Stage: umount Feb 13 20:11:41.274104 ignition[1017]: INFO : no configs at "/usr/lib/ignition/base.d" Feb 13 20:11:41.274104 ignition[1017]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Feb 13 20:11:41.274104 ignition[1017]: INFO : umount: umount passed Feb 13 20:11:41.274104 ignition[1017]: INFO : Ignition finished successfully Feb 13 20:11:41.275258 systemd[1]: ignition-mount.service: Deactivated successfully. Feb 13 20:11:41.275441 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Feb 13 20:11:41.276524 systemd[1]: ignition-disks.service: Deactivated successfully. Feb 13 20:11:41.276606 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Feb 13 20:11:41.277396 systemd[1]: ignition-kargs.service: Deactivated successfully. Feb 13 20:11:41.277484 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Feb 13 20:11:41.280740 systemd[1]: ignition-fetch.service: Deactivated successfully. Feb 13 20:11:41.280818 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Feb 13 20:11:41.281524 systemd[1]: Stopped target network.target - Network. Feb 13 20:11:41.284221 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Feb 13 20:11:41.284298 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Feb 13 20:11:41.285421 systemd[1]: Stopped target paths.target - Path Units. Feb 13 20:11:41.286050 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Feb 13 20:11:41.290711 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Feb 13 20:11:41.291620 systemd[1]: Stopped target slices.target - Slice Units. Feb 13 20:11:41.292268 systemd[1]: Stopped target sockets.target - Socket Units. Feb 13 20:11:41.301098 systemd[1]: iscsid.socket: Deactivated successfully. Feb 13 20:11:41.301193 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Feb 13 20:11:41.302296 systemd[1]: iscsiuio.socket: Deactivated successfully. Feb 13 20:11:41.302354 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Feb 13 20:11:41.303060 systemd[1]: ignition-setup.service: Deactivated successfully. Feb 13 20:11:41.303141 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Feb 13 20:11:41.303827 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Feb 13 20:11:41.303913 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Feb 13 20:11:41.305033 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Feb 13 20:11:41.306436 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Feb 13 20:11:41.307069 systemd-networkd[777]: eth0: DHCPv6 lease lost Feb 13 20:11:41.309523 systemd[1]: sysroot-boot.mount: Deactivated successfully. Feb 13 20:11:41.313485 systemd[1]: systemd-networkd.service: Deactivated successfully. Feb 13 20:11:41.313727 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Feb 13 20:11:41.316580 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Feb 13 20:11:41.316968 systemd[1]: systemd-networkd.socket: Deactivated successfully. Feb 13 20:11:41.317049 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Feb 13 20:11:41.325078 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Feb 13 20:11:41.328301 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Feb 13 20:11:41.328420 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Feb 13 20:11:41.330021 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Feb 13 20:11:41.333850 systemd[1]: systemd-resolved.service: Deactivated successfully. Feb 13 20:11:41.334087 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Feb 13 20:11:41.339435 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Feb 13 20:11:41.339949 systemd[1]: systemd-udevd.service: Deactivated successfully. Feb 13 20:11:41.340172 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Feb 13 20:11:41.349544 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Feb 13 20:11:41.349754 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Feb 13 20:11:41.352213 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Feb 13 20:11:41.352278 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Feb 13 20:11:41.352984 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Feb 13 20:11:41.353068 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Feb 13 20:11:41.355360 systemd[1]: dracut-cmdline.service: Deactivated successfully. Feb 13 20:11:41.355435 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Feb 13 20:11:41.356307 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Feb 13 20:11:41.356397 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Feb 13 20:11:41.360107 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Feb 13 20:11:41.362134 systemd[1]: systemd-sysctl.service: Deactivated successfully. Feb 13 20:11:41.362215 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Feb 13 20:11:41.363705 systemd[1]: systemd-modules-load.service: Deactivated successfully. Feb 13 20:11:41.363778 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Feb 13 20:11:41.368371 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Feb 13 20:11:41.368452 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Feb 13 20:11:41.371307 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Feb 13 20:11:41.371392 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Feb 13 20:11:41.373022 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Feb 13 20:11:41.373121 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 20:11:41.378835 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Feb 13 20:11:41.380993 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Feb 13 20:11:41.381094 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Feb 13 20:11:41.381229 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Feb 13 20:11:41.385163 systemd[1]: sysroot-boot.service: Deactivated successfully. Feb 13 20:11:41.385388 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Feb 13 20:11:41.386850 systemd[1]: network-cleanup.service: Deactivated successfully. Feb 13 20:11:41.387073 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Feb 13 20:11:41.388572 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Feb 13 20:11:41.388766 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Feb 13 20:11:41.391599 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Feb 13 20:11:41.393379 systemd[1]: initrd-setup-root.service: Deactivated successfully. Feb 13 20:11:41.393476 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Feb 13 20:11:41.404079 systemd[1]: Starting initrd-switch-root.service - Switch Root... Feb 13 20:11:41.414521 systemd[1]: Switching root. Feb 13 20:11:41.441392 systemd-journald[202]: Journal stopped Feb 13 20:11:43.040104 systemd-journald[202]: Received SIGTERM from PID 1 (systemd). Feb 13 20:11:43.040263 kernel: SELinux: policy capability network_peer_controls=1 Feb 13 20:11:43.040301 kernel: SELinux: policy capability open_perms=1 Feb 13 20:11:43.040323 kernel: SELinux: policy capability extended_socket_class=1 Feb 13 20:11:43.040344 kernel: SELinux: policy capability always_check_network=0 Feb 13 20:11:43.040364 kernel: SELinux: policy capability cgroup_seclabel=1 Feb 13 20:11:43.040399 kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 13 20:11:43.040420 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Feb 13 20:11:43.040440 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Feb 13 20:11:43.040478 kernel: audit: type=1403 audit(1739477501.666:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Feb 13 20:11:43.040510 systemd[1]: Successfully loaded SELinux policy in 54.426ms. 
Feb 13 20:11:43.040545 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 23.312ms. Feb 13 20:11:43.040584 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Feb 13 20:11:43.040608 systemd[1]: Detected virtualization kvm. Feb 13 20:11:43.040629 systemd[1]: Detected architecture x86-64. Feb 13 20:11:43.040663 systemd[1]: Detected first boot. Feb 13 20:11:43.040686 systemd[1]: Hostname set to <srv-cfrgh.gb1.brightbox.com>. Feb 13 20:11:43.040721 systemd[1]: Initializing machine ID from VM UUID. Feb 13 20:11:43.040745 zram_generator::config[1061]: No configuration found. Feb 13 20:11:43.040773 kernel: Guest personality initialized and is inactive Feb 13 20:11:43.040795 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Feb 13 20:11:43.040816 kernel: Initialized host personality Feb 13 20:11:43.040842 kernel: NET: Registered PF_VSOCK protocol family Feb 13 20:11:43.040864 systemd[1]: Populated /etc with preset unit settings. Feb 13 20:11:43.040887 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Feb 13 20:11:43.040935 systemd[1]: initrd-switch-root.service: Deactivated successfully. Feb 13 20:11:43.040984 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Feb 13 20:11:43.041011 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Feb 13 20:11:43.041032 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Feb 13 20:11:43.041061 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Feb 13 20:11:43.041082 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Feb 13 20:11:43.041110 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Feb 13 20:11:43.041133 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Feb 13 20:11:43.041163 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Feb 13 20:11:43.041197 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Feb 13 20:11:43.041219 systemd[1]: Created slice user.slice - User and Session Slice. Feb 13 20:11:43.041247 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Feb 13 20:11:43.041282 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Feb 13 20:11:43.041304 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Feb 13 20:11:43.041324 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Feb 13 20:11:43.041346 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Feb 13 20:11:43.041381 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Feb 13 20:11:43.041406 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Feb 13 20:11:43.041428 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Feb 13 20:11:43.041449 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Feb 13 20:11:43.041470 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Feb 13 20:11:43.041492 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Feb 13 20:11:43.041513 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Feb 13 20:11:43.041535 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Feb 13 20:11:43.041570 systemd[1]: Reached target remote-fs.target - Remote File Systems. Feb 13 20:11:43.041594 systemd[1]: Reached target slices.target - Slice Units. Feb 13 20:11:43.041615 systemd[1]: Reached target swap.target - Swaps. Feb 13 20:11:43.041636 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Feb 13 20:11:43.041669 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Feb 13 20:11:43.041692 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Feb 13 20:11:43.041713 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Feb 13 20:11:43.041741 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Feb 13 20:11:43.041764 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Feb 13 20:11:43.041786 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Feb 13 20:11:43.041821 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Feb 13 20:11:43.041846 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Feb 13 20:11:43.041906 systemd[1]: Mounting media.mount - External Media Directory... Feb 13 20:11:43.041947 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 20:11:43.041971 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Feb 13 20:11:43.042005 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Feb 13 20:11:43.042029 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Feb 13 20:11:43.042052 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Feb 13 20:11:43.042073 systemd[1]: Reached target machines.target - Containers. Feb 13 20:11:43.042094 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Feb 13 20:11:43.042124 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Feb 13 20:11:43.042147 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Feb 13 20:11:43.042175 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Feb 13 20:11:43.042209 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Feb 13 20:11:43.042232 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Feb 13 20:11:43.042254 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Feb 13 20:11:43.042275 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Feb 13 20:11:43.042296 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Feb 13 20:11:43.042318 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). 
Feb 13 20:11:43.042340 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Feb 13 20:11:43.042362 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Feb 13 20:11:43.042384 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Feb 13 20:11:43.042418 systemd[1]: Stopped systemd-fsck-usr.service. Feb 13 20:11:43.042442 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Feb 13 20:11:43.042464 systemd[1]: Starting systemd-journald.service - Journal Service... Feb 13 20:11:43.042485 kernel: fuse: init (API version 7.39) Feb 13 20:11:43.042506 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Feb 13 20:11:43.042534 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Feb 13 20:11:43.042557 kernel: ACPI: bus type drm_connector registered Feb 13 20:11:43.042578 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Feb 13 20:11:43.042599 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Feb 13 20:11:43.042635 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Feb 13 20:11:43.042669 systemd[1]: verity-setup.service: Deactivated successfully. Feb 13 20:11:43.042693 systemd[1]: Stopped verity-setup.service. Feb 13 20:11:43.042716 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 20:11:43.042750 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Feb 13 20:11:43.042787 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Feb 13 20:11:43.042811 systemd[1]: Mounted media.mount - External Media Directory. Feb 13 20:11:43.042832 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Feb 13 20:11:43.042853 kernel: loop: module loaded Feb 13 20:11:43.042874 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Feb 13 20:11:43.042948 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Feb 13 20:11:43.042974 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Feb 13 20:11:43.042996 systemd[1]: modprobe@configfs.service: Deactivated successfully. Feb 13 20:11:43.043017 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Feb 13 20:11:43.043038 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Feb 13 20:11:43.043067 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Feb 13 20:11:43.043090 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Feb 13 20:11:43.043111 systemd[1]: modprobe@drm.service: Deactivated successfully. Feb 13 20:11:43.043188 systemd-journald[1155]: Collecting audit messages is disabled. Feb 13 20:11:43.043236 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Feb 13 20:11:43.043261 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Feb 13 20:11:43.043283 systemd-journald[1155]: Journal started Feb 13 20:11:43.043360 systemd-journald[1155]: Runtime Journal (/run/log/journal/c37954bb756c4470afe9a6bf496f3c9d) is 4.7M, max 37.9M, 33.2M free. Feb 13 20:11:42.576033 systemd[1]: Queued start job for default target multi-user.target. 
Feb 13 20:11:42.588837 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Feb 13 20:11:42.589754 systemd[1]: systemd-journald.service: Deactivated successfully. Feb 13 20:11:43.047961 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Feb 13 20:11:43.052938 systemd[1]: Started systemd-journald.service - Journal Service. Feb 13 20:11:43.054979 systemd[1]: modprobe@fuse.service: Deactivated successfully. Feb 13 20:11:43.055318 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Feb 13 20:11:43.057432 systemd[1]: modprobe@loop.service: Deactivated successfully. Feb 13 20:11:43.057734 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Feb 13 20:11:43.058995 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Feb 13 20:11:43.061247 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Feb 13 20:11:43.062630 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Feb 13 20:11:43.078031 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Feb 13 20:11:43.086966 systemd[1]: Reached target network-pre.target - Preparation for Network. Feb 13 20:11:43.102029 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Feb 13 20:11:43.115975 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Feb 13 20:11:43.117998 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Feb 13 20:11:43.118078 systemd[1]: Reached target local-fs.target - Local File Systems. Feb 13 20:11:43.121855 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Feb 13 20:11:43.132196 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Feb 13 20:11:43.139296 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Feb 13 20:11:43.140257 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Feb 13 20:11:43.142482 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Feb 13 20:11:43.154042 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Feb 13 20:11:43.154929 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Feb 13 20:11:43.157234 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Feb 13 20:11:43.159864 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Feb 13 20:11:43.171029 systemd-journald[1155]: Time spent on flushing to /var/log/journal/c37954bb756c4470afe9a6bf496f3c9d is 24.909ms for 1133 entries. Feb 13 20:11:43.171029 systemd-journald[1155]: System Journal (/var/log/journal/c37954bb756c4470afe9a6bf496f3c9d) is 8M, max 584.8M, 576.8M free. Feb 13 20:11:43.219496 systemd-journald[1155]: Received client request to flush runtime journal. Feb 13 20:11:43.174112 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Feb 13 20:11:43.180353 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... 
Feb 13 20:11:43.183639 systemd[1]: Starting systemd-sysusers.service - Create System Users... Feb 13 20:11:43.187640 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Feb 13 20:11:43.188577 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Feb 13 20:11:43.208959 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Feb 13 20:11:43.224012 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Feb 13 20:11:43.252406 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Feb 13 20:11:43.255136 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Feb 13 20:11:43.270146 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Feb 13 20:11:43.298033 kernel: loop0: detected capacity change from 0 to 147912 Feb 13 20:11:43.347553 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Feb 13 20:11:43.363460 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Feb 13 20:11:43.374788 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Feb 13 20:11:43.379595 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Feb 13 20:11:43.387092 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Feb 13 20:11:43.405602 systemd[1]: Finished systemd-sysusers.service - Create System Users. Feb 13 20:11:43.411916 kernel: loop1: detected capacity change from 0 to 8 Feb 13 20:11:43.415211 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Feb 13 20:11:43.426679 udevadm[1218]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. Feb 13 20:11:43.447919 kernel: loop2: detected capacity change from 0 to 205544 Feb 13 20:11:43.464098 systemd-tmpfiles[1220]: ACLs are not supported, ignoring. Feb 13 20:11:43.464129 systemd-tmpfiles[1220]: ACLs are not supported, ignoring. Feb 13 20:11:43.491651 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Feb 13 20:11:43.518627 kernel: loop3: detected capacity change from 0 to 138176 Feb 13 20:11:43.586187 kernel: loop4: detected capacity change from 0 to 147912 Feb 13 20:11:43.593323 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Feb 13 20:11:43.621944 kernel: loop5: detected capacity change from 0 to 8 Feb 13 20:11:43.629926 kernel: loop6: detected capacity change from 0 to 205544 Feb 13 20:11:43.660113 kernel: loop7: detected capacity change from 0 to 138176 Feb 13 20:11:43.694301 (sd-merge)[1226]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. Feb 13 20:11:43.695850 (sd-merge)[1226]: Merged extensions into '/usr'. Feb 13 20:11:43.703598 systemd[1]: Reload requested from client PID 1200 ('systemd-sysext') (unit systemd-sysext.service)... Feb 13 20:11:43.703821 systemd[1]: Reloading... Feb 13 20:11:43.943982 zram_generator::config[1255]: No configuration found. Feb 13 20:11:43.968966 ldconfig[1195]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Feb 13 20:11:44.187677 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. 
Feb 13 20:11:44.295426 systemd[1]: Reloading finished in 590 ms. Feb 13 20:11:44.318326 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Feb 13 20:11:44.319690 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Feb 13 20:11:44.320959 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Feb 13 20:11:44.337772 systemd[1]: Starting ensure-sysext.service... Feb 13 20:11:44.341203 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Feb 13 20:11:44.346727 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Feb 13 20:11:44.381050 systemd[1]: Reload requested from client PID 1311 ('systemctl') (unit ensure-sysext.service)... Feb 13 20:11:44.381080 systemd[1]: Reloading... Feb 13 20:11:44.421789 systemd-udevd[1313]: Using default interface naming scheme 'v255'. Feb 13 20:11:44.435239 systemd-tmpfiles[1312]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Feb 13 20:11:44.435723 systemd-tmpfiles[1312]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Feb 13 20:11:44.438872 systemd-tmpfiles[1312]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Feb 13 20:11:44.439353 systemd-tmpfiles[1312]: ACLs are not supported, ignoring. Feb 13 20:11:44.439480 systemd-tmpfiles[1312]: ACLs are not supported, ignoring. Feb 13 20:11:44.453353 systemd-tmpfiles[1312]: Detected autofs mount point /boot during canonicalization of boot. Feb 13 20:11:44.453383 systemd-tmpfiles[1312]: Skipping /boot Feb 13 20:11:44.519127 systemd-tmpfiles[1312]: Detected autofs mount point /boot during canonicalization of boot. Feb 13 20:11:44.519149 systemd-tmpfiles[1312]: Skipping /boot Feb 13 20:11:44.562935 zram_generator::config[1358]: No configuration found. Feb 13 20:11:44.677929 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1339) Feb 13 20:11:44.810967 kernel: mousedev: PS/2 mouse device common for all mice Feb 13 20:11:44.826280 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Feb 13 20:11:44.837941 kernel: ACPI: button: Power Button [PWRF] Feb 13 20:11:44.905945 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Feb 13 20:11:44.915150 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI) Feb 13 20:11:44.915491 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Feb 13 20:11:44.917989 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 20:11:44.945925 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4 Feb 13 20:11:45.123680 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Feb 13 20:11:45.124230 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Feb 13 20:11:45.127821 systemd[1]: Reloading finished in 746 ms. Feb 13 20:11:45.153213 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Feb 13 20:11:45.194354 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Feb 13 20:11:45.240935 systemd[1]: Finished ensure-sysext.service. 
Feb 13 20:11:45.248151 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Feb 13 20:11:45.273241 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 20:11:45.279191 systemd[1]: Starting audit-rules.service - Load Audit Rules... Feb 13 20:11:45.284997 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Feb 13 20:11:45.286071 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Feb 13 20:11:45.289131 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Feb 13 20:11:45.299704 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Feb 13 20:11:45.307070 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Feb 13 20:11:45.325200 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Feb 13 20:11:45.329927 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Feb 13 20:11:45.330872 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Feb 13 20:11:45.334212 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Feb 13 20:11:45.336938 lvm[1427]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Feb 13 20:11:45.337012 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Feb 13 20:11:45.338691 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Feb 13 20:11:45.348105 systemd[1]: Starting systemd-networkd.service - Network Configuration... Feb 13 20:11:45.359244 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Feb 13 20:11:45.366242 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Feb 13 20:11:45.369709 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Feb 13 20:11:45.378107 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 20:11:45.378907 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 20:11:45.380760 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Feb 13 20:11:45.381796 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Feb 13 20:11:45.386290 systemd[1]: modprobe@drm.service: Deactivated successfully. Feb 13 20:11:45.386600 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Feb 13 20:11:45.387798 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Feb 13 20:11:45.388115 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Feb 13 20:11:45.391500 systemd[1]: modprobe@loop.service: Deactivated successfully. Feb 13 20:11:45.391776 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Feb 13 20:11:45.397208 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). 
Feb 13 20:11:45.397319 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Feb 13 20:11:45.405202 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Feb 13 20:11:45.420853 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Feb 13 20:11:45.440974 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Feb 13 20:11:45.445013 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Feb 13 20:11:45.447729 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Feb 13 20:11:45.452720 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Feb 13 20:11:45.471944 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Feb 13 20:11:45.487161 systemd[1]: Starting systemd-update-done.service - Update is Completed... Feb 13 20:11:45.504265 systemd[1]: Finished systemd-update-done.service - Update is Completed. Feb 13 20:11:45.520499 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Feb 13 20:11:45.522378 lvm[1466]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Feb 13 20:11:45.522501 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Feb 13 20:11:45.531353 augenrules[1474]: No rules Feb 13 20:11:45.534509 systemd[1]: audit-rules.service: Deactivated successfully. Feb 13 20:11:45.535969 systemd[1]: Finished audit-rules.service - Load Audit Rules. Feb 13 20:11:45.559295 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Feb 13 20:11:45.573304 systemd[1]: Started systemd-userdbd.service - User Database Manager. Feb 13 20:11:45.670488 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 20:11:45.716456 systemd-networkd[1441]: lo: Link UP Feb 13 20:11:45.716470 systemd-networkd[1441]: lo: Gained carrier Feb 13 20:11:45.719053 systemd-networkd[1441]: Enumeration completed Feb 13 20:11:45.719208 systemd[1]: Started systemd-networkd.service - Network Configuration. Feb 13 20:11:45.719585 systemd-networkd[1441]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Feb 13 20:11:45.719592 systemd-networkd[1441]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Feb 13 20:11:45.721143 systemd-networkd[1441]: eth0: Link UP Feb 13 20:11:45.721157 systemd-networkd[1441]: eth0: Gained carrier Feb 13 20:11:45.721177 systemd-networkd[1441]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Feb 13 20:11:45.730136 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Feb 13 20:11:45.733770 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Feb 13 20:11:45.752617 systemd-resolved[1444]: Positive Trust Anchors: Feb 13 20:11:45.754738 systemd-resolved[1444]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Feb 13 20:11:45.754786 systemd-resolved[1444]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Feb 13 20:11:45.755461 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Feb 13 20:11:45.756648 systemd[1]: Reached target time-set.target - System Time Set. Feb 13 20:11:45.759019 systemd-networkd[1441]: eth0: DHCPv4 address 10.244.13.70/30, gateway 10.244.13.69 acquired from 10.244.13.69 Feb 13 20:11:45.762348 systemd-timesyncd[1445]: Network configuration changed, trying to establish connection. Feb 13 20:11:45.771628 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Feb 13 20:11:45.772816 systemd-resolved[1444]: Using system hostname 'srv-cfrgh.gb1.brightbox.com'. Feb 13 20:11:45.776213 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Feb 13 20:11:45.777170 systemd[1]: Reached target network.target - Network. Feb 13 20:11:45.778006 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Feb 13 20:11:45.778912 systemd[1]: Reached target sysinit.target - System Initialization. Feb 13 20:11:45.779743 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Feb 13 20:11:45.780692 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Feb 13 20:11:45.781765 systemd[1]: Started logrotate.timer - Daily rotation of log files. Feb 13 20:11:45.782721 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Feb 13 20:11:45.783570 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Feb 13 20:11:45.784362 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Feb 13 20:11:45.784411 systemd[1]: Reached target paths.target - Path Units. Feb 13 20:11:45.785101 systemd[1]: Reached target timers.target - Timer Units. Feb 13 20:11:45.787105 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Feb 13 20:11:45.789995 systemd[1]: Starting docker.socket - Docker Socket for the API... Feb 13 20:11:45.795277 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Feb 13 20:11:45.796365 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Feb 13 20:11:45.797191 systemd[1]: Reached target ssh-access.target - SSH Access Available. Feb 13 20:11:45.802605 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Feb 13 20:11:45.803882 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Feb 13 20:11:45.805427 systemd[1]: Listening on docker.socket - Docker Socket for the API. Feb 13 20:11:45.806314 systemd[1]: Reached target sockets.target - Socket Units. 
Feb 13 20:11:45.807031 systemd[1]: Reached target basic.target - Basic System. Feb 13 20:11:45.807765 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Feb 13 20:11:45.807820 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Feb 13 20:11:45.813053 systemd[1]: Starting containerd.service - containerd container runtime... Feb 13 20:11:45.816134 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Feb 13 20:11:45.823107 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Feb 13 20:11:45.826806 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Feb 13 20:11:45.832045 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Feb 13 20:11:45.832801 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Feb 13 20:11:45.839141 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Feb 13 20:11:45.847038 jq[1503]: false Feb 13 20:11:45.849212 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Feb 13 20:11:45.853630 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Feb 13 20:11:45.865115 systemd[1]: Starting systemd-logind.service - User Login Management... Feb 13 20:11:45.869163 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Feb 13 20:11:45.870015 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Feb 13 20:11:45.873152 systemd[1]: Starting update-engine.service - Update Engine... Feb 13 20:11:45.882049 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Feb 13 20:11:45.891429 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Feb 13 20:11:45.891811 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Feb 13 20:11:45.910705 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Feb 13 20:11:45.911078 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Feb 13 20:11:45.920604 dbus-daemon[1501]: [system] SELinux support is enabled Feb 13 20:11:45.925203 extend-filesystems[1505]: Found loop4 Feb 13 20:11:45.925203 extend-filesystems[1505]: Found loop5 Feb 13 20:11:45.925203 extend-filesystems[1505]: Found loop6 Feb 13 20:11:45.925203 extend-filesystems[1505]: Found loop7 Feb 13 20:11:45.925203 extend-filesystems[1505]: Found vda Feb 13 20:11:45.925203 extend-filesystems[1505]: Found vda1 Feb 13 20:11:45.925203 extend-filesystems[1505]: Found vda2 Feb 13 20:11:45.925203 extend-filesystems[1505]: Found vda3 Feb 13 20:11:45.925203 extend-filesystems[1505]: Found usr Feb 13 20:11:45.925203 extend-filesystems[1505]: Found vda4 Feb 13 20:11:45.925203 extend-filesystems[1505]: Found vda6 Feb 13 20:11:45.925203 extend-filesystems[1505]: Found vda7 Feb 13 20:11:45.925203 extend-filesystems[1505]: Found vda9 Feb 13 20:11:45.925203 extend-filesystems[1505]: Checking size of /dev/vda9 Feb 13 20:11:45.923191 systemd[1]: Started dbus.service - D-Bus System Message Bus. 
Feb 13 20:11:46.796813 jq[1512]: true Feb 13 20:11:45.945208 dbus-daemon[1501]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.1' (uid=244 pid=1441 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Feb 13 20:11:45.928499 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Feb 13 20:11:46.761735 dbus-daemon[1501]: [system] Successfully activated service 'org.freedesktop.systemd1' Feb 13 20:11:45.928547 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Feb 13 20:11:45.929455 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Feb 13 20:11:45.929489 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Feb 13 20:11:46.761284 systemd-resolved[1444]: Clock change detected. Flushing caches. Feb 13 20:11:46.761459 systemd-timesyncd[1445]: Contacted time server 193.57.144.50:123 (0.flatcar.pool.ntp.org). Feb 13 20:11:46.761632 systemd-timesyncd[1445]: Initial clock synchronization to Thu 2025-02-13 20:11:46.760874 UTC. Feb 13 20:11:46.767047 (ntainerd)[1518]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Feb 13 20:11:46.775731 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Feb 13 20:11:46.811108 extend-filesystems[1505]: Resized partition /dev/vda9 Feb 13 20:11:46.815252 update_engine[1511]: I20250213 20:11:46.800555 1511 main.cc:92] Flatcar Update Engine starting Feb 13 20:11:46.799845 systemd[1]: motdgen.service: Deactivated successfully. Feb 13 20:11:46.817763 extend-filesystems[1536]: resize2fs 1.47.1 (20-May-2024) Feb 13 20:11:46.829909 update_engine[1511]: I20250213 20:11:46.816058 1511 update_check_scheduler.cc:74] Next update check in 6m37s Feb 13 20:11:46.800649 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Feb 13 20:11:46.830043 jq[1527]: true Feb 13 20:11:46.813847 systemd[1]: Started update-engine.service - Update Engine. Feb 13 20:11:46.826714 systemd[1]: Started locksmithd.service - Cluster reboot manager. Feb 13 20:11:46.841402 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks Feb 13 20:11:46.998507 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1336) Feb 13 20:11:47.069646 systemd-logind[1509]: Watching system buttons on /dev/input/event2 (Power Button) Feb 13 20:11:47.069715 systemd-logind[1509]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Feb 13 20:11:47.070237 systemd-logind[1509]: New seat seat0. Feb 13 20:11:47.077223 systemd[1]: Started systemd-logind.service - User Login Management. Feb 13 20:11:47.087932 bash[1557]: Updated "/home/core/.ssh/authorized_keys" Feb 13 20:11:47.089458 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Feb 13 20:11:47.099914 systemd[1]: Starting sshkeys.service... Feb 13 20:11:47.133301 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. 
Feb 13 20:11:47.144928 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Feb 13 20:11:47.202699 kernel: EXT4-fs (vda9): resized filesystem to 15121403 Feb 13 20:11:47.236324 extend-filesystems[1536]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Feb 13 20:11:47.236324 extend-filesystems[1536]: old_desc_blocks = 1, new_desc_blocks = 8 Feb 13 20:11:47.236324 extend-filesystems[1536]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long. Feb 13 20:11:47.235053 systemd[1]: extend-filesystems.service: Deactivated successfully. Feb 13 20:11:47.248005 containerd[1518]: time="2025-02-13T20:11:47.234688145Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23 Feb 13 20:11:47.248331 extend-filesystems[1505]: Resized filesystem in /dev/vda9 Feb 13 20:11:47.235429 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Feb 13 20:11:47.267609 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Feb 13 20:11:47.269731 dbus-daemon[1501]: [system] Successfully activated service 'org.freedesktop.hostname1' Feb 13 20:11:47.272789 dbus-daemon[1501]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.6' (uid=0 pid=1531 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Feb 13 20:11:47.284905 systemd[1]: Starting polkit.service - Authorization Manager... Feb 13 20:11:47.311543 locksmithd[1537]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Feb 13 20:11:47.319620 polkitd[1573]: Started polkitd version 121 Feb 13 20:11:47.327499 containerd[1518]: time="2025-02-13T20:11:47.325854800Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Feb 13 20:11:47.328376 containerd[1518]: time="2025-02-13T20:11:47.328314186Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.74-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Feb 13 20:11:47.328430 containerd[1518]: time="2025-02-13T20:11:47.328374060Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Feb 13 20:11:47.328430 containerd[1518]: time="2025-02-13T20:11:47.328400281Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Feb 13 20:11:47.328741 containerd[1518]: time="2025-02-13T20:11:47.328714876Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Feb 13 20:11:47.328799 containerd[1518]: time="2025-02-13T20:11:47.328756174Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Feb 13 20:11:47.328887 containerd[1518]: time="2025-02-13T20:11:47.328860049Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 20:11:47.328929 containerd[1518]: time="2025-02-13T20:11:47.328889340Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Feb 13 20:11:47.329218 containerd[1518]: time="2025-02-13T20:11:47.329172876Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." 
error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 20:11:47.329305 containerd[1518]: time="2025-02-13T20:11:47.329218837Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Feb 13 20:11:47.329305 containerd[1518]: time="2025-02-13T20:11:47.329241542Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 20:11:47.329305 containerd[1518]: time="2025-02-13T20:11:47.329258604Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Feb 13 20:11:47.329440 containerd[1518]: time="2025-02-13T20:11:47.329414507Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Feb 13 20:11:47.330318 containerd[1518]: time="2025-02-13T20:11:47.330284465Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Feb 13 20:11:47.330498 containerd[1518]: time="2025-02-13T20:11:47.330455909Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 20:11:47.330573 containerd[1518]: time="2025-02-13T20:11:47.330514327Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Feb 13 20:11:47.330748 containerd[1518]: time="2025-02-13T20:11:47.330654028Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Feb 13 20:11:47.330802 containerd[1518]: time="2025-02-13T20:11:47.330744662Z" level=info msg="metadata content store policy set" policy=shared Feb 13 20:11:47.332447 polkitd[1573]: Loading rules from directory /etc/polkit-1/rules.d Feb 13 20:11:47.332573 polkitd[1573]: Loading rules from directory /usr/share/polkit-1/rules.d Feb 13 20:11:47.333279 polkitd[1573]: Finished loading, compiling and executing 2 rules Feb 13 20:11:47.333949 dbus-daemon[1501]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Feb 13 20:11:47.334298 systemd[1]: Started polkit.service - Authorization Manager. Feb 13 20:11:47.335269 polkitd[1573]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Feb 13 20:11:47.337514 containerd[1518]: time="2025-02-13T20:11:47.336697449Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Feb 13 20:11:47.337514 containerd[1518]: time="2025-02-13T20:11:47.336785966Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Feb 13 20:11:47.337514 containerd[1518]: time="2025-02-13T20:11:47.336813737Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Feb 13 20:11:47.337514 containerd[1518]: time="2025-02-13T20:11:47.336837485Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Feb 13 20:11:47.337514 containerd[1518]: time="2025-02-13T20:11:47.336859576Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." 
type=io.containerd.runtime.v1 Feb 13 20:11:47.337514 containerd[1518]: time="2025-02-13T20:11:47.337041910Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Feb 13 20:11:47.337514 containerd[1518]: time="2025-02-13T20:11:47.337314105Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Feb 13 20:11:47.337857 containerd[1518]: time="2025-02-13T20:11:47.337828191Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Feb 13 20:11:47.337956 containerd[1518]: time="2025-02-13T20:11:47.337932875Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Feb 13 20:11:47.338063 containerd[1518]: time="2025-02-13T20:11:47.338039633Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Feb 13 20:11:47.338172 containerd[1518]: time="2025-02-13T20:11:47.338148007Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Feb 13 20:11:47.338297 containerd[1518]: time="2025-02-13T20:11:47.338274242Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Feb 13 20:11:47.338421 containerd[1518]: time="2025-02-13T20:11:47.338395586Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Feb 13 20:11:47.338565 containerd[1518]: time="2025-02-13T20:11:47.338540517Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Feb 13 20:11:47.338683 containerd[1518]: time="2025-02-13T20:11:47.338656853Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Feb 13 20:11:47.338786 containerd[1518]: time="2025-02-13T20:11:47.338762638Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Feb 13 20:11:47.339518 containerd[1518]: time="2025-02-13T20:11:47.338880316Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Feb 13 20:11:47.339518 containerd[1518]: time="2025-02-13T20:11:47.338908365Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Feb 13 20:11:47.339518 containerd[1518]: time="2025-02-13T20:11:47.338951196Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Feb 13 20:11:47.339518 containerd[1518]: time="2025-02-13T20:11:47.338975212Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Feb 13 20:11:47.339518 containerd[1518]: time="2025-02-13T20:11:47.339001058Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Feb 13 20:11:47.339518 containerd[1518]: time="2025-02-13T20:11:47.339023725Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Feb 13 20:11:47.339518 containerd[1518]: time="2025-02-13T20:11:47.339043683Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Feb 13 20:11:47.339518 containerd[1518]: time="2025-02-13T20:11:47.339062875Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." 
type=io.containerd.grpc.v1 Feb 13 20:11:47.339518 containerd[1518]: time="2025-02-13T20:11:47.339081669Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Feb 13 20:11:47.339518 containerd[1518]: time="2025-02-13T20:11:47.339101047Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Feb 13 20:11:47.339518 containerd[1518]: time="2025-02-13T20:11:47.339151881Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Feb 13 20:11:47.339518 containerd[1518]: time="2025-02-13T20:11:47.339190487Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Feb 13 20:11:47.339518 containerd[1518]: time="2025-02-13T20:11:47.339212526Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Feb 13 20:11:47.339518 containerd[1518]: time="2025-02-13T20:11:47.339231628Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Feb 13 20:11:47.340146 containerd[1518]: time="2025-02-13T20:11:47.339252810Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Feb 13 20:11:47.340146 containerd[1518]: time="2025-02-13T20:11:47.339273878Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Feb 13 20:11:47.340146 containerd[1518]: time="2025-02-13T20:11:47.339313210Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Feb 13 20:11:47.340146 containerd[1518]: time="2025-02-13T20:11:47.339358831Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Feb 13 20:11:47.340146 containerd[1518]: time="2025-02-13T20:11:47.339382818Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Feb 13 20:11:47.341510 containerd[1518]: time="2025-02-13T20:11:47.339484639Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Feb 13 20:11:47.341510 containerd[1518]: time="2025-02-13T20:11:47.340617485Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Feb 13 20:11:47.341510 containerd[1518]: time="2025-02-13T20:11:47.340765242Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Feb 13 20:11:47.341510 containerd[1518]: time="2025-02-13T20:11:47.340808922Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Feb 13 20:11:47.341510 containerd[1518]: time="2025-02-13T20:11:47.340830613Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Feb 13 20:11:47.341510 containerd[1518]: time="2025-02-13T20:11:47.340870489Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Feb 13 20:11:47.341510 containerd[1518]: time="2025-02-13T20:11:47.340899255Z" level=info msg="NRI interface is disabled by configuration." Feb 13 20:11:47.341510 containerd[1518]: time="2025-02-13T20:11:47.340921111Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Feb 13 20:11:47.341836 containerd[1518]: time="2025-02-13T20:11:47.341329039Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Feb 13 20:11:47.341836 containerd[1518]: time="2025-02-13T20:11:47.341404427Z" level=info msg="Connect containerd service" Feb 13 20:11:47.344097 containerd[1518]: time="2025-02-13T20:11:47.341470990Z" level=info msg="using legacy CRI server" Feb 13 20:11:47.344097 containerd[1518]: time="2025-02-13T20:11:47.342433018Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Feb 13 20:11:47.344097 containerd[1518]: time="2025-02-13T20:11:47.343611809Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Feb 13 20:11:47.344956 containerd[1518]: time="2025-02-13T20:11:47.344886193Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Feb 13 20:11:47.347228 
containerd[1518]: time="2025-02-13T20:11:47.346306805Z" level=info msg="Start subscribing containerd event" Feb 13 20:11:47.347228 containerd[1518]: time="2025-02-13T20:11:47.346401311Z" level=info msg="Start recovering state" Feb 13 20:11:47.347228 containerd[1518]: time="2025-02-13T20:11:47.346571439Z" level=info msg="Start event monitor" Feb 13 20:11:47.347228 containerd[1518]: time="2025-02-13T20:11:47.346610096Z" level=info msg="Start snapshots syncer" Feb 13 20:11:47.347228 containerd[1518]: time="2025-02-13T20:11:47.346627808Z" level=info msg="Start cni network conf syncer for default" Feb 13 20:11:47.347228 containerd[1518]: time="2025-02-13T20:11:47.346643088Z" level=info msg="Start streaming server" Feb 13 20:11:47.348273 containerd[1518]: time="2025-02-13T20:11:47.348243564Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Feb 13 20:11:47.348459 containerd[1518]: time="2025-02-13T20:11:47.348433881Z" level=info msg=serving... address=/run/containerd/containerd.sock Feb 13 20:11:47.348811 systemd[1]: Started containerd.service - containerd container runtime. Feb 13 20:11:47.350856 containerd[1518]: time="2025-02-13T20:11:47.350827175Z" level=info msg="containerd successfully booted in 0.133431s" Feb 13 20:11:47.365735 systemd-hostnamed[1531]: Hostname set to (static) Feb 13 20:11:47.432202 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Feb 13 20:11:47.503087 sshd_keygen[1532]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Feb 13 20:11:47.531995 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Feb 13 20:11:47.539967 systemd[1]: Starting issuegen.service - Generate /run/issue... Feb 13 20:11:47.543931 systemd[1]: Started sshd@0-10.244.13.70:22-139.178.89.65:34194.service - OpenSSH per-connection server daemon (139.178.89.65:34194). Feb 13 20:11:47.557519 systemd[1]: issuegen.service: Deactivated successfully. Feb 13 20:11:47.558191 systemd[1]: Finished issuegen.service - Generate /run/issue. Feb 13 20:11:47.569183 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Feb 13 20:11:47.598107 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Feb 13 20:11:47.614223 systemd[1]: Started getty@tty1.service - Getty on tty1. Feb 13 20:11:47.617852 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Feb 13 20:11:47.618965 systemd[1]: Reached target getty.target - Login Prompts. Feb 13 20:11:47.957872 systemd-networkd[1441]: eth0: Gained IPv6LL Feb 13 20:11:47.960761 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Feb 13 20:11:47.963293 systemd[1]: Reached target network-online.target - Network is Online. Feb 13 20:11:47.970863 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 20:11:47.973921 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Feb 13 20:11:48.019011 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Feb 13 20:11:48.484124 sshd[1594]: Accepted publickey for core from 139.178.89.65 port 34194 ssh2: RSA SHA256:1d/NPWzJh4p1csN6rw9jx6l57+TZuIaUuHeQZhkXldk Feb 13 20:11:48.487466 sshd-session[1594]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 20:11:48.509891 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Feb 13 20:11:48.519043 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... 
Feb 13 20:11:48.528766 systemd-networkd[1441]: eth0: Ignoring DHCPv6 address 2a02:1348:17d:351:24:19ff:fef4:d46/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17d:351:24:19ff:fef4:d46/64 assigned by NDisc. Feb 13 20:11:48.528781 systemd-networkd[1441]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Feb 13 20:11:48.531587 systemd-logind[1509]: New session 1 of user core. Feb 13 20:11:48.548196 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Feb 13 20:11:48.563284 systemd[1]: Starting user@500.service - User Manager for UID 500... Feb 13 20:11:48.571665 (systemd)[1619]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Feb 13 20:11:48.577734 systemd-logind[1509]: New session c1 of user core. Feb 13 20:11:48.777850 systemd[1619]: Queued start job for default target default.target. Feb 13 20:11:48.785563 systemd[1619]: Created slice app.slice - User Application Slice. Feb 13 20:11:48.785611 systemd[1619]: Reached target paths.target - Paths. Feb 13 20:11:48.785715 systemd[1619]: Reached target timers.target - Timers. Feb 13 20:11:48.789119 systemd[1619]: Starting dbus.socket - D-Bus User Message Bus Socket... Feb 13 20:11:48.815953 systemd[1619]: Listening on dbus.socket - D-Bus User Message Bus Socket. Feb 13 20:11:48.816188 systemd[1619]: Reached target sockets.target - Sockets. Feb 13 20:11:48.816265 systemd[1619]: Reached target basic.target - Basic System. Feb 13 20:11:48.816371 systemd[1619]: Reached target default.target - Main User Target. Feb 13 20:11:48.816436 systemd[1619]: Startup finished in 226ms. Feb 13 20:11:48.816750 systemd[1]: Started user@500.service - User Manager for UID 500. Feb 13 20:11:48.828870 systemd[1]: Started session-1.scope - Session 1 of User core. Feb 13 20:11:48.964284 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 20:11:48.970925 (kubelet)[1633]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 20:11:49.475344 systemd[1]: Started sshd@1-10.244.13.70:22-139.178.89.65:34196.service - OpenSSH per-connection server daemon (139.178.89.65:34196). Feb 13 20:11:49.624890 kubelet[1633]: E0213 20:11:49.624761 1633 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 20:11:49.628165 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 20:11:49.628452 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 20:11:49.629122 systemd[1]: kubelet.service: Consumed 1.017s CPU time, 234.7M memory peak. Feb 13 20:11:50.370196 sshd[1641]: Accepted publickey for core from 139.178.89.65 port 34196 ssh2: RSA SHA256:1d/NPWzJh4p1csN6rw9jx6l57+TZuIaUuHeQZhkXldk Feb 13 20:11:50.372305 sshd-session[1641]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 20:11:50.381528 systemd-logind[1509]: New session 2 of user core. Feb 13 20:11:50.390893 systemd[1]: Started session-2.scope - Session 2 of User core. 
Feb 13 20:11:50.991544 sshd[1646]: Connection closed by 139.178.89.65 port 34196 Feb 13 20:11:50.991271 sshd-session[1641]: pam_unix(sshd:session): session closed for user core Feb 13 20:11:50.995742 systemd[1]: sshd@1-10.244.13.70:22-139.178.89.65:34196.service: Deactivated successfully. Feb 13 20:11:50.999583 systemd[1]: session-2.scope: Deactivated successfully. Feb 13 20:11:51.002534 systemd-logind[1509]: Session 2 logged out. Waiting for processes to exit. Feb 13 20:11:51.004891 systemd-logind[1509]: Removed session 2. Feb 13 20:11:51.153098 systemd[1]: Started sshd@2-10.244.13.70:22-139.178.89.65:34212.service - OpenSSH per-connection server daemon (139.178.89.65:34212). Feb 13 20:11:52.044602 sshd[1652]: Accepted publickey for core from 139.178.89.65 port 34212 ssh2: RSA SHA256:1d/NPWzJh4p1csN6rw9jx6l57+TZuIaUuHeQZhkXldk Feb 13 20:11:52.046841 sshd-session[1652]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 20:11:52.054910 systemd-logind[1509]: New session 3 of user core. Feb 13 20:11:52.068894 systemd[1]: Started session-3.scope - Session 3 of User core. Feb 13 20:11:52.668502 sshd[1654]: Connection closed by 139.178.89.65 port 34212 Feb 13 20:11:52.671766 sshd-session[1652]: pam_unix(sshd:session): session closed for user core Feb 13 20:11:52.690325 systemd[1]: sshd@2-10.244.13.70:22-139.178.89.65:34212.service: Deactivated successfully. Feb 13 20:11:52.702379 systemd[1]: session-3.scope: Deactivated successfully. Feb 13 20:11:52.707408 systemd-logind[1509]: Session 3 logged out. Waiting for processes to exit. Feb 13 20:11:52.714305 login[1600]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Feb 13 20:11:52.715386 systemd-logind[1509]: Removed session 3. Feb 13 20:11:52.717841 login[1602]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Feb 13 20:11:52.725924 systemd-logind[1509]: New session 4 of user core. Feb 13 20:11:52.734798 systemd[1]: Started session-4.scope - Session 4 of User core. Feb 13 20:11:52.739313 systemd-logind[1509]: New session 5 of user core. Feb 13 20:11:52.745370 systemd[1]: Started session-5.scope - Session 5 of User core. 
Feb 13 20:11:53.760961 coreos-metadata[1500]: Feb 13 20:11:53.760 WARN failed to locate config-drive, using the metadata service API instead Feb 13 20:11:53.789566 coreos-metadata[1500]: Feb 13 20:11:53.789 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Feb 13 20:11:53.796317 coreos-metadata[1500]: Feb 13 20:11:53.796 INFO Fetch failed with 404: resource not found Feb 13 20:11:53.796317 coreos-metadata[1500]: Feb 13 20:11:53.796 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Feb 13 20:11:53.797117 coreos-metadata[1500]: Feb 13 20:11:53.797 INFO Fetch successful Feb 13 20:11:53.797376 coreos-metadata[1500]: Feb 13 20:11:53.797 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Feb 13 20:11:53.809897 coreos-metadata[1500]: Feb 13 20:11:53.809 INFO Fetch successful Feb 13 20:11:53.810410 coreos-metadata[1500]: Feb 13 20:11:53.810 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Feb 13 20:11:53.823940 coreos-metadata[1500]: Feb 13 20:11:53.823 INFO Fetch successful Feb 13 20:11:53.824178 coreos-metadata[1500]: Feb 13 20:11:53.824 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Feb 13 20:11:53.838508 coreos-metadata[1500]: Feb 13 20:11:53.838 INFO Fetch successful Feb 13 20:11:53.838729 coreos-metadata[1500]: Feb 13 20:11:53.838 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Feb 13 20:11:53.856913 coreos-metadata[1500]: Feb 13 20:11:53.856 INFO Fetch successful Feb 13 20:11:53.903958 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Feb 13 20:11:53.905304 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Feb 13 20:11:54.308662 coreos-metadata[1561]: Feb 13 20:11:54.308 WARN failed to locate config-drive, using the metadata service API instead Feb 13 20:11:54.332297 coreos-metadata[1561]: Feb 13 20:11:54.332 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Feb 13 20:11:54.354950 coreos-metadata[1561]: Feb 13 20:11:54.354 INFO Fetch successful Feb 13 20:11:54.355243 coreos-metadata[1561]: Feb 13 20:11:54.355 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Feb 13 20:11:54.380984 coreos-metadata[1561]: Feb 13 20:11:54.380 INFO Fetch successful Feb 13 20:11:54.383144 unknown[1561]: wrote ssh authorized keys file for user: core Feb 13 20:11:54.401984 update-ssh-keys[1694]: Updated "/home/core/.ssh/authorized_keys" Feb 13 20:11:54.402799 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Feb 13 20:11:54.405870 systemd[1]: Finished sshkeys.service. Feb 13 20:11:54.409881 systemd[1]: Reached target multi-user.target - Multi-User System. Feb 13 20:11:54.410693 systemd[1]: Startup finished in 1.462s (kernel) + 13.907s (initrd) + 11.982s (userspace) = 27.352s. Feb 13 20:11:59.764111 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Feb 13 20:11:59.773857 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 20:11:59.946862 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Feb 13 20:11:59.952847 (kubelet)[1706]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 20:12:00.015294 kubelet[1706]: E0213 20:12:00.015088 1706 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 20:12:00.018781 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 20:12:00.019040 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 20:12:00.019744 systemd[1]: kubelet.service: Consumed 226ms CPU time, 97.8M memory peak. Feb 13 20:12:02.825849 systemd[1]: Started sshd@3-10.244.13.70:22-139.178.89.65:41194.service - OpenSSH per-connection server daemon (139.178.89.65:41194). Feb 13 20:12:03.718365 sshd[1713]: Accepted publickey for core from 139.178.89.65 port 41194 ssh2: RSA SHA256:1d/NPWzJh4p1csN6rw9jx6l57+TZuIaUuHeQZhkXldk Feb 13 20:12:03.720348 sshd-session[1713]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 20:12:03.727870 systemd-logind[1509]: New session 6 of user core. Feb 13 20:12:03.734777 systemd[1]: Started session-6.scope - Session 6 of User core. Feb 13 20:12:04.331211 sshd[1715]: Connection closed by 139.178.89.65 port 41194 Feb 13 20:12:04.332249 sshd-session[1713]: pam_unix(sshd:session): session closed for user core Feb 13 20:12:04.337219 systemd-logind[1509]: Session 6 logged out. Waiting for processes to exit. Feb 13 20:12:04.338416 systemd[1]: sshd@3-10.244.13.70:22-139.178.89.65:41194.service: Deactivated successfully. Feb 13 20:12:04.341071 systemd[1]: session-6.scope: Deactivated successfully. Feb 13 20:12:04.342915 systemd-logind[1509]: Removed session 6. Feb 13 20:12:04.496860 systemd[1]: Started sshd@4-10.244.13.70:22-139.178.89.65:41198.service - OpenSSH per-connection server daemon (139.178.89.65:41198). Feb 13 20:12:05.386300 sshd[1721]: Accepted publickey for core from 139.178.89.65 port 41198 ssh2: RSA SHA256:1d/NPWzJh4p1csN6rw9jx6l57+TZuIaUuHeQZhkXldk Feb 13 20:12:05.388232 sshd-session[1721]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 20:12:05.395599 systemd-logind[1509]: New session 7 of user core. Feb 13 20:12:05.402710 systemd[1]: Started session-7.scope - Session 7 of User core. Feb 13 20:12:05.999542 sshd[1723]: Connection closed by 139.178.89.65 port 41198 Feb 13 20:12:06.000460 sshd-session[1721]: pam_unix(sshd:session): session closed for user core Feb 13 20:12:06.005342 systemd-logind[1509]: Session 7 logged out. Waiting for processes to exit. Feb 13 20:12:06.006661 systemd[1]: sshd@4-10.244.13.70:22-139.178.89.65:41198.service: Deactivated successfully. Feb 13 20:12:06.009317 systemd[1]: session-7.scope: Deactivated successfully. Feb 13 20:12:06.010855 systemd-logind[1509]: Removed session 7. Feb 13 20:12:06.159922 systemd[1]: Started sshd@5-10.244.13.70:22-139.178.89.65:49054.service - OpenSSH per-connection server daemon (139.178.89.65:49054). 
Feb 13 20:12:07.049117 sshd[1729]: Accepted publickey for core from 139.178.89.65 port 49054 ssh2: RSA SHA256:1d/NPWzJh4p1csN6rw9jx6l57+TZuIaUuHeQZhkXldk Feb 13 20:12:07.051090 sshd-session[1729]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 20:12:07.059795 systemd-logind[1509]: New session 8 of user core. Feb 13 20:12:07.067691 systemd[1]: Started session-8.scope - Session 8 of User core. Feb 13 20:12:07.667511 sshd[1731]: Connection closed by 139.178.89.65 port 49054 Feb 13 20:12:07.666659 sshd-session[1729]: pam_unix(sshd:session): session closed for user core Feb 13 20:12:07.671054 systemd[1]: sshd@5-10.244.13.70:22-139.178.89.65:49054.service: Deactivated successfully. Feb 13 20:12:07.673280 systemd[1]: session-8.scope: Deactivated successfully. Feb 13 20:12:07.674983 systemd-logind[1509]: Session 8 logged out. Waiting for processes to exit. Feb 13 20:12:07.676527 systemd-logind[1509]: Removed session 8. Feb 13 20:12:07.832921 systemd[1]: Started sshd@6-10.244.13.70:22-139.178.89.65:49060.service - OpenSSH per-connection server daemon (139.178.89.65:49060). Feb 13 20:12:08.725777 sshd[1737]: Accepted publickey for core from 139.178.89.65 port 49060 ssh2: RSA SHA256:1d/NPWzJh4p1csN6rw9jx6l57+TZuIaUuHeQZhkXldk Feb 13 20:12:08.728122 sshd-session[1737]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 20:12:08.736873 systemd-logind[1509]: New session 9 of user core. Feb 13 20:12:08.742820 systemd[1]: Started session-9.scope - Session 9 of User core. Feb 13 20:12:09.234547 sudo[1740]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Feb 13 20:12:09.235045 sudo[1740]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 20:12:09.254161 sudo[1740]: pam_unix(sudo:session): session closed for user root Feb 13 20:12:09.398560 sshd[1739]: Connection closed by 139.178.89.65 port 49060 Feb 13 20:12:09.399723 sshd-session[1737]: pam_unix(sshd:session): session closed for user core Feb 13 20:12:09.404290 systemd[1]: sshd@6-10.244.13.70:22-139.178.89.65:49060.service: Deactivated successfully. Feb 13 20:12:09.406819 systemd[1]: session-9.scope: Deactivated successfully. Feb 13 20:12:09.409141 systemd-logind[1509]: Session 9 logged out. Waiting for processes to exit. Feb 13 20:12:09.410902 systemd-logind[1509]: Removed session 9. Feb 13 20:12:09.566995 systemd[1]: Started sshd@7-10.244.13.70:22-139.178.89.65:49062.service - OpenSSH per-connection server daemon (139.178.89.65:49062). Feb 13 20:12:10.263792 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Feb 13 20:12:10.275877 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 20:12:10.417072 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 20:12:10.423463 (kubelet)[1756]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 20:12:10.452872 sshd[1746]: Accepted publickey for core from 139.178.89.65 port 49062 ssh2: RSA SHA256:1d/NPWzJh4p1csN6rw9jx6l57+TZuIaUuHeQZhkXldk Feb 13 20:12:10.455450 sshd-session[1746]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 20:12:10.463949 systemd-logind[1509]: New session 10 of user core. Feb 13 20:12:10.471859 systemd[1]: Started session-10.scope - Session 10 of User core. 
Feb 13 20:12:10.526523 kubelet[1756]: E0213 20:12:10.526124 1756 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 20:12:10.529435 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 20:12:10.529952 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 20:12:10.530986 systemd[1]: kubelet.service: Consumed 194ms CPU time, 98.2M memory peak. Feb 13 20:12:10.934697 sudo[1765]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Feb 13 20:12:10.935279 sudo[1765]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 20:12:10.942556 sudo[1765]: pam_unix(sudo:session): session closed for user root Feb 13 20:12:10.951880 sudo[1764]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Feb 13 20:12:10.952348 sudo[1764]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 20:12:10.979998 systemd[1]: Starting audit-rules.service - Load Audit Rules... Feb 13 20:12:11.026007 augenrules[1787]: No rules Feb 13 20:12:11.027115 systemd[1]: audit-rules.service: Deactivated successfully. Feb 13 20:12:11.027576 systemd[1]: Finished audit-rules.service - Load Audit Rules. Feb 13 20:12:11.029604 sudo[1764]: pam_unix(sudo:session): session closed for user root Feb 13 20:12:11.171931 sshd[1761]: Connection closed by 139.178.89.65 port 49062 Feb 13 20:12:11.173442 sshd-session[1746]: pam_unix(sshd:session): session closed for user core Feb 13 20:12:11.178567 systemd[1]: sshd@7-10.244.13.70:22-139.178.89.65:49062.service: Deactivated successfully. Feb 13 20:12:11.181306 systemd[1]: session-10.scope: Deactivated successfully. Feb 13 20:12:11.182572 systemd-logind[1509]: Session 10 logged out. Waiting for processes to exit. Feb 13 20:12:11.185152 systemd-logind[1509]: Removed session 10. Feb 13 20:12:11.332950 systemd[1]: Started sshd@8-10.244.13.70:22-139.178.89.65:49070.service - OpenSSH per-connection server daemon (139.178.89.65:49070). Feb 13 20:12:12.218846 sshd[1796]: Accepted publickey for core from 139.178.89.65 port 49070 ssh2: RSA SHA256:1d/NPWzJh4p1csN6rw9jx6l57+TZuIaUuHeQZhkXldk Feb 13 20:12:12.220679 sshd-session[1796]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 20:12:12.227944 systemd-logind[1509]: New session 11 of user core. Feb 13 20:12:12.238747 systemd[1]: Started session-11.scope - Session 11 of User core. Feb 13 20:12:12.693509 sudo[1799]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Feb 13 20:12:12.694641 sudo[1799]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 20:12:13.417094 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 20:12:13.417392 systemd[1]: kubelet.service: Consumed 194ms CPU time, 98.2M memory peak. Feb 13 20:12:13.435303 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 20:12:13.484269 systemd[1]: Reload requested from client PID 1831 ('systemctl') (unit session-11.scope)... Feb 13 20:12:13.484319 systemd[1]: Reloading... Feb 13 20:12:13.667883 zram_generator::config[1878]: No configuration found. 
Feb 13 20:12:13.861184 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 20:12:14.017862 systemd[1]: Reloading finished in 532 ms. Feb 13 20:12:14.093232 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 20:12:14.099594 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 20:12:14.102178 systemd[1]: kubelet.service: Deactivated successfully. Feb 13 20:12:14.102554 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 20:12:14.102612 systemd[1]: kubelet.service: Consumed 149ms CPU time, 83.4M memory peak. Feb 13 20:12:14.108897 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 20:12:14.259972 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 20:12:14.269117 (kubelet)[1946]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Feb 13 20:12:14.360970 kubelet[1946]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 20:12:14.360970 kubelet[1946]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 13 20:12:14.360970 kubelet[1946]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 20:12:14.362230 kubelet[1946]: I0213 20:12:14.361027 1946 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 13 20:12:15.430873 kubelet[1946]: I0213 20:12:15.430774 1946 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" Feb 13 20:12:15.430873 kubelet[1946]: I0213 20:12:15.430847 1946 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 13 20:12:15.431750 kubelet[1946]: I0213 20:12:15.431214 1946 server.go:929] "Client rotation is on, will bootstrap in background" Feb 13 20:12:15.456322 kubelet[1946]: I0213 20:12:15.456229 1946 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Feb 13 20:12:15.474254 kubelet[1946]: E0213 20:12:15.474133 1946 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Feb 13 20:12:15.474254 kubelet[1946]: I0213 20:12:15.474200 1946 server.go:1403] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Feb 13 20:12:15.486889 kubelet[1946]: I0213 20:12:15.486831 1946 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Feb 13 20:12:15.488922 kubelet[1946]: I0213 20:12:15.488876 1946 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 13 20:12:15.489360 kubelet[1946]: I0213 20:12:15.489292 1946 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 13 20:12:15.489705 kubelet[1946]: I0213 20:12:15.489342 1946 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"10.244.13.70","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 13 20:12:15.490038 kubelet[1946]: I0213 20:12:15.489737 1946 topology_manager.go:138] "Creating topology manager with none policy" Feb 13 20:12:15.490038 kubelet[1946]: I0213 20:12:15.489771 1946 container_manager_linux.go:300] "Creating device plugin manager" Feb 13 20:12:15.490257 kubelet[1946]: I0213 20:12:15.490223 1946 state_mem.go:36] "Initialized new in-memory state store" Feb 13 20:12:15.492442 kubelet[1946]: I0213 20:12:15.491831 1946 kubelet.go:408] "Attempting to sync node with API server" Feb 13 20:12:15.492442 kubelet[1946]: I0213 20:12:15.491865 1946 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 13 20:12:15.492442 kubelet[1946]: I0213 20:12:15.491932 1946 kubelet.go:314] "Adding apiserver pod source" Feb 13 20:12:15.492442 kubelet[1946]: I0213 20:12:15.491980 1946 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 13 20:12:15.496512 kubelet[1946]: E0213 20:12:15.495323 1946 file.go:98] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:15.496512 kubelet[1946]: E0213 20:12:15.495437 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:15.501201 kubelet[1946]: I0213 20:12:15.501155 1946 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Feb 13 20:12:15.503935 kubelet[1946]: I0213 20:12:15.503893 1946 
kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 13 20:12:15.505174 kubelet[1946]: W0213 20:12:15.505148 1946 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Feb 13 20:12:15.506949 kubelet[1946]: I0213 20:12:15.506912 1946 server.go:1269] "Started kubelet" Feb 13 20:12:15.509335 kubelet[1946]: I0213 20:12:15.508848 1946 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 13 20:12:15.510716 kubelet[1946]: I0213 20:12:15.510690 1946 server.go:460] "Adding debug handlers to kubelet server" Feb 13 20:12:15.513561 kubelet[1946]: I0213 20:12:15.512932 1946 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 13 20:12:15.513561 kubelet[1946]: I0213 20:12:15.513459 1946 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 13 20:12:15.516110 kubelet[1946]: I0213 20:12:15.515185 1946 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 13 20:12:15.516547 kubelet[1946]: W0213 20:12:15.516519 1946 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Feb 13 20:12:15.516767 kubelet[1946]: E0213 20:12:15.516723 1946 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 13 20:12:15.516960 kubelet[1946]: W0213 20:12:15.516910 1946 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "10.244.13.70" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 13 20:12:15.517286 kubelet[1946]: E0213 20:12:15.517260 1946 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"10.244.13.70\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 13 20:12:15.518079 kubelet[1946]: I0213 20:12:15.518056 1946 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Feb 13 20:12:15.518914 kubelet[1946]: I0213 20:12:15.518891 1946 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 13 20:12:15.519078 kubelet[1946]: E0213 20:12:15.519051 1946 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"10.244.13.70\" not found" Feb 13 20:12:15.520777 kubelet[1946]: I0213 20:12:15.520749 1946 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 13 20:12:15.520913 kubelet[1946]: I0213 20:12:15.520892 1946 reconciler.go:26] "Reconciler: start to sync state" Feb 13 20:12:15.524816 kubelet[1946]: I0213 20:12:15.524792 1946 factory.go:221] Registration of the systemd container factory successfully Feb 13 20:12:15.525119 kubelet[1946]: I0213 20:12:15.525087 1946 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Feb 13 
20:12:15.529175 kubelet[1946]: I0213 20:12:15.529144 1946 factory.go:221] Registration of the containerd container factory successfully Feb 13 20:12:15.531449 kubelet[1946]: E0213 20:12:15.531420 1946 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Feb 13 20:12:15.551074 kubelet[1946]: I0213 20:12:15.550601 1946 cpu_manager.go:214] "Starting CPU manager" policy="none" Feb 13 20:12:15.551074 kubelet[1946]: I0213 20:12:15.550630 1946 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Feb 13 20:12:15.551074 kubelet[1946]: I0213 20:12:15.550663 1946 state_mem.go:36] "Initialized new in-memory state store" Feb 13 20:12:15.553651 kubelet[1946]: E0213 20:12:15.553612 1946 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"10.244.13.70\" not found" node="10.244.13.70" Feb 13 20:12:15.554292 kubelet[1946]: I0213 20:12:15.554270 1946 policy_none.go:49] "None policy: Start" Feb 13 20:12:15.555294 kubelet[1946]: I0213 20:12:15.555264 1946 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 13 20:12:15.555442 kubelet[1946]: I0213 20:12:15.555423 1946 state_mem.go:35] "Initializing new in-memory state store" Feb 13 20:12:15.568465 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Feb 13 20:12:15.593446 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Feb 13 20:12:15.619190 kubelet[1946]: E0213 20:12:15.619132 1946 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"10.244.13.70\" not found" Feb 13 20:12:15.622804 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Feb 13 20:12:15.627310 kubelet[1946]: I0213 20:12:15.624983 1946 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 13 20:12:15.627310 kubelet[1946]: I0213 20:12:15.625610 1946 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 13 20:12:15.627310 kubelet[1946]: I0213 20:12:15.625638 1946 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 13 20:12:15.628456 kubelet[1946]: I0213 20:12:15.628312 1946 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 13 20:12:15.631684 kubelet[1946]: E0213 20:12:15.631644 1946 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"10.244.13.70\" not found" Feb 13 20:12:15.650351 kubelet[1946]: I0213 20:12:15.650295 1946 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 13 20:12:15.652155 kubelet[1946]: I0213 20:12:15.652126 1946 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Feb 13 20:12:15.652225 kubelet[1946]: I0213 20:12:15.652178 1946 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 13 20:12:15.652225 kubelet[1946]: I0213 20:12:15.652215 1946 kubelet.go:2321] "Starting kubelet main sync loop" Feb 13 20:12:15.652507 kubelet[1946]: E0213 20:12:15.652423 1946 kubelet.go:2345] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Feb 13 20:12:15.734154 kubelet[1946]: I0213 20:12:15.731357 1946 kubelet_node_status.go:72] "Attempting to register node" node="10.244.13.70" Feb 13 20:12:15.739992 kubelet[1946]: I0213 20:12:15.739770 1946 kubelet_node_status.go:75] "Successfully registered node" node="10.244.13.70" Feb 13 20:12:15.739992 kubelet[1946]: E0213 20:12:15.739819 1946 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"10.244.13.70\": node \"10.244.13.70\" not found" Feb 13 20:12:15.766333 kubelet[1946]: E0213 20:12:15.766215 1946 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"10.244.13.70\" not found" Feb 13 20:12:15.817017 sudo[1799]: pam_unix(sudo:session): session closed for user root Feb 13 20:12:15.866714 kubelet[1946]: E0213 20:12:15.866590 1946 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"10.244.13.70\" not found" Feb 13 20:12:15.961550 sshd[1798]: Connection closed by 139.178.89.65 port 49070 Feb 13 20:12:15.963035 sshd-session[1796]: pam_unix(sshd:session): session closed for user core Feb 13 20:12:15.967340 kubelet[1946]: E0213 20:12:15.967290 1946 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"10.244.13.70\" not found" Feb 13 20:12:15.970539 systemd[1]: sshd@8-10.244.13.70:22-139.178.89.65:49070.service: Deactivated successfully. Feb 13 20:12:15.974034 systemd[1]: session-11.scope: Deactivated successfully. Feb 13 20:12:15.974446 systemd[1]: session-11.scope: Consumed 599ms CPU time, 72.3M memory peak. Feb 13 20:12:15.978251 systemd-logind[1509]: Session 11 logged out. Waiting for processes to exit. Feb 13 20:12:15.980108 systemd-logind[1509]: Removed session 11. 
Feb 13 20:12:16.068634 kubelet[1946]: E0213 20:12:16.068551 1946 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"10.244.13.70\" not found" Feb 13 20:12:16.169423 kubelet[1946]: E0213 20:12:16.169332 1946 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"10.244.13.70\" not found" Feb 13 20:12:16.270417 kubelet[1946]: E0213 20:12:16.270311 1946 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"10.244.13.70\" not found" Feb 13 20:12:16.371451 kubelet[1946]: E0213 20:12:16.371195 1946 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"10.244.13.70\" not found" Feb 13 20:12:16.434821 kubelet[1946]: I0213 20:12:16.434747 1946 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 13 20:12:16.435673 kubelet[1946]: W0213 20:12:16.435000 1946 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 13 20:12:16.435673 kubelet[1946]: W0213 20:12:16.435050 1946 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 13 20:12:16.435673 kubelet[1946]: W0213 20:12:16.435088 1946 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Node ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 13 20:12:16.473666 kubelet[1946]: I0213 20:12:16.473479 1946 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.1.0/24" Feb 13 20:12:16.474576 kubelet[1946]: I0213 20:12:16.474501 1946 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.1.0/24" Feb 13 20:12:16.474687 containerd[1518]: time="2025-02-13T20:12:16.474067242Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Feb 13 20:12:16.497042 kubelet[1946]: I0213 20:12:16.496391 1946 apiserver.go:52] "Watching apiserver" Feb 13 20:12:16.497042 kubelet[1946]: E0213 20:12:16.496881 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:16.513624 kubelet[1946]: E0213 20:12:16.512734 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fjz82" podUID="d5b9c20a-bc22-40d8-8d45-69955889fccc" Feb 13 20:12:16.526995 systemd[1]: Created slice kubepods-besteffort-podaaf8cd5e_ea06_4c63_a6f8_ea6a821d8183.slice - libcontainer container kubepods-besteffort-podaaf8cd5e_ea06_4c63_a6f8_ea6a821d8183.slice. 
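
The "Updating Pod CIDR" entries above show the node being handed 192.168.1.0/24 for pod addressing, which containerd will only act on once a CNI config is dropped in. A small illustrative sketch (not part of any component in this log) of what that range provides:

package main

import (
	"fmt"
	"net"
)

func main() {
	// CIDR copied from the "Updating Pod CIDR" entry above.
	ip, ipnet, err := net.ParseCIDR("192.168.1.0/24")
	if err != nil {
		panic(err)
	}
	ones, bits := ipnet.Mask.Size()
	// A /24 leaves 2^(32-24) = 256 addresses for pods on this node.
	fmt.Printf("network %s, base %s, %d addresses\n", ipnet, ip, 1<<(bits-ones))
}
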
Feb 13 20:12:16.527258 kubelet[1946]: I0213 20:12:16.526480 1946 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 13 20:12:16.529644 kubelet[1946]: I0213 20:12:16.527899 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/aaf8cd5e-ea06-4c63-a6f8-ea6a821d8183-xtables-lock\") pod \"kube-proxy-s672z\" (UID: \"aaf8cd5e-ea06-4c63-a6f8-ea6a821d8183\") " pod="kube-system/kube-proxy-s672z" Feb 13 20:12:16.529644 kubelet[1946]: I0213 20:12:16.527950 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wtgj\" (UniqueName: \"kubernetes.io/projected/aaf8cd5e-ea06-4c63-a6f8-ea6a821d8183-kube-api-access-9wtgj\") pod \"kube-proxy-s672z\" (UID: \"aaf8cd5e-ea06-4c63-a6f8-ea6a821d8183\") " pod="kube-system/kube-proxy-s672z" Feb 13 20:12:16.529644 kubelet[1946]: I0213 20:12:16.527986 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c00cac7-d73c-4027-84a1-c306a671c9cf-tigera-ca-bundle\") pod \"calico-node-79pxv\" (UID: \"2c00cac7-d73c-4027-84a1-c306a671c9cf\") " pod="calico-system/calico-node-79pxv" Feb 13 20:12:16.529644 kubelet[1946]: I0213 20:12:16.528024 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/2c00cac7-d73c-4027-84a1-c306a671c9cf-var-lib-calico\") pod \"calico-node-79pxv\" (UID: \"2c00cac7-d73c-4027-84a1-c306a671c9cf\") " pod="calico-system/calico-node-79pxv" Feb 13 20:12:16.529644 kubelet[1946]: I0213 20:12:16.528053 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/2c00cac7-d73c-4027-84a1-c306a671c9cf-cni-log-dir\") pod \"calico-node-79pxv\" (UID: \"2c00cac7-d73c-4027-84a1-c306a671c9cf\") " pod="calico-system/calico-node-79pxv" Feb 13 20:12:16.529968 kubelet[1946]: I0213 20:12:16.528078 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d5b9c20a-bc22-40d8-8d45-69955889fccc-kubelet-dir\") pod \"csi-node-driver-fjz82\" (UID: \"d5b9c20a-bc22-40d8-8d45-69955889fccc\") " pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:16.529968 kubelet[1946]: I0213 20:12:16.528103 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/aaf8cd5e-ea06-4c63-a6f8-ea6a821d8183-kube-proxy\") pod \"kube-proxy-s672z\" (UID: \"aaf8cd5e-ea06-4c63-a6f8-ea6a821d8183\") " pod="kube-system/kube-proxy-s672z" Feb 13 20:12:16.529968 kubelet[1946]: I0213 20:12:16.528127 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2c00cac7-d73c-4027-84a1-c306a671c9cf-lib-modules\") pod \"calico-node-79pxv\" (UID: \"2c00cac7-d73c-4027-84a1-c306a671c9cf\") " pod="calico-system/calico-node-79pxv" Feb 13 20:12:16.529968 kubelet[1946]: I0213 20:12:16.528156 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/d5b9c20a-bc22-40d8-8d45-69955889fccc-varrun\") pod \"csi-node-driver-fjz82\" (UID: 
\"d5b9c20a-bc22-40d8-8d45-69955889fccc\") " pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:16.529968 kubelet[1946]: I0213 20:12:16.528184 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdvkh\" (UniqueName: \"kubernetes.io/projected/d5b9c20a-bc22-40d8-8d45-69955889fccc-kube-api-access-pdvkh\") pod \"csi-node-driver-fjz82\" (UID: \"d5b9c20a-bc22-40d8-8d45-69955889fccc\") " pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:16.530212 kubelet[1946]: I0213 20:12:16.528220 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aaf8cd5e-ea06-4c63-a6f8-ea6a821d8183-lib-modules\") pod \"kube-proxy-s672z\" (UID: \"aaf8cd5e-ea06-4c63-a6f8-ea6a821d8183\") " pod="kube-system/kube-proxy-s672z" Feb 13 20:12:16.530212 kubelet[1946]: I0213 20:12:16.528245 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/2c00cac7-d73c-4027-84a1-c306a671c9cf-policysync\") pod \"calico-node-79pxv\" (UID: \"2c00cac7-d73c-4027-84a1-c306a671c9cf\") " pod="calico-system/calico-node-79pxv" Feb 13 20:12:16.530212 kubelet[1946]: I0213 20:12:16.528306 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/2c00cac7-d73c-4027-84a1-c306a671c9cf-var-run-calico\") pod \"calico-node-79pxv\" (UID: \"2c00cac7-d73c-4027-84a1-c306a671c9cf\") " pod="calico-system/calico-node-79pxv" Feb 13 20:12:16.530212 kubelet[1946]: I0213 20:12:16.528333 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/2c00cac7-d73c-4027-84a1-c306a671c9cf-cni-bin-dir\") pod \"calico-node-79pxv\" (UID: \"2c00cac7-d73c-4027-84a1-c306a671c9cf\") " pod="calico-system/calico-node-79pxv" Feb 13 20:12:16.530212 kubelet[1946]: I0213 20:12:16.528368 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9zvs\" (UniqueName: \"kubernetes.io/projected/2c00cac7-d73c-4027-84a1-c306a671c9cf-kube-api-access-b9zvs\") pod \"calico-node-79pxv\" (UID: \"2c00cac7-d73c-4027-84a1-c306a671c9cf\") " pod="calico-system/calico-node-79pxv" Feb 13 20:12:16.530442 kubelet[1946]: I0213 20:12:16.528407 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d5b9c20a-bc22-40d8-8d45-69955889fccc-registration-dir\") pod \"csi-node-driver-fjz82\" (UID: \"d5b9c20a-bc22-40d8-8d45-69955889fccc\") " pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:16.530442 kubelet[1946]: I0213 20:12:16.528462 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/2c00cac7-d73c-4027-84a1-c306a671c9cf-xtables-lock\") pod \"calico-node-79pxv\" (UID: \"2c00cac7-d73c-4027-84a1-c306a671c9cf\") " pod="calico-system/calico-node-79pxv" Feb 13 20:12:16.530442 kubelet[1946]: I0213 20:12:16.528521 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/2c00cac7-d73c-4027-84a1-c306a671c9cf-node-certs\") pod \"calico-node-79pxv\" (UID: \"2c00cac7-d73c-4027-84a1-c306a671c9cf\") " 
pod="calico-system/calico-node-79pxv" Feb 13 20:12:16.530442 kubelet[1946]: I0213 20:12:16.528551 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/2c00cac7-d73c-4027-84a1-c306a671c9cf-cni-net-dir\") pod \"calico-node-79pxv\" (UID: \"2c00cac7-d73c-4027-84a1-c306a671c9cf\") " pod="calico-system/calico-node-79pxv" Feb 13 20:12:16.530442 kubelet[1946]: I0213 20:12:16.528576 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/2c00cac7-d73c-4027-84a1-c306a671c9cf-flexvol-driver-host\") pod \"calico-node-79pxv\" (UID: \"2c00cac7-d73c-4027-84a1-c306a671c9cf\") " pod="calico-system/calico-node-79pxv" Feb 13 20:12:16.531246 kubelet[1946]: I0213 20:12:16.528603 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d5b9c20a-bc22-40d8-8d45-69955889fccc-socket-dir\") pod \"csi-node-driver-fjz82\" (UID: \"d5b9c20a-bc22-40d8-8d45-69955889fccc\") " pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:16.554073 systemd[1]: Created slice kubepods-besteffort-pod2c00cac7_d73c_4027_84a1_c306a671c9cf.slice - libcontainer container kubepods-besteffort-pod2c00cac7_d73c_4027_84a1_c306a671c9cf.slice. Feb 13 20:12:16.631842 kubelet[1946]: E0213 20:12:16.631546 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.631842 kubelet[1946]: W0213 20:12:16.631584 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.631842 kubelet[1946]: E0213 20:12:16.631620 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.632803 kubelet[1946]: E0213 20:12:16.632281 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.632803 kubelet[1946]: W0213 20:12:16.632300 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.632803 kubelet[1946]: E0213 20:12:16.632316 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.633246 kubelet[1946]: E0213 20:12:16.633074 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.633246 kubelet[1946]: W0213 20:12:16.633093 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.633246 kubelet[1946]: E0213 20:12:16.633109 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 20:12:16.633629 kubelet[1946]: E0213 20:12:16.633608 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.634517 kubelet[1946]: W0213 20:12:16.633729 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.634517 kubelet[1946]: E0213 20:12:16.633753 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.635124 kubelet[1946]: E0213 20:12:16.635095 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.635124 kubelet[1946]: W0213 20:12:16.635121 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.635472 kubelet[1946]: E0213 20:12:16.635446 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.635588 kubelet[1946]: W0213 20:12:16.635477 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.635588 kubelet[1946]: E0213 20:12:16.635546 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.635864 kubelet[1946]: E0213 20:12:16.635840 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.635935 kubelet[1946]: W0213 20:12:16.635860 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.635935 kubelet[1946]: E0213 20:12:16.635889 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.636186 kubelet[1946]: E0213 20:12:16.636162 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.636257 kubelet[1946]: W0213 20:12:16.636184 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.636257 kubelet[1946]: E0213 20:12:16.636211 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 20:12:16.636555 kubelet[1946]: E0213 20:12:16.636531 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.636555 kubelet[1946]: W0213 20:12:16.636551 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.636683 kubelet[1946]: E0213 20:12:16.636569 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.637069 kubelet[1946]: E0213 20:12:16.635154 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.637069 kubelet[1946]: E0213 20:12:16.637026 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.637069 kubelet[1946]: W0213 20:12:16.637040 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.637237 kubelet[1946]: E0213 20:12:16.637064 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.637694 kubelet[1946]: E0213 20:12:16.637501 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.637694 kubelet[1946]: W0213 20:12:16.637522 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.637694 kubelet[1946]: E0213 20:12:16.637582 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.638505 kubelet[1946]: E0213 20:12:16.637997 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.638505 kubelet[1946]: W0213 20:12:16.638016 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.638505 kubelet[1946]: E0213 20:12:16.638079 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 20:12:16.638505 kubelet[1946]: E0213 20:12:16.638398 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.638505 kubelet[1946]: W0213 20:12:16.638411 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.638761 kubelet[1946]: E0213 20:12:16.638516 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.638829 kubelet[1946]: E0213 20:12:16.638793 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.638829 kubelet[1946]: W0213 20:12:16.638806 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.638930 kubelet[1946]: E0213 20:12:16.638916 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.639203 kubelet[1946]: E0213 20:12:16.639178 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.639203 kubelet[1946]: W0213 20:12:16.639196 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.639558 kubelet[1946]: E0213 20:12:16.639535 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.639558 kubelet[1946]: W0213 20:12:16.639555 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.644512 kubelet[1946]: E0213 20:12:16.639862 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.644512 kubelet[1946]: W0213 20:12:16.639881 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.644512 kubelet[1946]: E0213 20:12:16.640178 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.644512 kubelet[1946]: E0213 20:12:16.640205 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.644512 kubelet[1946]: E0213 20:12:16.640229 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 20:12:16.644512 kubelet[1946]: E0213 20:12:16.640283 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.644512 kubelet[1946]: W0213 20:12:16.640295 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.644512 kubelet[1946]: E0213 20:12:16.640319 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.644512 kubelet[1946]: E0213 20:12:16.640604 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.644512 kubelet[1946]: W0213 20:12:16.640617 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.645024 kubelet[1946]: E0213 20:12:16.640663 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.645024 kubelet[1946]: E0213 20:12:16.640937 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.645024 kubelet[1946]: W0213 20:12:16.640961 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.645024 kubelet[1946]: E0213 20:12:16.640979 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.645024 kubelet[1946]: E0213 20:12:16.642977 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.645024 kubelet[1946]: W0213 20:12:16.642991 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.645024 kubelet[1946]: E0213 20:12:16.643006 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.645024 kubelet[1946]: E0213 20:12:16.643269 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.645024 kubelet[1946]: W0213 20:12:16.643283 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.645024 kubelet[1946]: E0213 20:12:16.643298 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 20:12:16.645408 kubelet[1946]: E0213 20:12:16.643542 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.645408 kubelet[1946]: W0213 20:12:16.643564 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.645408 kubelet[1946]: E0213 20:12:16.643579 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.645408 kubelet[1946]: E0213 20:12:16.643857 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.645408 kubelet[1946]: W0213 20:12:16.643872 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.645408 kubelet[1946]: E0213 20:12:16.643887 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.645408 kubelet[1946]: E0213 20:12:16.644116 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.645408 kubelet[1946]: W0213 20:12:16.644129 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.645408 kubelet[1946]: E0213 20:12:16.644165 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.645408 kubelet[1946]: E0213 20:12:16.644566 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.645911 kubelet[1946]: W0213 20:12:16.644581 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.645911 kubelet[1946]: E0213 20:12:16.644596 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.651584 kubelet[1946]: E0213 20:12:16.651552 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.651584 kubelet[1946]: W0213 20:12:16.651577 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.651768 kubelet[1946]: E0213 20:12:16.651596 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 20:12:16.661522 kubelet[1946]: E0213 20:12:16.659612 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.661522 kubelet[1946]: W0213 20:12:16.659678 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.661522 kubelet[1946]: E0213 20:12:16.659707 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.661821 kubelet[1946]: E0213 20:12:16.661798 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.661939 kubelet[1946]: W0213 20:12:16.661917 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.662064 kubelet[1946]: E0213 20:12:16.662041 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.681405 kubelet[1946]: E0213 20:12:16.681336 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:16.685252 kubelet[1946]: W0213 20:12:16.683229 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:16.685252 kubelet[1946]: E0213 20:12:16.683827 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:16.853943 containerd[1518]: time="2025-02-13T20:12:16.853836577Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-s672z,Uid:aaf8cd5e-ea06-4c63-a6f8-ea6a821d8183,Namespace:kube-system,Attempt:0,}" Feb 13 20:12:16.858893 containerd[1518]: time="2025-02-13T20:12:16.858435482Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-79pxv,Uid:2c00cac7-d73c-4027-84a1-c306a671c9cf,Namespace:calico-system,Attempt:0,}" Feb 13 20:12:17.497368 kubelet[1946]: E0213 20:12:17.497238 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:17.513341 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3675740771.mount: Deactivated successfully. 
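
The long run of FlexVolume driver-call failures above has a single cause: the nodeagent~uds driver executable is absent from the kubelet plugin directory, so each probe's init call produces no output, and decoding an empty payload is exactly what yields "unexpected end of JSON input". A minimal sketch of that decoding step, using a stand-in struct rather than the kubelet's own types:

package main

import (
	"encoding/json"
	"fmt"
)

// driverStatus stands in for the JSON a FlexVolume driver is expected to
// print; the real field set does not matter for this failure mode.
type driverStatus struct {
	Status  string `json:"status"`
	Message string `json:"message"`
}

func main() {
	var st driverStatus
	// The driver executable was never found, so its output is empty.
	err := json.Unmarshal([]byte(""), &st)
	fmt.Println(err) // prints: unexpected end of JSON input
}
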
Feb 13 20:12:17.523512 containerd[1518]: time="2025-02-13T20:12:17.522980512Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 20:12:17.524904 containerd[1518]: time="2025-02-13T20:12:17.524870617Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 20:12:17.526434 containerd[1518]: time="2025-02-13T20:12:17.526367359Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Feb 13 20:12:17.526434 containerd[1518]: time="2025-02-13T20:12:17.526406441Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312064" Feb 13 20:12:17.526767 containerd[1518]: time="2025-02-13T20:12:17.526571569Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 20:12:17.535116 containerd[1518]: time="2025-02-13T20:12:17.535060829Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 20:12:17.537395 containerd[1518]: time="2025-02-13T20:12:17.536394311Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 682.207456ms" Feb 13 20:12:17.539812 containerd[1518]: time="2025-02-13T20:12:17.539627289Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 681.037695ms" Feb 13 20:12:17.686543 containerd[1518]: time="2025-02-13T20:12:17.686112636Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 20:12:17.688494 containerd[1518]: time="2025-02-13T20:12:17.686462117Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 20:12:17.688494 containerd[1518]: time="2025-02-13T20:12:17.687544841Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 20:12:17.688494 containerd[1518]: time="2025-02-13T20:12:17.687733631Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 20:12:17.688873 containerd[1518]: time="2025-02-13T20:12:17.688792635Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 20:12:17.689032 containerd[1518]: time="2025-02-13T20:12:17.688851483Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 20:12:17.689271 containerd[1518]: time="2025-02-13T20:12:17.689217247Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 20:12:17.689734 containerd[1518]: time="2025-02-13T20:12:17.689576318Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 20:12:17.834044 systemd[1]: Started cri-containerd-3b56ae0ffa7f873d20ac30a20165334c34a3784eadb03fdd131d7ca489851384.scope - libcontainer container 3b56ae0ffa7f873d20ac30a20165334c34a3784eadb03fdd131d7ca489851384. Feb 13 20:12:17.836547 systemd[1]: Started cri-containerd-f1f4f3b635ed84bd0c3644f3a22bbb3047f42a32283cfdbb44025bbdcebb76e8.scope - libcontainer container f1f4f3b635ed84bd0c3644f3a22bbb3047f42a32283cfdbb44025bbdcebb76e8. Feb 13 20:12:17.889704 containerd[1518]: time="2025-02-13T20:12:17.889532157Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-s672z,Uid:aaf8cd5e-ea06-4c63-a6f8-ea6a821d8183,Namespace:kube-system,Attempt:0,} returns sandbox id \"3b56ae0ffa7f873d20ac30a20165334c34a3784eadb03fdd131d7ca489851384\"" Feb 13 20:12:17.891820 containerd[1518]: time="2025-02-13T20:12:17.891770798Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-79pxv,Uid:2c00cac7-d73c-4027-84a1-c306a671c9cf,Namespace:calico-system,Attempt:0,} returns sandbox id \"f1f4f3b635ed84bd0c3644f3a22bbb3047f42a32283cfdbb44025bbdcebb76e8\"" Feb 13 20:12:17.895238 containerd[1518]: time="2025-02-13T20:12:17.895196650Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.6\"" Feb 13 20:12:18.497705 kubelet[1946]: E0213 20:12:18.497556 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:18.591819 systemd[1]: systemd-hostnamed.service: Deactivated successfully. Feb 13 20:12:18.654201 kubelet[1946]: E0213 20:12:18.653368 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fjz82" podUID="d5b9c20a-bc22-40d8-8d45-69955889fccc" Feb 13 20:12:19.422906 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3606330539.mount: Deactivated successfully. Feb 13 20:12:19.498941 kubelet[1946]: E0213 20:12:19.498893 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:19.905074 systemd[1]: Started sshd@9-10.244.13.70:22-137.184.188.240:38280.service - OpenSSH per-connection server daemon (137.184.188.240:38280). 
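
The csi-node-driver pod above is skipped because no CNI network configuration exists yet; the calico-node pod whose sandbox was just created is what is expected to drop one. As a rough illustration, assuming the conventional /etc/cni/net.d location (the log itself never prints the path), a check like this would still find nothing at this point:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Conventional CNI config directory; this path is an assumption, the
	// log only says other components will drop the config later.
	matches, err := filepath.Glob("/etc/cni/net.d/*.conflist")
	if err != nil {
		panic(err)
	}
	if len(matches) == 0 {
		fmt.Println("no CNI config yet: pods needing pod networking stay pending")
		os.Exit(0)
	}
	for _, m := range matches {
		fmt.Println("found CNI config:", m)
	}
}
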
Feb 13 20:12:20.221049 containerd[1518]: time="2025-02-13T20:12:20.220854474Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:12:20.222525 containerd[1518]: time="2025-02-13T20:12:20.222318165Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.6: active requests=0, bytes read=30229116" Feb 13 20:12:20.223548 containerd[1518]: time="2025-02-13T20:12:20.223433136Z" level=info msg="ImageCreate event name:\"sha256:d2448f015605e48efb6b06ceaba0cb6d48bfd82e5d30ba357a9bd78c8566348a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:12:20.232214 containerd[1518]: time="2025-02-13T20:12:20.232143440Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:e72a4bc769f10b56ffdfe2cdb21d84d49d9bc194b3658648207998a5bd924b72\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:12:20.233836 containerd[1518]: time="2025-02-13T20:12:20.233373924Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.6\" with image id \"sha256:d2448f015605e48efb6b06ceaba0cb6d48bfd82e5d30ba357a9bd78c8566348a\", repo tag \"registry.k8s.io/kube-proxy:v1.31.6\", repo digest \"registry.k8s.io/kube-proxy@sha256:e72a4bc769f10b56ffdfe2cdb21d84d49d9bc194b3658648207998a5bd924b72\", size \"30228127\" in 2.337567579s" Feb 13 20:12:20.233836 containerd[1518]: time="2025-02-13T20:12:20.233437138Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.6\" returns image reference \"sha256:d2448f015605e48efb6b06ceaba0cb6d48bfd82e5d30ba357a9bd78c8566348a\"" Feb 13 20:12:20.236728 containerd[1518]: time="2025-02-13T20:12:20.236682292Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\"" Feb 13 20:12:20.238416 containerd[1518]: time="2025-02-13T20:12:20.238374260Z" level=info msg="CreateContainer within sandbox \"3b56ae0ffa7f873d20ac30a20165334c34a3784eadb03fdd131d7ca489851384\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Feb 13 20:12:20.294250 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2100231506.mount: Deactivated successfully. Feb 13 20:12:20.324011 containerd[1518]: time="2025-02-13T20:12:20.323945840Z" level=info msg="CreateContainer within sandbox \"3b56ae0ffa7f873d20ac30a20165334c34a3784eadb03fdd131d7ca489851384\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"383192a43967e70ee8148d6d88eaaa9e601e30ad5b57ec3e6a34f52a15ce9e62\"" Feb 13 20:12:20.325358 containerd[1518]: time="2025-02-13T20:12:20.325309027Z" level=info msg="StartContainer for \"383192a43967e70ee8148d6d88eaaa9e601e30ad5b57ec3e6a34f52a15ce9e62\"" Feb 13 20:12:20.382801 systemd[1]: Started cri-containerd-383192a43967e70ee8148d6d88eaaa9e601e30ad5b57ec3e6a34f52a15ce9e62.scope - libcontainer container 383192a43967e70ee8148d6d88eaaa9e601e30ad5b57ec3e6a34f52a15ce9e62. 
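
The kube-proxy pull recorded above (30228127 bytes in 2.337567579s) works out to roughly 13 MB/s. A tiny sketch of that arithmetic, using the exact figures from the log:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Figures copied from the PullImage entry above.
	const sizeBytes = 30228127
	d, err := time.ParseDuration("2.337567579s")
	if err != nil {
		panic(err)
	}
	mbPerSec := float64(sizeBytes) / 1e6 / d.Seconds()
	fmt.Printf("kube-proxy v1.31.6: %.1f MB/s effective pull rate\n", mbPerSec)
}
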
Feb 13 20:12:20.438652 containerd[1518]: time="2025-02-13T20:12:20.438558362Z" level=info msg="StartContainer for \"383192a43967e70ee8148d6d88eaaa9e601e30ad5b57ec3e6a34f52a15ce9e62\" returns successfully" Feb 13 20:12:20.499394 kubelet[1946]: E0213 20:12:20.499225 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:20.653934 kubelet[1946]: E0213 20:12:20.653091 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fjz82" podUID="d5b9c20a-bc22-40d8-8d45-69955889fccc" Feb 13 20:12:20.734525 sshd[2119]: Invalid user ospite from 137.184.188.240 port 38280 Feb 13 20:12:20.738506 kubelet[1946]: I0213 20:12:20.737438 1946 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-s672z" podStartSLOduration=3.396421651 podStartE2EDuration="5.737377103s" podCreationTimestamp="2025-02-13 20:12:15 +0000 UTC" firstStartedPulling="2025-02-13 20:12:17.894475037 +0000 UTC m=+3.618228895" lastFinishedPulling="2025-02-13 20:12:20.235430488 +0000 UTC m=+5.959184347" observedRunningTime="2025-02-13 20:12:20.736068546 +0000 UTC m=+6.459822412" watchObservedRunningTime="2025-02-13 20:12:20.737377103 +0000 UTC m=+6.461130975" Feb 13 20:12:20.760609 kubelet[1946]: E0213 20:12:20.760213 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.760609 kubelet[1946]: W0213 20:12:20.760276 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.760609 kubelet[1946]: E0213 20:12:20.760339 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.761368 kubelet[1946]: E0213 20:12:20.761083 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.761368 kubelet[1946]: W0213 20:12:20.761118 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.761368 kubelet[1946]: E0213 20:12:20.761135 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.762038 kubelet[1946]: E0213 20:12:20.761806 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.762038 kubelet[1946]: W0213 20:12:20.761827 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.762038 kubelet[1946]: E0213 20:12:20.761843 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 20:12:20.762206 kubelet[1946]: E0213 20:12:20.762176 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.762206 kubelet[1946]: W0213 20:12:20.762192 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.762289 kubelet[1946]: E0213 20:12:20.762227 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.762737 kubelet[1946]: E0213 20:12:20.762685 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.762737 kubelet[1946]: W0213 20:12:20.762706 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.762947 kubelet[1946]: E0213 20:12:20.762918 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.763326 kubelet[1946]: E0213 20:12:20.763305 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.763326 kubelet[1946]: W0213 20:12:20.763325 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.763804 kubelet[1946]: E0213 20:12:20.763343 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.764379 kubelet[1946]: E0213 20:12:20.764356 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.764624 kubelet[1946]: W0213 20:12:20.764514 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.764624 kubelet[1946]: E0213 20:12:20.764544 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.765554 kubelet[1946]: E0213 20:12:20.765265 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.765554 kubelet[1946]: W0213 20:12:20.765284 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.765554 kubelet[1946]: E0213 20:12:20.765301 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 20:12:20.766036 kubelet[1946]: E0213 20:12:20.765857 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.766036 kubelet[1946]: W0213 20:12:20.765877 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.766036 kubelet[1946]: E0213 20:12:20.765893 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.766576 kubelet[1946]: E0213 20:12:20.766362 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.766576 kubelet[1946]: W0213 20:12:20.766384 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.766576 kubelet[1946]: E0213 20:12:20.766400 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.767396 kubelet[1946]: E0213 20:12:20.766936 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.767396 kubelet[1946]: W0213 20:12:20.766957 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.767396 kubelet[1946]: E0213 20:12:20.766973 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.767854 kubelet[1946]: E0213 20:12:20.767623 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.767854 kubelet[1946]: W0213 20:12:20.767637 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.767854 kubelet[1946]: E0213 20:12:20.767696 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.768466 kubelet[1946]: E0213 20:12:20.768313 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.768466 kubelet[1946]: W0213 20:12:20.768332 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.768466 kubelet[1946]: E0213 20:12:20.768347 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 20:12:20.769336 kubelet[1946]: E0213 20:12:20.769185 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.769336 kubelet[1946]: W0213 20:12:20.769205 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.769336 kubelet[1946]: E0213 20:12:20.769220 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.769689 kubelet[1946]: E0213 20:12:20.769605 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.769974 kubelet[1946]: W0213 20:12:20.769788 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.770202 kubelet[1946]: E0213 20:12:20.770082 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.770554 kubelet[1946]: E0213 20:12:20.770521 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.770785 kubelet[1946]: W0213 20:12:20.770662 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.770785 kubelet[1946]: E0213 20:12:20.770755 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.771691 kubelet[1946]: E0213 20:12:20.771417 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.771691 kubelet[1946]: W0213 20:12:20.771436 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.771691 kubelet[1946]: E0213 20:12:20.771453 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.772107 kubelet[1946]: E0213 20:12:20.771923 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.772107 kubelet[1946]: W0213 20:12:20.771942 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.772107 kubelet[1946]: E0213 20:12:20.771958 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 20:12:20.772734 kubelet[1946]: E0213 20:12:20.772449 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.772734 kubelet[1946]: W0213 20:12:20.772467 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.772734 kubelet[1946]: E0213 20:12:20.772505 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.773036 kubelet[1946]: E0213 20:12:20.772953 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.773036 kubelet[1946]: W0213 20:12:20.772974 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.773036 kubelet[1946]: E0213 20:12:20.772990 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.873675 kubelet[1946]: E0213 20:12:20.873379 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.873675 kubelet[1946]: W0213 20:12:20.873413 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.873675 kubelet[1946]: E0213 20:12:20.873444 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.874404 kubelet[1946]: E0213 20:12:20.874204 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.874404 kubelet[1946]: W0213 20:12:20.874223 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.874404 kubelet[1946]: E0213 20:12:20.874248 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.874731 kubelet[1946]: E0213 20:12:20.874711 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.874996 kubelet[1946]: W0213 20:12:20.874823 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.874996 kubelet[1946]: E0213 20:12:20.874857 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 20:12:20.875291 kubelet[1946]: E0213 20:12:20.875270 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.875542 kubelet[1946]: W0213 20:12:20.875390 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.875542 kubelet[1946]: E0213 20:12:20.875414 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.875992 kubelet[1946]: E0213 20:12:20.875859 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.875992 kubelet[1946]: W0213 20:12:20.875881 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.875992 kubelet[1946]: E0213 20:12:20.875921 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.876388 kubelet[1946]: E0213 20:12:20.876308 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.876388 kubelet[1946]: W0213 20:12:20.876327 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.876388 kubelet[1946]: E0213 20:12:20.876351 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.878307 kubelet[1946]: E0213 20:12:20.877828 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.878307 kubelet[1946]: W0213 20:12:20.877849 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.878307 kubelet[1946]: E0213 20:12:20.877873 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.878778 kubelet[1946]: E0213 20:12:20.878743 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.878899 kubelet[1946]: W0213 20:12:20.878878 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.879064 kubelet[1946]: E0213 20:12:20.879031 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 20:12:20.879379 kubelet[1946]: E0213 20:12:20.879359 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.879551 kubelet[1946]: W0213 20:12:20.879450 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.879551 kubelet[1946]: E0213 20:12:20.879473 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.880069 kubelet[1946]: E0213 20:12:20.879902 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.880069 kubelet[1946]: W0213 20:12:20.879920 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.880069 kubelet[1946]: E0213 20:12:20.879935 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.880889 kubelet[1946]: E0213 20:12:20.880470 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.880889 kubelet[1946]: W0213 20:12:20.880505 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.880889 kubelet[1946]: E0213 20:12:20.880525 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.881305 kubelet[1946]: E0213 20:12:20.881285 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:20.881468 kubelet[1946]: W0213 20:12:20.881396 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:20.881468 kubelet[1946]: E0213 20:12:20.881420 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:20.885782 sshd[2119]: Received disconnect from 137.184.188.240 port 38280:11: Bye Bye [preauth] Feb 13 20:12:20.885782 sshd[2119]: Disconnected from invalid user ospite 137.184.188.240 port 38280 [preauth] Feb 13 20:12:20.888048 systemd[1]: sshd@9-10.244.13.70:22-137.184.188.240:38280.service: Deactivated successfully. 
Feb 13 20:12:21.500011 kubelet[1946]: E0213 20:12:21.499920 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:21.779215 kubelet[1946]: E0213 20:12:21.779031 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.779215 kubelet[1946]: W0213 20:12:21.779078 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.779215 kubelet[1946]: E0213 20:12:21.779118 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.780744 kubelet[1946]: E0213 20:12:21.780236 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.780744 kubelet[1946]: W0213 20:12:21.780254 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.780744 kubelet[1946]: E0213 20:12:21.780308 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.781061 kubelet[1946]: E0213 20:12:21.780791 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.781061 kubelet[1946]: W0213 20:12:21.780805 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.781061 kubelet[1946]: E0213 20:12:21.780821 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.782683 kubelet[1946]: E0213 20:12:21.781777 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.782683 kubelet[1946]: W0213 20:12:21.781796 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.782683 kubelet[1946]: E0213 20:12:21.781812 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.783106 kubelet[1946]: E0213 20:12:21.782935 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.783106 kubelet[1946]: W0213 20:12:21.782955 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.783106 kubelet[1946]: E0213 20:12:21.782972 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 20:12:21.783571 kubelet[1946]: E0213 20:12:21.783379 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.783571 kubelet[1946]: W0213 20:12:21.783407 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.783571 kubelet[1946]: E0213 20:12:21.783423 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.784068 kubelet[1946]: E0213 20:12:21.783907 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.784068 kubelet[1946]: W0213 20:12:21.783936 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.784068 kubelet[1946]: E0213 20:12:21.783952 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.784772 kubelet[1946]: E0213 20:12:21.784473 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.784772 kubelet[1946]: W0213 20:12:21.784629 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.784772 kubelet[1946]: E0213 20:12:21.784648 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.785857 kubelet[1946]: E0213 20:12:21.785632 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.785857 kubelet[1946]: W0213 20:12:21.785650 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.785857 kubelet[1946]: E0213 20:12:21.785677 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.786209 kubelet[1946]: E0213 20:12:21.786103 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.786209 kubelet[1946]: W0213 20:12:21.786138 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.786209 kubelet[1946]: E0213 20:12:21.786168 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 20:12:21.788128 kubelet[1946]: E0213 20:12:21.787976 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.788128 kubelet[1946]: W0213 20:12:21.787996 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.788128 kubelet[1946]: E0213 20:12:21.788012 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.789318 kubelet[1946]: E0213 20:12:21.788791 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.789318 kubelet[1946]: W0213 20:12:21.788810 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.789318 kubelet[1946]: E0213 20:12:21.789131 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.789904 kubelet[1946]: E0213 20:12:21.789627 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.789904 kubelet[1946]: W0213 20:12:21.789645 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.789904 kubelet[1946]: E0213 20:12:21.789674 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.790708 kubelet[1946]: E0213 20:12:21.790521 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.790708 kubelet[1946]: W0213 20:12:21.790548 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.790708 kubelet[1946]: E0213 20:12:21.790565 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.791271 kubelet[1946]: E0213 20:12:21.791034 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.791271 kubelet[1946]: W0213 20:12:21.791053 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.791271 kubelet[1946]: E0213 20:12:21.791069 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 20:12:21.792155 kubelet[1946]: E0213 20:12:21.791897 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.792155 kubelet[1946]: W0213 20:12:21.791916 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.792155 kubelet[1946]: E0213 20:12:21.791931 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.792749 kubelet[1946]: E0213 20:12:21.792462 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.792749 kubelet[1946]: W0213 20:12:21.792480 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.792749 kubelet[1946]: E0213 20:12:21.792517 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.793614 kubelet[1946]: E0213 20:12:21.793391 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.793614 kubelet[1946]: W0213 20:12:21.793409 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.793614 kubelet[1946]: E0213 20:12:21.793427 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.794502 kubelet[1946]: E0213 20:12:21.794315 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.794502 kubelet[1946]: W0213 20:12:21.794334 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.794502 kubelet[1946]: E0213 20:12:21.794350 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.795382 kubelet[1946]: E0213 20:12:21.794823 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.795382 kubelet[1946]: W0213 20:12:21.794842 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.795382 kubelet[1946]: E0213 20:12:21.794858 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 20:12:21.795796 kubelet[1946]: E0213 20:12:21.795777 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.796211 kubelet[1946]: W0213 20:12:21.796136 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.796211 kubelet[1946]: E0213 20:12:21.796173 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.797252 kubelet[1946]: E0213 20:12:21.796819 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.797252 kubelet[1946]: W0213 20:12:21.796838 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.797252 kubelet[1946]: E0213 20:12:21.796871 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.797622 kubelet[1946]: E0213 20:12:21.797542 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.797622 kubelet[1946]: W0213 20:12:21.797594 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.798194 kubelet[1946]: E0213 20:12:21.797812 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.798387 kubelet[1946]: E0213 20:12:21.798368 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.798553 kubelet[1946]: W0213 20:12:21.798531 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.799176 kubelet[1946]: E0213 20:12:21.799024 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.799362 kubelet[1946]: E0213 20:12:21.799343 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.799472 kubelet[1946]: W0213 20:12:21.799443 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.799803 kubelet[1946]: E0213 20:12:21.799597 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 20:12:21.800395 kubelet[1946]: E0213 20:12:21.800210 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.800395 kubelet[1946]: W0213 20:12:21.800228 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.800395 kubelet[1946]: E0213 20:12:21.800261 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.801559 kubelet[1946]: E0213 20:12:21.801270 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.801559 kubelet[1946]: W0213 20:12:21.801288 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.801559 kubelet[1946]: E0213 20:12:21.801310 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.802628 kubelet[1946]: E0213 20:12:21.802360 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.802628 kubelet[1946]: W0213 20:12:21.802380 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.802628 kubelet[1946]: E0213 20:12:21.802474 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.803162 kubelet[1946]: E0213 20:12:21.802794 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.803162 kubelet[1946]: W0213 20:12:21.802808 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.803162 kubelet[1946]: E0213 20:12:21.802853 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.803624 kubelet[1946]: E0213 20:12:21.803433 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.803624 kubelet[1946]: W0213 20:12:21.803450 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.803624 kubelet[1946]: E0213 20:12:21.803466 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 20:12:21.804134 kubelet[1946]: E0213 20:12:21.803866 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.804134 kubelet[1946]: W0213 20:12:21.803886 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.804134 kubelet[1946]: E0213 20:12:21.803902 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.805312 kubelet[1946]: E0213 20:12:21.805291 1946 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 20:12:21.805475 kubelet[1946]: W0213 20:12:21.805416 1946 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 20:12:21.805475 kubelet[1946]: E0213 20:12:21.805442 1946 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 20:12:21.831511 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3816361526.mount: Deactivated successfully. Feb 13 20:12:21.971629 containerd[1518]: time="2025-02-13T20:12:21.971545007Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:12:21.973477 containerd[1518]: time="2025-02-13T20:12:21.973092023Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=6855343" Feb 13 20:12:21.976113 containerd[1518]: time="2025-02-13T20:12:21.974576125Z" level=info msg="ImageCreate event name:\"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:12:21.978292 containerd[1518]: time="2025-02-13T20:12:21.976998970Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:12:21.978292 containerd[1518]: time="2025-02-13T20:12:21.978132658Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6855165\" in 1.741397846s" Feb 13 20:12:21.978292 containerd[1518]: time="2025-02-13T20:12:21.978183240Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\"" Feb 13 20:12:21.981019 containerd[1518]: time="2025-02-13T20:12:21.980986243Z" level=info msg="CreateContainer within sandbox \"f1f4f3b635ed84bd0c3644f3a22bbb3047f42a32283cfdbb44025bbdcebb76e8\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Feb 13 20:12:22.002042 containerd[1518]: 
time="2025-02-13T20:12:22.001987336Z" level=info msg="CreateContainer within sandbox \"f1f4f3b635ed84bd0c3644f3a22bbb3047f42a32283cfdbb44025bbdcebb76e8\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"3b30bc5094809417ac21a202acf0e1e51d009993ec8a0cda19734776fbac9e75\"" Feb 13 20:12:22.003090 containerd[1518]: time="2025-02-13T20:12:22.003057783Z" level=info msg="StartContainer for \"3b30bc5094809417ac21a202acf0e1e51d009993ec8a0cda19734776fbac9e75\"" Feb 13 20:12:22.046848 systemd[1]: Started cri-containerd-3b30bc5094809417ac21a202acf0e1e51d009993ec8a0cda19734776fbac9e75.scope - libcontainer container 3b30bc5094809417ac21a202acf0e1e51d009993ec8a0cda19734776fbac9e75. Feb 13 20:12:22.098331 containerd[1518]: time="2025-02-13T20:12:22.098176853Z" level=info msg="StartContainer for \"3b30bc5094809417ac21a202acf0e1e51d009993ec8a0cda19734776fbac9e75\" returns successfully" Feb 13 20:12:22.117672 systemd[1]: cri-containerd-3b30bc5094809417ac21a202acf0e1e51d009993ec8a0cda19734776fbac9e75.scope: Deactivated successfully. Feb 13 20:12:22.434896 containerd[1518]: time="2025-02-13T20:12:22.434635197Z" level=info msg="shim disconnected" id=3b30bc5094809417ac21a202acf0e1e51d009993ec8a0cda19734776fbac9e75 namespace=k8s.io Feb 13 20:12:22.434896 containerd[1518]: time="2025-02-13T20:12:22.434770082Z" level=warning msg="cleaning up after shim disconnected" id=3b30bc5094809417ac21a202acf0e1e51d009993ec8a0cda19734776fbac9e75 namespace=k8s.io Feb 13 20:12:22.434896 containerd[1518]: time="2025-02-13T20:12:22.434793522Z" level=info msg="cleaning up dead shim" namespace=k8s.io Feb 13 20:12:22.501197 kubelet[1946]: E0213 20:12:22.501079 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:22.653209 kubelet[1946]: E0213 20:12:22.653067 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fjz82" podUID="d5b9c20a-bc22-40d8-8d45-69955889fccc" Feb 13 20:12:22.729715 containerd[1518]: time="2025-02-13T20:12:22.728894655Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Feb 13 20:12:22.761191 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3b30bc5094809417ac21a202acf0e1e51d009993ec8a0cda19734776fbac9e75-rootfs.mount: Deactivated successfully. 
Feb 13 20:12:23.502113 kubelet[1946]: E0213 20:12:23.502029 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:24.502281 kubelet[1946]: E0213 20:12:24.502222 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:24.652892 kubelet[1946]: E0213 20:12:24.652753 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fjz82" podUID="d5b9c20a-bc22-40d8-8d45-69955889fccc" Feb 13 20:12:25.505602 kubelet[1946]: E0213 20:12:25.503942 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:26.504448 kubelet[1946]: E0213 20:12:26.504349 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:26.654247 kubelet[1946]: E0213 20:12:26.653802 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fjz82" podUID="d5b9c20a-bc22-40d8-8d45-69955889fccc" Feb 13 20:12:27.505286 kubelet[1946]: E0213 20:12:27.505111 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:28.506409 kubelet[1946]: E0213 20:12:28.506294 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:28.511918 containerd[1518]: time="2025-02-13T20:12:28.510761862Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:12:28.513785 containerd[1518]: time="2025-02-13T20:12:28.513199725Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=96154154" Feb 13 20:12:28.514686 containerd[1518]: time="2025-02-13T20:12:28.514628886Z" level=info msg="ImageCreate event name:\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:12:28.517326 containerd[1518]: time="2025-02-13T20:12:28.517235998Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:12:28.518555 containerd[1518]: time="2025-02-13T20:12:28.518517942Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"97647238\" in 5.789511213s" Feb 13 20:12:28.518646 containerd[1518]: time="2025-02-13T20:12:28.518559645Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\"" Feb 13 20:12:28.523394 containerd[1518]: time="2025-02-13T20:12:28.523347080Z" 
level=info msg="CreateContainer within sandbox \"f1f4f3b635ed84bd0c3644f3a22bbb3047f42a32283cfdbb44025bbdcebb76e8\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Feb 13 20:12:28.547861 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3850580443.mount: Deactivated successfully. Feb 13 20:12:28.549677 containerd[1518]: time="2025-02-13T20:12:28.548053737Z" level=info msg="CreateContainer within sandbox \"f1f4f3b635ed84bd0c3644f3a22bbb3047f42a32283cfdbb44025bbdcebb76e8\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"0f6a4aaa5aa52312524378193e0e067311af61c8bfc283be09b43bc2258329f9\"" Feb 13 20:12:28.550210 containerd[1518]: time="2025-02-13T20:12:28.550160582Z" level=info msg="StartContainer for \"0f6a4aaa5aa52312524378193e0e067311af61c8bfc283be09b43bc2258329f9\"" Feb 13 20:12:28.597344 systemd[1]: run-containerd-runc-k8s.io-0f6a4aaa5aa52312524378193e0e067311af61c8bfc283be09b43bc2258329f9-runc.jldrnz.mount: Deactivated successfully. Feb 13 20:12:28.610893 systemd[1]: Started cri-containerd-0f6a4aaa5aa52312524378193e0e067311af61c8bfc283be09b43bc2258329f9.scope - libcontainer container 0f6a4aaa5aa52312524378193e0e067311af61c8bfc283be09b43bc2258329f9. Feb 13 20:12:28.653292 kubelet[1946]: E0213 20:12:28.653216 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fjz82" podUID="d5b9c20a-bc22-40d8-8d45-69955889fccc" Feb 13 20:12:28.663511 containerd[1518]: time="2025-02-13T20:12:28.663434189Z" level=info msg="StartContainer for \"0f6a4aaa5aa52312524378193e0e067311af61c8bfc283be09b43bc2258329f9\" returns successfully" Feb 13 20:12:29.449887 containerd[1518]: time="2025-02-13T20:12:29.449434300Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Feb 13 20:12:29.450852 systemd[1]: cri-containerd-0f6a4aaa5aa52312524378193e0e067311af61c8bfc283be09b43bc2258329f9.scope: Deactivated successfully. Feb 13 20:12:29.451394 systemd[1]: cri-containerd-0f6a4aaa5aa52312524378193e0e067311af61c8bfc283be09b43bc2258329f9.scope: Consumed 622ms CPU time, 169.8M memory peak, 151M written to disk. Feb 13 20:12:29.509168 kubelet[1946]: E0213 20:12:29.509053 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:29.524012 kubelet[1946]: I0213 20:12:29.523981 1946 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Feb 13 20:12:29.537794 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0f6a4aaa5aa52312524378193e0e067311af61c8bfc283be09b43bc2258329f9-rootfs.mount: Deactivated successfully. 
Feb 13 20:12:29.784530 containerd[1518]: time="2025-02-13T20:12:29.784263564Z" level=info msg="shim disconnected" id=0f6a4aaa5aa52312524378193e0e067311af61c8bfc283be09b43bc2258329f9 namespace=k8s.io Feb 13 20:12:29.784530 containerd[1518]: time="2025-02-13T20:12:29.784412599Z" level=warning msg="cleaning up after shim disconnected" id=0f6a4aaa5aa52312524378193e0e067311af61c8bfc283be09b43bc2258329f9 namespace=k8s.io Feb 13 20:12:29.784530 containerd[1518]: time="2025-02-13T20:12:29.784444540Z" level=info msg="cleaning up dead shim" namespace=k8s.io Feb 13 20:12:30.510351 kubelet[1946]: E0213 20:12:30.510268 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:30.661801 systemd[1]: Created slice kubepods-besteffort-podd5b9c20a_bc22_40d8_8d45_69955889fccc.slice - libcontainer container kubepods-besteffort-podd5b9c20a_bc22_40d8_8d45_69955889fccc.slice. Feb 13 20:12:30.666174 containerd[1518]: time="2025-02-13T20:12:30.666123837Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:0,}" Feb 13 20:12:30.760534 containerd[1518]: time="2025-02-13T20:12:30.760141171Z" level=error msg="Failed to destroy network for sandbox \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:30.765235 containerd[1518]: time="2025-02-13T20:12:30.764789705Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Feb 13 20:12:30.765701 containerd[1518]: time="2025-02-13T20:12:30.765653242Z" level=error msg="encountered an error cleaning up failed sandbox \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:30.766039 containerd[1518]: time="2025-02-13T20:12:30.765907418Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:30.765953 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a-shm.mount: Deactivated successfully. 
Feb 13 20:12:30.766548 kubelet[1946]: E0213 20:12:30.766424 1946 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:30.766663 kubelet[1946]: E0213 20:12:30.766585 1946 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:30.766663 kubelet[1946]: E0213 20:12:30.766640 1946 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:30.766836 kubelet[1946]: E0213 20:12:30.766711 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fjz82" podUID="d5b9c20a-bc22-40d8-8d45-69955889fccc" Feb 13 20:12:31.511010 kubelet[1946]: E0213 20:12:31.510879 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:31.767241 kubelet[1946]: I0213 20:12:31.767206 1946 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a" Feb 13 20:12:31.769219 containerd[1518]: time="2025-02-13T20:12:31.769163370Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\"" Feb 13 20:12:31.769741 containerd[1518]: time="2025-02-13T20:12:31.769545907Z" level=info msg="Ensure that sandbox 87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a in task-service has been cleanup successfully" Feb 13 20:12:31.770810 containerd[1518]: time="2025-02-13T20:12:31.770639218Z" level=info msg="TearDown network for sandbox \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" successfully" Feb 13 20:12:31.770810 containerd[1518]: time="2025-02-13T20:12:31.770679555Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" returns successfully" Feb 13 20:12:31.772804 containerd[1518]: time="2025-02-13T20:12:31.772289023Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:1,}" Feb 13 20:12:31.773931 systemd[1]: run-netns-cni\x2dd281a17d\x2dba49\x2d557a\x2d1de2\x2d948f43016c97.mount: Deactivated successfully. Feb 13 20:12:31.872321 containerd[1518]: time="2025-02-13T20:12:31.872221439Z" level=error msg="Failed to destroy network for sandbox \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:31.874999 containerd[1518]: time="2025-02-13T20:12:31.874946110Z" level=error msg="encountered an error cleaning up failed sandbox \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:31.875109 containerd[1518]: time="2025-02-13T20:12:31.875042739Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:31.875393 kubelet[1946]: E0213 20:12:31.875344 1946 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:31.875504 kubelet[1946]: E0213 20:12:31.875434 1946 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:31.875504 kubelet[1946]: E0213 20:12:31.875473 1946 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:31.875633 kubelet[1946]: E0213 20:12:31.875559 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fjz82" podUID="d5b9c20a-bc22-40d8-8d45-69955889fccc" Feb 13 20:12:31.876617 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e-shm.mount: Deactivated successfully. Feb 13 20:12:32.061911 update_engine[1511]: I20250213 20:12:32.061620 1511 update_attempter.cc:509] Updating boot flags... Feb 13 20:12:32.140809 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2561) Feb 13 20:12:32.314557 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2560) Feb 13 20:12:32.511195 kubelet[1946]: E0213 20:12:32.511123 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:32.771513 kubelet[1946]: I0213 20:12:32.771443 1946 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e" Feb 13 20:12:32.773119 containerd[1518]: time="2025-02-13T20:12:32.773000675Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\"" Feb 13 20:12:32.775762 containerd[1518]: time="2025-02-13T20:12:32.773327445Z" level=info msg="Ensure that sandbox 29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e in task-service has been cleanup successfully" Feb 13 20:12:32.778799 containerd[1518]: time="2025-02-13T20:12:32.778615523Z" level=info msg="TearDown network for sandbox \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" successfully" Feb 13 20:12:32.778799 containerd[1518]: time="2025-02-13T20:12:32.778660764Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" returns successfully" Feb 13 20:12:32.778955 systemd[1]: run-netns-cni\x2d3ca6bccb\x2dcf0a\x2d8cac\x2d0ab3\x2d5ff7db9e7e50.mount: Deactivated successfully. 
Feb 13 20:12:32.780461 containerd[1518]: time="2025-02-13T20:12:32.779771062Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\"" Feb 13 20:12:32.780461 containerd[1518]: time="2025-02-13T20:12:32.779911357Z" level=info msg="TearDown network for sandbox \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" successfully" Feb 13 20:12:32.780461 containerd[1518]: time="2025-02-13T20:12:32.779931059Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" returns successfully" Feb 13 20:12:32.781539 containerd[1518]: time="2025-02-13T20:12:32.781208636Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:2,}" Feb 13 20:12:32.897390 containerd[1518]: time="2025-02-13T20:12:32.897296838Z" level=error msg="Failed to destroy network for sandbox \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:32.901380 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c-shm.mount: Deactivated successfully. Feb 13 20:12:32.904219 kubelet[1946]: E0213 20:12:32.902535 1946 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:32.904219 kubelet[1946]: E0213 20:12:32.902610 1946 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:32.904219 kubelet[1946]: E0213 20:12:32.902649 1946 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:32.904450 containerd[1518]: time="2025-02-13T20:12:32.901880049Z" level=error msg="encountered an error cleaning up failed sandbox \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:32.904450 containerd[1518]: time="2025-02-13T20:12:32.902018035Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup 
network for sandbox \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:32.904590 kubelet[1946]: E0213 20:12:32.902707 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fjz82" podUID="d5b9c20a-bc22-40d8-8d45-69955889fccc" Feb 13 20:12:33.512017 kubelet[1946]: E0213 20:12:33.511856 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:33.784754 kubelet[1946]: I0213 20:12:33.784685 1946 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c" Feb 13 20:12:33.790563 containerd[1518]: time="2025-02-13T20:12:33.785705396Z" level=info msg="StopPodSandbox for \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\"" Feb 13 20:12:33.790563 containerd[1518]: time="2025-02-13T20:12:33.786361108Z" level=info msg="Ensure that sandbox bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c in task-service has been cleanup successfully" Feb 13 20:12:33.790563 containerd[1518]: time="2025-02-13T20:12:33.789622552Z" level=info msg="TearDown network for sandbox \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" successfully" Feb 13 20:12:33.790563 containerd[1518]: time="2025-02-13T20:12:33.789651780Z" level=info msg="StopPodSandbox for \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" returns successfully" Feb 13 20:12:33.790563 containerd[1518]: time="2025-02-13T20:12:33.789999838Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\"" Feb 13 20:12:33.790563 containerd[1518]: time="2025-02-13T20:12:33.790100851Z" level=info msg="TearDown network for sandbox \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" successfully" Feb 13 20:12:33.790563 containerd[1518]: time="2025-02-13T20:12:33.790121712Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" returns successfully" Feb 13 20:12:33.790563 containerd[1518]: time="2025-02-13T20:12:33.790388760Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\"" Feb 13 20:12:33.791467 containerd[1518]: time="2025-02-13T20:12:33.790595327Z" level=info msg="TearDown network for sandbox \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" successfully" Feb 13 20:12:33.791467 containerd[1518]: time="2025-02-13T20:12:33.790615488Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" returns successfully" Feb 13 20:12:33.791467 containerd[1518]: time="2025-02-13T20:12:33.791147624Z" 
level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:3,}" Feb 13 20:12:33.793114 systemd[1]: run-netns-cni\x2dd0c19769\x2dafd4\x2d5d0d\x2d9a83\x2d69e08aa15b7c.mount: Deactivated successfully. Feb 13 20:12:33.826890 systemd[1]: Created slice kubepods-besteffort-pod3ec74cf6_af99_4f93_a98f_a1049d5e3987.slice - libcontainer container kubepods-besteffort-pod3ec74cf6_af99_4f93_a98f_a1049d5e3987.slice. Feb 13 20:12:33.892439 kubelet[1946]: I0213 20:12:33.892219 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkmmx\" (UniqueName: \"kubernetes.io/projected/3ec74cf6-af99-4f93-a98f-a1049d5e3987-kube-api-access-dkmmx\") pod \"nginx-deployment-8587fbcb89-m6z8q\" (UID: \"3ec74cf6-af99-4f93-a98f-a1049d5e3987\") " pod="default/nginx-deployment-8587fbcb89-m6z8q" Feb 13 20:12:33.948514 containerd[1518]: time="2025-02-13T20:12:33.945746077Z" level=error msg="Failed to destroy network for sandbox \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:33.948725 containerd[1518]: time="2025-02-13T20:12:33.946450799Z" level=error msg="encountered an error cleaning up failed sandbox \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:33.948725 containerd[1518]: time="2025-02-13T20:12:33.948634002Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:33.950128 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39-shm.mount: Deactivated successfully. 
Feb 13 20:12:33.950690 kubelet[1946]: E0213 20:12:33.949427 1946 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:33.950868 kubelet[1946]: E0213 20:12:33.950715 1946 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:33.950868 kubelet[1946]: E0213 20:12:33.950787 1946 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:33.952613 kubelet[1946]: E0213 20:12:33.950946 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fjz82" podUID="d5b9c20a-bc22-40d8-8d45-69955889fccc" Feb 13 20:12:34.140615 containerd[1518]: time="2025-02-13T20:12:34.140442155Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-m6z8q,Uid:3ec74cf6-af99-4f93-a98f-a1049d5e3987,Namespace:default,Attempt:0,}" Feb 13 20:12:34.255640 containerd[1518]: time="2025-02-13T20:12:34.255563164Z" level=error msg="Failed to destroy network for sandbox \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:34.256454 containerd[1518]: time="2025-02-13T20:12:34.256311263Z" level=error msg="encountered an error cleaning up failed sandbox \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:34.256454 containerd[1518]: time="2025-02-13T20:12:34.256389781Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-m6z8q,Uid:3ec74cf6-af99-4f93-a98f-a1049d5e3987,Namespace:default,Attempt:0,} failed, error" 
error="failed to setup network for sandbox \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:34.257465 kubelet[1946]: E0213 20:12:34.256922 1946 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:34.257465 kubelet[1946]: E0213 20:12:34.257035 1946 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-m6z8q" Feb 13 20:12:34.257465 kubelet[1946]: E0213 20:12:34.257070 1946 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-m6z8q" Feb 13 20:12:34.257703 kubelet[1946]: E0213 20:12:34.257157 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-8587fbcb89-m6z8q_default(3ec74cf6-af99-4f93-a98f-a1049d5e3987)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-8587fbcb89-m6z8q_default(3ec74cf6-af99-4f93-a98f-a1049d5e3987)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-8587fbcb89-m6z8q" podUID="3ec74cf6-af99-4f93-a98f-a1049d5e3987" Feb 13 20:12:34.513501 kubelet[1946]: E0213 20:12:34.513427 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:34.796868 kubelet[1946]: I0213 20:12:34.795302 1946 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39" Feb 13 20:12:34.798016 containerd[1518]: time="2025-02-13T20:12:34.797971901Z" level=info msg="StopPodSandbox for \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\"" Feb 13 20:12:34.801127 containerd[1518]: time="2025-02-13T20:12:34.798317137Z" level=info msg="Ensure that sandbox dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39 in task-service has been cleanup successfully" Feb 13 20:12:34.801127 containerd[1518]: time="2025-02-13T20:12:34.798583456Z" level=info msg="TearDown network for sandbox \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\" successfully" Feb 
13 20:12:34.801127 containerd[1518]: time="2025-02-13T20:12:34.798605686Z" level=info msg="StopPodSandbox for \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\" returns successfully" Feb 13 20:12:34.801146 systemd[1]: run-netns-cni\x2d72bea2ae\x2d5941\x2df70b\x2d2b0d\x2dca537d4ddff9.mount: Deactivated successfully. Feb 13 20:12:34.812065 kubelet[1946]: I0213 20:12:34.809701 1946 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857" Feb 13 20:12:34.812519 containerd[1518]: time="2025-02-13T20:12:34.812460715Z" level=info msg="StopPodSandbox for \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\"" Feb 13 20:12:34.812953 containerd[1518]: time="2025-02-13T20:12:34.812922357Z" level=info msg="Ensure that sandbox 03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857 in task-service has been cleanup successfully" Feb 13 20:12:34.816734 containerd[1518]: time="2025-02-13T20:12:34.816689305Z" level=info msg="TearDown network for sandbox \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\" successfully" Feb 13 20:12:34.816875 containerd[1518]: time="2025-02-13T20:12:34.816850083Z" level=info msg="StopPodSandbox for \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\" returns successfully" Feb 13 20:12:34.817108 containerd[1518]: time="2025-02-13T20:12:34.817078839Z" level=info msg="StopPodSandbox for \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\"" Feb 13 20:12:34.817334 containerd[1518]: time="2025-02-13T20:12:34.817299279Z" level=info msg="TearDown network for sandbox \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" successfully" Feb 13 20:12:34.817794 systemd[1]: run-netns-cni\x2d262f1035\x2db651\x2dfc18\x2d6c8b\x2db8ec50163229.mount: Deactivated successfully. 
Feb 13 20:12:34.818019 containerd[1518]: time="2025-02-13T20:12:34.817991612Z" level=info msg="StopPodSandbox for \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" returns successfully" Feb 13 20:12:34.820591 containerd[1518]: time="2025-02-13T20:12:34.820552011Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-m6z8q,Uid:3ec74cf6-af99-4f93-a98f-a1049d5e3987,Namespace:default,Attempt:1,}" Feb 13 20:12:34.828260 containerd[1518]: time="2025-02-13T20:12:34.828199414Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\"" Feb 13 20:12:34.828468 containerd[1518]: time="2025-02-13T20:12:34.828397434Z" level=info msg="TearDown network for sandbox \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" successfully" Feb 13 20:12:34.828468 containerd[1518]: time="2025-02-13T20:12:34.828419628Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" returns successfully" Feb 13 20:12:34.831948 containerd[1518]: time="2025-02-13T20:12:34.831881558Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\"" Feb 13 20:12:34.832088 containerd[1518]: time="2025-02-13T20:12:34.832009535Z" level=info msg="TearDown network for sandbox \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" successfully" Feb 13 20:12:34.832088 containerd[1518]: time="2025-02-13T20:12:34.832028588Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" returns successfully" Feb 13 20:12:34.834673 containerd[1518]: time="2025-02-13T20:12:34.834526969Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:4,}" Feb 13 20:12:34.997326 containerd[1518]: time="2025-02-13T20:12:34.997246554Z" level=error msg="Failed to destroy network for sandbox \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:34.998322 containerd[1518]: time="2025-02-13T20:12:34.998273448Z" level=error msg="encountered an error cleaning up failed sandbox \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:34.998595 containerd[1518]: time="2025-02-13T20:12:34.998411578Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-m6z8q,Uid:3ec74cf6-af99-4f93-a98f-a1049d5e3987,Namespace:default,Attempt:1,} failed, error" error="failed to setup network for sandbox \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:34.999149 kubelet[1946]: E0213 20:12:34.999058 1946 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:34.999401 kubelet[1946]: E0213 20:12:34.999166 1946 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-m6z8q" Feb 13 20:12:34.999401 kubelet[1946]: E0213 20:12:34.999219 1946 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-m6z8q" Feb 13 20:12:34.999401 kubelet[1946]: E0213 20:12:34.999294 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-8587fbcb89-m6z8q_default(3ec74cf6-af99-4f93-a98f-a1049d5e3987)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-8587fbcb89-m6z8q_default(3ec74cf6-af99-4f93-a98f-a1049d5e3987)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-8587fbcb89-m6z8q" podUID="3ec74cf6-af99-4f93-a98f-a1049d5e3987" Feb 13 20:12:35.016967 containerd[1518]: time="2025-02-13T20:12:35.016781726Z" level=error msg="Failed to destroy network for sandbox \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:35.018198 containerd[1518]: time="2025-02-13T20:12:35.017913252Z" level=error msg="encountered an error cleaning up failed sandbox \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:35.018198 containerd[1518]: time="2025-02-13T20:12:35.018018923Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:35.018450 kubelet[1946]: E0213 20:12:35.018357 1946 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:35.019016 kubelet[1946]: E0213 20:12:35.018453 1946 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:35.019016 kubelet[1946]: E0213 20:12:35.018506 1946 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:35.019016 kubelet[1946]: E0213 20:12:35.018581 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fjz82" podUID="d5b9c20a-bc22-40d8-8d45-69955889fccc" Feb 13 20:12:35.492439 kubelet[1946]: E0213 20:12:35.492107 1946 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:35.514516 kubelet[1946]: E0213 20:12:35.514291 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:35.790358 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533-shm.mount: Deactivated successfully. 
Feb 13 20:12:35.816313 kubelet[1946]: I0213 20:12:35.815289 1946 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e" Feb 13 20:12:35.816554 containerd[1518]: time="2025-02-13T20:12:35.816293805Z" level=info msg="StopPodSandbox for \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\"" Feb 13 20:12:35.820666 containerd[1518]: time="2025-02-13T20:12:35.816642309Z" level=info msg="Ensure that sandbox 361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e in task-service has been cleanup successfully" Feb 13 20:12:35.820666 containerd[1518]: time="2025-02-13T20:12:35.816874795Z" level=info msg="TearDown network for sandbox \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\" successfully" Feb 13 20:12:35.820666 containerd[1518]: time="2025-02-13T20:12:35.816898924Z" level=info msg="StopPodSandbox for \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\" returns successfully" Feb 13 20:12:35.820666 containerd[1518]: time="2025-02-13T20:12:35.819412589Z" level=info msg="StopPodSandbox for \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\"" Feb 13 20:12:35.820666 containerd[1518]: time="2025-02-13T20:12:35.819535039Z" level=info msg="TearDown network for sandbox \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\" successfully" Feb 13 20:12:35.820666 containerd[1518]: time="2025-02-13T20:12:35.819557804Z" level=info msg="StopPodSandbox for \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\" returns successfully" Feb 13 20:12:35.820666 containerd[1518]: time="2025-02-13T20:12:35.820048449Z" level=info msg="StopPodSandbox for \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\"" Feb 13 20:12:35.820666 containerd[1518]: time="2025-02-13T20:12:35.820146797Z" level=info msg="TearDown network for sandbox \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" successfully" Feb 13 20:12:35.820666 containerd[1518]: time="2025-02-13T20:12:35.820166230Z" level=info msg="StopPodSandbox for \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" returns successfully" Feb 13 20:12:35.821220 kubelet[1946]: I0213 20:12:35.820412 1946 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533" Feb 13 20:12:35.819328 systemd[1]: run-netns-cni\x2d19311f53\x2d812a\x2df0f6\x2d5155\x2d17dd33170eec.mount: Deactivated successfully. 
Feb 13 20:12:35.823137 containerd[1518]: time="2025-02-13T20:12:35.820904139Z" level=info msg="StopPodSandbox for \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\"" Feb 13 20:12:35.823137 containerd[1518]: time="2025-02-13T20:12:35.821121378Z" level=info msg="Ensure that sandbox b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533 in task-service has been cleanup successfully" Feb 13 20:12:35.823137 containerd[1518]: time="2025-02-13T20:12:35.821344173Z" level=info msg="TearDown network for sandbox \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\" successfully" Feb 13 20:12:35.823137 containerd[1518]: time="2025-02-13T20:12:35.821364897Z" level=info msg="StopPodSandbox for \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\" returns successfully" Feb 13 20:12:35.823137 containerd[1518]: time="2025-02-13T20:12:35.821433233Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\"" Feb 13 20:12:35.823137 containerd[1518]: time="2025-02-13T20:12:35.821580412Z" level=info msg="TearDown network for sandbox \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" successfully" Feb 13 20:12:35.823137 containerd[1518]: time="2025-02-13T20:12:35.821599135Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" returns successfully" Feb 13 20:12:35.823137 containerd[1518]: time="2025-02-13T20:12:35.823123435Z" level=info msg="StopPodSandbox for \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\"" Feb 13 20:12:35.823536 containerd[1518]: time="2025-02-13T20:12:35.823225562Z" level=info msg="TearDown network for sandbox \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\" successfully" Feb 13 20:12:35.823536 containerd[1518]: time="2025-02-13T20:12:35.823244194Z" level=info msg="StopPodSandbox for \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\" returns successfully" Feb 13 20:12:35.823536 containerd[1518]: time="2025-02-13T20:12:35.823330152Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\"" Feb 13 20:12:35.823536 containerd[1518]: time="2025-02-13T20:12:35.823422107Z" level=info msg="TearDown network for sandbox \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" successfully" Feb 13 20:12:35.823536 containerd[1518]: time="2025-02-13T20:12:35.823438886Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" returns successfully" Feb 13 20:12:35.825440 systemd[1]: run-netns-cni\x2d536b0297\x2da350\x2dafa6\x2db371\x2deb21cb215730.mount: Deactivated successfully. 
Feb 13 20:12:35.826294 containerd[1518]: time="2025-02-13T20:12:35.826177418Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-m6z8q,Uid:3ec74cf6-af99-4f93-a98f-a1049d5e3987,Namespace:default,Attempt:2,}" Feb 13 20:12:35.830422 containerd[1518]: time="2025-02-13T20:12:35.829680785Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:5,}" Feb 13 20:12:36.006822 containerd[1518]: time="2025-02-13T20:12:36.006663035Z" level=error msg="Failed to destroy network for sandbox \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:36.007658 containerd[1518]: time="2025-02-13T20:12:36.007318170Z" level=error msg="encountered an error cleaning up failed sandbox \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:36.007658 containerd[1518]: time="2025-02-13T20:12:36.007403732Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-m6z8q,Uid:3ec74cf6-af99-4f93-a98f-a1049d5e3987,Namespace:default,Attempt:2,} failed, error" error="failed to setup network for sandbox \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:36.008257 kubelet[1946]: E0213 20:12:36.007820 1946 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:36.008257 kubelet[1946]: E0213 20:12:36.007898 1946 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-m6z8q" Feb 13 20:12:36.008257 kubelet[1946]: E0213 20:12:36.007964 1946 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-m6z8q" Feb 13 20:12:36.008460 kubelet[1946]: E0213 20:12:36.008057 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-8587fbcb89-m6z8q_default(3ec74cf6-af99-4f93-a98f-a1049d5e3987)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-8587fbcb89-m6z8q_default(3ec74cf6-af99-4f93-a98f-a1049d5e3987)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-8587fbcb89-m6z8q" podUID="3ec74cf6-af99-4f93-a98f-a1049d5e3987" Feb 13 20:12:36.054502 containerd[1518]: time="2025-02-13T20:12:36.054277655Z" level=error msg="Failed to destroy network for sandbox \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:36.056944 containerd[1518]: time="2025-02-13T20:12:36.056874782Z" level=error msg="encountered an error cleaning up failed sandbox \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:36.057162 containerd[1518]: time="2025-02-13T20:12:36.056978372Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:36.057406 kubelet[1946]: E0213 20:12:36.057335 1946 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:36.057542 kubelet[1946]: E0213 20:12:36.057436 1946 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:36.057542 kubelet[1946]: E0213 20:12:36.057478 1946 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:36.057664 kubelet[1946]: E0213 20:12:36.057593 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fjz82" podUID="d5b9c20a-bc22-40d8-8d45-69955889fccc" Feb 13 20:12:36.515613 kubelet[1946]: E0213 20:12:36.515455 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:36.794714 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5-shm.mount: Deactivated successfully. Feb 13 20:12:36.828514 kubelet[1946]: I0213 20:12:36.825881 1946 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5" Feb 13 20:12:36.828768 containerd[1518]: time="2025-02-13T20:12:36.826727434Z" level=info msg="StopPodSandbox for \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\"" Feb 13 20:12:36.828768 containerd[1518]: time="2025-02-13T20:12:36.827073995Z" level=info msg="Ensure that sandbox b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5 in task-service has been cleanup successfully" Feb 13 20:12:36.832217 systemd[1]: run-netns-cni\x2d71dabd37\x2d79ec\x2dc8ec\x2d2c88\x2d5a02abfb2e9b.mount: Deactivated successfully. Feb 13 20:12:36.836794 containerd[1518]: time="2025-02-13T20:12:36.836712648Z" level=info msg="TearDown network for sandbox \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\" successfully" Feb 13 20:12:36.836936 containerd[1518]: time="2025-02-13T20:12:36.836792811Z" level=info msg="StopPodSandbox for \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\" returns successfully" Feb 13 20:12:36.837726 containerd[1518]: time="2025-02-13T20:12:36.837371375Z" level=info msg="StopPodSandbox for \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\"" Feb 13 20:12:36.837726 containerd[1518]: time="2025-02-13T20:12:36.837536314Z" level=info msg="TearDown network for sandbox \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\" successfully" Feb 13 20:12:36.837726 containerd[1518]: time="2025-02-13T20:12:36.837556641Z" level=info msg="StopPodSandbox for \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\" returns successfully" Feb 13 20:12:36.839230 containerd[1518]: time="2025-02-13T20:12:36.838860236Z" level=info msg="StopPodSandbox for \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\"" Feb 13 20:12:36.839230 containerd[1518]: time="2025-02-13T20:12:36.838983079Z" level=info msg="TearDown network for sandbox \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\" successfully" Feb 13 20:12:36.839230 containerd[1518]: time="2025-02-13T20:12:36.839002178Z" level=info msg="StopPodSandbox for \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\" returns successfully" Feb 13 20:12:36.840501 containerd[1518]: time="2025-02-13T20:12:36.840451588Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-m6z8q,Uid:3ec74cf6-af99-4f93-a98f-a1049d5e3987,Namespace:default,Attempt:3,}" Feb 13 20:12:36.842659 kubelet[1946]: I0213 20:12:36.842621 1946 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f" Feb 13 20:12:36.843833 containerd[1518]: time="2025-02-13T20:12:36.843683490Z" level=info msg="StopPodSandbox for \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\"" Feb 13 20:12:36.844262 containerd[1518]: time="2025-02-13T20:12:36.844133649Z" level=info msg="Ensure that sandbox b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f in task-service has been cleanup successfully" Feb 13 20:12:36.844693 containerd[1518]: time="2025-02-13T20:12:36.844590182Z" level=info msg="TearDown network for sandbox \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\" successfully" Feb 13 20:12:36.844693 containerd[1518]: time="2025-02-13T20:12:36.844654461Z" level=info msg="StopPodSandbox for \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\" returns successfully" Feb 13 20:12:36.849234 containerd[1518]: time="2025-02-13T20:12:36.847726702Z" level=info msg="StopPodSandbox for \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\"" Feb 13 20:12:36.849234 containerd[1518]: time="2025-02-13T20:12:36.847862422Z" level=info msg="TearDown network for sandbox \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\" successfully" Feb 13 20:12:36.849234 containerd[1518]: time="2025-02-13T20:12:36.847891253Z" level=info msg="StopPodSandbox for \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\" returns successfully" Feb 13 20:12:36.848251 systemd[1]: run-netns-cni\x2d581da1f8\x2d3e85\x2d05bd\x2daab8\x2d490d878d7ee8.mount: Deactivated successfully. 
Feb 13 20:12:36.851193 containerd[1518]: time="2025-02-13T20:12:36.850981989Z" level=info msg="StopPodSandbox for \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\"" Feb 13 20:12:36.851193 containerd[1518]: time="2025-02-13T20:12:36.851133091Z" level=info msg="TearDown network for sandbox \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\" successfully" Feb 13 20:12:36.851193 containerd[1518]: time="2025-02-13T20:12:36.851152539Z" level=info msg="StopPodSandbox for \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\" returns successfully" Feb 13 20:12:36.853947 containerd[1518]: time="2025-02-13T20:12:36.852343833Z" level=info msg="StopPodSandbox for \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\"" Feb 13 20:12:36.853947 containerd[1518]: time="2025-02-13T20:12:36.852477098Z" level=info msg="TearDown network for sandbox \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" successfully" Feb 13 20:12:36.853947 containerd[1518]: time="2025-02-13T20:12:36.852555689Z" level=info msg="StopPodSandbox for \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" returns successfully" Feb 13 20:12:36.853947 containerd[1518]: time="2025-02-13T20:12:36.852888676Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\"" Feb 13 20:12:36.853947 containerd[1518]: time="2025-02-13T20:12:36.852986413Z" level=info msg="TearDown network for sandbox \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" successfully" Feb 13 20:12:36.853947 containerd[1518]: time="2025-02-13T20:12:36.853003596Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" returns successfully" Feb 13 20:12:36.853947 containerd[1518]: time="2025-02-13T20:12:36.853372936Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\"" Feb 13 20:12:36.853947 containerd[1518]: time="2025-02-13T20:12:36.853466375Z" level=info msg="TearDown network for sandbox \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" successfully" Feb 13 20:12:36.853947 containerd[1518]: time="2025-02-13T20:12:36.853797687Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" returns successfully" Feb 13 20:12:36.857978 containerd[1518]: time="2025-02-13T20:12:36.857535494Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:6,}" Feb 13 20:12:37.056542 containerd[1518]: time="2025-02-13T20:12:37.056276897Z" level=error msg="Failed to destroy network for sandbox \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:37.058745 containerd[1518]: time="2025-02-13T20:12:37.058028323Z" level=error msg="Failed to destroy network for sandbox \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:37.058745 containerd[1518]: time="2025-02-13T20:12:37.058531018Z" level=error msg="encountered an error cleaning up failed sandbox 
\"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:37.058745 containerd[1518]: time="2025-02-13T20:12:37.058586374Z" level=error msg="encountered an error cleaning up failed sandbox \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:37.058745 containerd[1518]: time="2025-02-13T20:12:37.058708212Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:37.059078 containerd[1518]: time="2025-02-13T20:12:37.058709585Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-m6z8q,Uid:3ec74cf6-af99-4f93-a98f-a1049d5e3987,Namespace:default,Attempt:3,} failed, error" error="failed to setup network for sandbox \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:37.059525 kubelet[1946]: E0213 20:12:37.059313 1946 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:37.059525 kubelet[1946]: E0213 20:12:37.059441 1946 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:37.059738 kubelet[1946]: E0213 20:12:37.059540 1946 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:37.059738 kubelet[1946]: E0213 20:12:37.059629 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fjz82" podUID="d5b9c20a-bc22-40d8-8d45-69955889fccc" Feb 13 20:12:37.060876 kubelet[1946]: E0213 20:12:37.060827 1946 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:37.061164 kubelet[1946]: E0213 20:12:37.061054 1946 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-m6z8q" Feb 13 20:12:37.061164 kubelet[1946]: E0213 20:12:37.061096 1946 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-m6z8q" Feb 13 20:12:37.061966 kubelet[1946]: E0213 20:12:37.061168 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-8587fbcb89-m6z8q_default(3ec74cf6-af99-4f93-a98f-a1049d5e3987)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-8587fbcb89-m6z8q_default(3ec74cf6-af99-4f93-a98f-a1049d5e3987)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-8587fbcb89-m6z8q" podUID="3ec74cf6-af99-4f93-a98f-a1049d5e3987" Feb 13 20:12:37.516425 kubelet[1946]: E0213 20:12:37.516294 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:37.793521 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68-shm.mount: Deactivated successfully. Feb 13 20:12:37.794089 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325-shm.mount: Deactivated successfully. 
Feb 13 20:12:37.854821 kubelet[1946]: I0213 20:12:37.854753 1946 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68" Feb 13 20:12:37.859293 containerd[1518]: time="2025-02-13T20:12:37.857770158Z" level=info msg="StopPodSandbox for \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\"" Feb 13 20:12:37.859293 containerd[1518]: time="2025-02-13T20:12:37.858860128Z" level=info msg="Ensure that sandbox 55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68 in task-service has been cleanup successfully" Feb 13 20:12:37.860915 containerd[1518]: time="2025-02-13T20:12:37.859262692Z" level=info msg="TearDown network for sandbox \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\" successfully" Feb 13 20:12:37.861002 containerd[1518]: time="2025-02-13T20:12:37.860913263Z" level=info msg="StopPodSandbox for \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\" returns successfully" Feb 13 20:12:37.864540 containerd[1518]: time="2025-02-13T20:12:37.861581146Z" level=info msg="StopPodSandbox for \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\"" Feb 13 20:12:37.864540 containerd[1518]: time="2025-02-13T20:12:37.861731046Z" level=info msg="TearDown network for sandbox \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\" successfully" Feb 13 20:12:37.864540 containerd[1518]: time="2025-02-13T20:12:37.861750580Z" level=info msg="StopPodSandbox for \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\" returns successfully" Feb 13 20:12:37.865762 containerd[1518]: time="2025-02-13T20:12:37.865644301Z" level=info msg="StopPodSandbox for \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\"" Feb 13 20:12:37.865850 containerd[1518]: time="2025-02-13T20:12:37.865821043Z" level=info msg="TearDown network for sandbox \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\" successfully" Feb 13 20:12:37.865850 containerd[1518]: time="2025-02-13T20:12:37.865841617Z" level=info msg="StopPodSandbox for \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\" returns successfully" Feb 13 20:12:37.866281 systemd[1]: run-netns-cni\x2da3515b81\x2db5f0\x2df851\x2d71fc\x2d228615c22107.mount: Deactivated successfully. 
Feb 13 20:12:37.870270 containerd[1518]: time="2025-02-13T20:12:37.870214524Z" level=info msg="StopPodSandbox for \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\"" Feb 13 20:12:37.870387 containerd[1518]: time="2025-02-13T20:12:37.870350200Z" level=info msg="TearDown network for sandbox \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\" successfully" Feb 13 20:12:37.870387 containerd[1518]: time="2025-02-13T20:12:37.870369309Z" level=info msg="StopPodSandbox for \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\" returns successfully" Feb 13 20:12:37.873112 containerd[1518]: time="2025-02-13T20:12:37.873054531Z" level=info msg="StopPodSandbox for \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\"" Feb 13 20:12:37.873233 containerd[1518]: time="2025-02-13T20:12:37.873205527Z" level=info msg="TearDown network for sandbox \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" successfully" Feb 13 20:12:37.873316 containerd[1518]: time="2025-02-13T20:12:37.873231028Z" level=info msg="StopPodSandbox for \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" returns successfully" Feb 13 20:12:37.874525 containerd[1518]: time="2025-02-13T20:12:37.873620132Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\"" Feb 13 20:12:37.874525 containerd[1518]: time="2025-02-13T20:12:37.873968393Z" level=info msg="TearDown network for sandbox \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" successfully" Feb 13 20:12:37.874525 containerd[1518]: time="2025-02-13T20:12:37.873997652Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" returns successfully" Feb 13 20:12:37.874820 containerd[1518]: time="2025-02-13T20:12:37.874699113Z" level=info msg="StopPodSandbox for \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\"" Feb 13 20:12:37.876038 containerd[1518]: time="2025-02-13T20:12:37.875820642Z" level=info msg="Ensure that sandbox bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325 in task-service has been cleanup successfully" Feb 13 20:12:37.876185 kubelet[1946]: I0213 20:12:37.875921 1946 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325" Feb 13 20:12:37.876272 containerd[1518]: time="2025-02-13T20:12:37.876182975Z" level=info msg="TearDown network for sandbox \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\" successfully" Feb 13 20:12:37.876272 containerd[1518]: time="2025-02-13T20:12:37.876204704Z" level=info msg="StopPodSandbox for \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\" returns successfully" Feb 13 20:12:37.879220 systemd[1]: run-netns-cni\x2dc7c5d796\x2da1f2\x2de07a\x2d7ecd\x2dd64a9d965072.mount: Deactivated successfully. 
Feb 13 20:12:37.879826 containerd[1518]: time="2025-02-13T20:12:37.879791443Z" level=info msg="StopPodSandbox for \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\"" Feb 13 20:12:37.879926 containerd[1518]: time="2025-02-13T20:12:37.879900482Z" level=info msg="TearDown network for sandbox \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\" successfully" Feb 13 20:12:37.880010 containerd[1518]: time="2025-02-13T20:12:37.879925307Z" level=info msg="StopPodSandbox for \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\" returns successfully" Feb 13 20:12:37.880086 containerd[1518]: time="2025-02-13T20:12:37.880019350Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\"" Feb 13 20:12:37.880159 containerd[1518]: time="2025-02-13T20:12:37.880133827Z" level=info msg="TearDown network for sandbox \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" successfully" Feb 13 20:12:37.880220 containerd[1518]: time="2025-02-13T20:12:37.880158381Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" returns successfully" Feb 13 20:12:37.883771 containerd[1518]: time="2025-02-13T20:12:37.883569388Z" level=info msg="StopPodSandbox for \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\"" Feb 13 20:12:37.883771 containerd[1518]: time="2025-02-13T20:12:37.883681942Z" level=info msg="TearDown network for sandbox \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\" successfully" Feb 13 20:12:37.883771 containerd[1518]: time="2025-02-13T20:12:37.883701356Z" level=info msg="StopPodSandbox for \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\" returns successfully" Feb 13 20:12:37.884056 containerd[1518]: time="2025-02-13T20:12:37.884012673Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:7,}" Feb 13 20:12:37.896533 containerd[1518]: time="2025-02-13T20:12:37.896415772Z" level=info msg="StopPodSandbox for \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\"" Feb 13 20:12:37.897348 containerd[1518]: time="2025-02-13T20:12:37.896604491Z" level=info msg="TearDown network for sandbox \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\" successfully" Feb 13 20:12:37.897348 containerd[1518]: time="2025-02-13T20:12:37.896625402Z" level=info msg="StopPodSandbox for \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\" returns successfully" Feb 13 20:12:37.897779 containerd[1518]: time="2025-02-13T20:12:37.897471091Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-m6z8q,Uid:3ec74cf6-af99-4f93-a98f-a1049d5e3987,Namespace:default,Attempt:4,}" Feb 13 20:12:38.112096 containerd[1518]: time="2025-02-13T20:12:38.111849166Z" level=error msg="Failed to destroy network for sandbox \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:38.113526 containerd[1518]: time="2025-02-13T20:12:38.113301222Z" level=error msg="Failed to destroy network for sandbox \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:38.113979 containerd[1518]: time="2025-02-13T20:12:38.113942334Z" level=error msg="encountered an error cleaning up failed sandbox \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:38.114524 containerd[1518]: time="2025-02-13T20:12:38.114357502Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:7,} failed, error" error="failed to setup network for sandbox \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:38.114671 containerd[1518]: time="2025-02-13T20:12:38.114290120Z" level=error msg="encountered an error cleaning up failed sandbox \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:38.115635 containerd[1518]: time="2025-02-13T20:12:38.114809104Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-m6z8q,Uid:3ec74cf6-af99-4f93-a98f-a1049d5e3987,Namespace:default,Attempt:4,} failed, error" error="failed to setup network for sandbox \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:38.115778 kubelet[1946]: E0213 20:12:38.115259 1946 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:38.115972 kubelet[1946]: E0213 20:12:38.115941 1946 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:38.116182 kubelet[1946]: E0213 20:12:38.116154 1946 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:38.116639 kubelet[1946]: E0213 20:12:38.116247 1946 kuberuntime_manager.go:1168] "CreatePodSandbox 
for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:38.116639 kubelet[1946]: E0213 20:12:38.116369 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fjz82" podUID="d5b9c20a-bc22-40d8-8d45-69955889fccc" Feb 13 20:12:38.117598 kubelet[1946]: E0213 20:12:38.115377 1946 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-m6z8q" Feb 13 20:12:38.117598 kubelet[1946]: E0213 20:12:38.117359 1946 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-m6z8q" Feb 13 20:12:38.117598 kubelet[1946]: E0213 20:12:38.117538 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-8587fbcb89-m6z8q_default(3ec74cf6-af99-4f93-a98f-a1049d5e3987)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-8587fbcb89-m6z8q_default(3ec74cf6-af99-4f93-a98f-a1049d5e3987)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-8587fbcb89-m6z8q" podUID="3ec74cf6-af99-4f93-a98f-a1049d5e3987" Feb 13 20:12:38.517297 kubelet[1946]: E0213 20:12:38.517177 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:38.793329 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570-shm.mount: Deactivated successfully. 
Feb 13 20:12:38.884516 kubelet[1946]: I0213 20:12:38.883275 1946 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570" Feb 13 20:12:38.884749 containerd[1518]: time="2025-02-13T20:12:38.884081234Z" level=info msg="StopPodSandbox for \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\"" Feb 13 20:12:38.884749 containerd[1518]: time="2025-02-13T20:12:38.884436554Z" level=info msg="Ensure that sandbox 62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570 in task-service has been cleanup successfully" Feb 13 20:12:38.888042 containerd[1518]: time="2025-02-13T20:12:38.887681871Z" level=info msg="TearDown network for sandbox \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\" successfully" Feb 13 20:12:38.888042 containerd[1518]: time="2025-02-13T20:12:38.887791680Z" level=info msg="StopPodSandbox for \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\" returns successfully" Feb 13 20:12:38.889109 systemd[1]: run-netns-cni\x2d440c9371\x2d2969\x2d8033\x2d4e52\x2d384ac6ded6e6.mount: Deactivated successfully. Feb 13 20:12:38.891039 containerd[1518]: time="2025-02-13T20:12:38.889983663Z" level=info msg="StopPodSandbox for \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\"" Feb 13 20:12:38.891039 containerd[1518]: time="2025-02-13T20:12:38.890122951Z" level=info msg="TearDown network for sandbox \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\" successfully" Feb 13 20:12:38.891039 containerd[1518]: time="2025-02-13T20:12:38.890142957Z" level=info msg="StopPodSandbox for \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\" returns successfully" Feb 13 20:12:38.891768 containerd[1518]: time="2025-02-13T20:12:38.891651793Z" level=info msg="StopPodSandbox for \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\"" Feb 13 20:12:38.892052 containerd[1518]: time="2025-02-13T20:12:38.891981867Z" level=info msg="TearDown network for sandbox \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\" successfully" Feb 13 20:12:38.892351 containerd[1518]: time="2025-02-13T20:12:38.892008694Z" level=info msg="StopPodSandbox for \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\" returns successfully" Feb 13 20:12:38.892694 kubelet[1946]: I0213 20:12:38.892649 1946 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620" Feb 13 20:12:38.892822 containerd[1518]: time="2025-02-13T20:12:38.892667061Z" level=info msg="StopPodSandbox for \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\"" Feb 13 20:12:38.893003 containerd[1518]: time="2025-02-13T20:12:38.892964842Z" level=info msg="TearDown network for sandbox \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\" successfully" Feb 13 20:12:38.893107 containerd[1518]: time="2025-02-13T20:12:38.893083836Z" level=info msg="StopPodSandbox for \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\" returns successfully" Feb 13 20:12:38.893802 containerd[1518]: time="2025-02-13T20:12:38.893314103Z" level=info msg="StopPodSandbox for \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\"" Feb 13 20:12:38.894749 containerd[1518]: time="2025-02-13T20:12:38.894718576Z" level=info msg="Ensure that sandbox 3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620 in task-service 
has been cleanup successfully" Feb 13 20:12:38.897503 containerd[1518]: time="2025-02-13T20:12:38.895476152Z" level=info msg="TearDown network for sandbox \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\" successfully" Feb 13 20:12:38.897432 systemd[1]: run-netns-cni\x2d068b9f7e\x2d796c\x2dba88\x2d7867\x2d8ae3a5717db7.mount: Deactivated successfully. Feb 13 20:12:38.898106 containerd[1518]: time="2025-02-13T20:12:38.897731979Z" level=info msg="StopPodSandbox for \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\" returns successfully" Feb 13 20:12:38.898106 containerd[1518]: time="2025-02-13T20:12:38.897904558Z" level=info msg="StopPodSandbox for \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\"" Feb 13 20:12:38.898106 containerd[1518]: time="2025-02-13T20:12:38.898006093Z" level=info msg="TearDown network for sandbox \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\" successfully" Feb 13 20:12:38.898106 containerd[1518]: time="2025-02-13T20:12:38.898025074Z" level=info msg="StopPodSandbox for \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\" returns successfully" Feb 13 20:12:38.900217 containerd[1518]: time="2025-02-13T20:12:38.899064496Z" level=info msg="StopPodSandbox for \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\"" Feb 13 20:12:38.900217 containerd[1518]: time="2025-02-13T20:12:38.899174812Z" level=info msg="TearDown network for sandbox \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\" successfully" Feb 13 20:12:38.900217 containerd[1518]: time="2025-02-13T20:12:38.899195854Z" level=info msg="StopPodSandbox for \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\" returns successfully" Feb 13 20:12:38.900217 containerd[1518]: time="2025-02-13T20:12:38.899262082Z" level=info msg="StopPodSandbox for \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\"" Feb 13 20:12:38.900217 containerd[1518]: time="2025-02-13T20:12:38.899362845Z" level=info msg="TearDown network for sandbox \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" successfully" Feb 13 20:12:38.900217 containerd[1518]: time="2025-02-13T20:12:38.899389554Z" level=info msg="StopPodSandbox for \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" returns successfully" Feb 13 20:12:38.901839 containerd[1518]: time="2025-02-13T20:12:38.901305759Z" level=info msg="StopPodSandbox for \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\"" Feb 13 20:12:38.901839 containerd[1518]: time="2025-02-13T20:12:38.901424177Z" level=info msg="TearDown network for sandbox \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\" successfully" Feb 13 20:12:38.901839 containerd[1518]: time="2025-02-13T20:12:38.901444082Z" level=info msg="StopPodSandbox for \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\" returns successfully" Feb 13 20:12:38.901839 containerd[1518]: time="2025-02-13T20:12:38.901560343Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\"" Feb 13 20:12:38.901839 containerd[1518]: time="2025-02-13T20:12:38.901664187Z" level=info msg="TearDown network for sandbox \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" successfully" Feb 13 20:12:38.901839 containerd[1518]: time="2025-02-13T20:12:38.901681843Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" returns 
successfully" Feb 13 20:12:38.902226 containerd[1518]: time="2025-02-13T20:12:38.902161274Z" level=info msg="StopPodSandbox for \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\"" Feb 13 20:12:38.902300 containerd[1518]: time="2025-02-13T20:12:38.902275728Z" level=info msg="TearDown network for sandbox \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\" successfully" Feb 13 20:12:38.902300 containerd[1518]: time="2025-02-13T20:12:38.902293528Z" level=info msg="StopPodSandbox for \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\" returns successfully" Feb 13 20:12:38.902435 containerd[1518]: time="2025-02-13T20:12:38.902359405Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\"" Feb 13 20:12:38.902532 containerd[1518]: time="2025-02-13T20:12:38.902468105Z" level=info msg="TearDown network for sandbox \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" successfully" Feb 13 20:12:38.902604 containerd[1518]: time="2025-02-13T20:12:38.902533143Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" returns successfully" Feb 13 20:12:38.904021 containerd[1518]: time="2025-02-13T20:12:38.903468218Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:8,}" Feb 13 20:12:38.908083 containerd[1518]: time="2025-02-13T20:12:38.907392534Z" level=info msg="StopPodSandbox for \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\"" Feb 13 20:12:38.908083 containerd[1518]: time="2025-02-13T20:12:38.907524325Z" level=info msg="TearDown network for sandbox \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\" successfully" Feb 13 20:12:38.908083 containerd[1518]: time="2025-02-13T20:12:38.907545035Z" level=info msg="StopPodSandbox for \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\" returns successfully" Feb 13 20:12:38.913425 containerd[1518]: time="2025-02-13T20:12:38.913347109Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-m6z8q,Uid:3ec74cf6-af99-4f93-a98f-a1049d5e3987,Namespace:default,Attempt:5,}" Feb 13 20:12:39.066650 containerd[1518]: time="2025-02-13T20:12:39.065393168Z" level=error msg="Failed to destroy network for sandbox \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:39.066650 containerd[1518]: time="2025-02-13T20:12:39.065903840Z" level=error msg="encountered an error cleaning up failed sandbox \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:39.066650 containerd[1518]: time="2025-02-13T20:12:39.065971931Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:8,} failed, error" error="failed to setup network for sandbox \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:39.066940 kubelet[1946]: E0213 20:12:39.066276 1946 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:39.066940 kubelet[1946]: E0213 20:12:39.066350 1946 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:39.066940 kubelet[1946]: E0213 20:12:39.066381 1946 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:39.067120 kubelet[1946]: E0213 20:12:39.066446 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fjz82" podUID="d5b9c20a-bc22-40d8-8d45-69955889fccc" Feb 13 20:12:39.107963 containerd[1518]: time="2025-02-13T20:12:39.107871475Z" level=error msg="Failed to destroy network for sandbox \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:39.108942 containerd[1518]: time="2025-02-13T20:12:39.108864176Z" level=error msg="encountered an error cleaning up failed sandbox \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:39.109053 containerd[1518]: time="2025-02-13T20:12:39.108950550Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-m6z8q,Uid:3ec74cf6-af99-4f93-a98f-a1049d5e3987,Namespace:default,Attempt:5,} failed, error" error="failed to setup network for sandbox \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:39.110076 kubelet[1946]: E0213 20:12:39.110018 1946 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:39.110167 kubelet[1946]: E0213 20:12:39.110108 1946 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-m6z8q" Feb 13 20:12:39.110167 kubelet[1946]: E0213 20:12:39.110149 1946 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-m6z8q" Feb 13 20:12:39.110465 kubelet[1946]: E0213 20:12:39.110225 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-8587fbcb89-m6z8q_default(3ec74cf6-af99-4f93-a98f-a1049d5e3987)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-8587fbcb89-m6z8q_default(3ec74cf6-af99-4f93-a98f-a1049d5e3987)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-8587fbcb89-m6z8q" podUID="3ec74cf6-af99-4f93-a98f-a1049d5e3987" Feb 13 20:12:39.517816 kubelet[1946]: E0213 20:12:39.517414 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:39.798953 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e-shm.mount: Deactivated successfully. 
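Note: the systemd units named run-containerd-...-shm.mount and run-netns-cni\x2d....mount that report "Deactivated successfully" after each cycle are the per-sandbox /dev/shm mount and the CNI network-namespace bind mount (conventionally under /run/netns/cni-<uuid>) being unmounted as part of the cleanup. A hedged helper sketch that lists any such namespaces still present (purely illustrative; the path is the conventional one, not read from this host):

// List leftover CNI network-namespace mounts under /run/netns, which
// correspond to the run-netns-cni\x2d...mount units in the journal above.
package main

import (
	"fmt"
	"os"
	"strings"
)

func main() {
	entries, err := os.ReadDir("/run/netns")
	if err != nil {
		fmt.Fprintln(os.Stderr, "no netns directory:", err)
		return
	}
	for _, e := range entries {
		if strings.HasPrefix(e.Name(), "cni-") {
			fmt.Println("leftover CNI netns:", e.Name())
		}
	}
}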
Feb 13 20:12:39.907164 kubelet[1946]: I0213 20:12:39.906241 1946 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c" Feb 13 20:12:39.907877 containerd[1518]: time="2025-02-13T20:12:39.907777201Z" level=info msg="StopPodSandbox for \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\"" Feb 13 20:12:39.910368 containerd[1518]: time="2025-02-13T20:12:39.909625052Z" level=info msg="Ensure that sandbox b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c in task-service has been cleanup successfully" Feb 13 20:12:39.910368 containerd[1518]: time="2025-02-13T20:12:39.910286014Z" level=info msg="TearDown network for sandbox \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\" successfully" Feb 13 20:12:39.910368 containerd[1518]: time="2025-02-13T20:12:39.910316421Z" level=info msg="StopPodSandbox for \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\" returns successfully" Feb 13 20:12:39.911338 containerd[1518]: time="2025-02-13T20:12:39.911184286Z" level=info msg="StopPodSandbox for \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\"" Feb 13 20:12:39.911608 containerd[1518]: time="2025-02-13T20:12:39.911457963Z" level=info msg="TearDown network for sandbox \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\" successfully" Feb 13 20:12:39.911608 containerd[1518]: time="2025-02-13T20:12:39.911545050Z" level=info msg="StopPodSandbox for \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\" returns successfully" Feb 13 20:12:39.912429 containerd[1518]: time="2025-02-13T20:12:39.912072948Z" level=info msg="StopPodSandbox for \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\"" Feb 13 20:12:39.912429 containerd[1518]: time="2025-02-13T20:12:39.912218599Z" level=info msg="TearDown network for sandbox \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\" successfully" Feb 13 20:12:39.912429 containerd[1518]: time="2025-02-13T20:12:39.912237813Z" level=info msg="StopPodSandbox for \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\" returns successfully" Feb 13 20:12:39.912988 containerd[1518]: time="2025-02-13T20:12:39.912851056Z" level=info msg="StopPodSandbox for \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\"" Feb 13 20:12:39.913506 containerd[1518]: time="2025-02-13T20:12:39.913111573Z" level=info msg="TearDown network for sandbox \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\" successfully" Feb 13 20:12:39.913506 containerd[1518]: time="2025-02-13T20:12:39.913156350Z" level=info msg="StopPodSandbox for \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\" returns successfully" Feb 13 20:12:39.913922 containerd[1518]: time="2025-02-13T20:12:39.913732616Z" level=info msg="StopPodSandbox for \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\"" Feb 13 20:12:39.914285 containerd[1518]: time="2025-02-13T20:12:39.913894512Z" level=info msg="TearDown network for sandbox \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\" successfully" Feb 13 20:12:39.914285 containerd[1518]: time="2025-02-13T20:12:39.914047177Z" level=info msg="StopPodSandbox for \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\" returns successfully" Feb 13 20:12:39.914637 systemd[1]: run-netns-cni\x2d4d1a5b59\x2d3828\x2dde41\x2d6840\x2d8b57fdb44e61.mount: Deactivated 
successfully. Feb 13 20:12:39.915938 containerd[1518]: time="2025-02-13T20:12:39.915622919Z" level=info msg="StopPodSandbox for \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\"" Feb 13 20:12:39.916742 containerd[1518]: time="2025-02-13T20:12:39.916053638Z" level=info msg="TearDown network for sandbox \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\" successfully" Feb 13 20:12:39.916742 containerd[1518]: time="2025-02-13T20:12:39.916079412Z" level=info msg="StopPodSandbox for \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\" returns successfully" Feb 13 20:12:39.919307 containerd[1518]: time="2025-02-13T20:12:39.918365849Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-m6z8q,Uid:3ec74cf6-af99-4f93-a98f-a1049d5e3987,Namespace:default,Attempt:6,}" Feb 13 20:12:39.926358 kubelet[1946]: I0213 20:12:39.926328 1946 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e" Feb 13 20:12:39.928052 containerd[1518]: time="2025-02-13T20:12:39.928018513Z" level=info msg="StopPodSandbox for \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\"" Feb 13 20:12:39.930815 containerd[1518]: time="2025-02-13T20:12:39.930782284Z" level=info msg="Ensure that sandbox 853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e in task-service has been cleanup successfully" Feb 13 20:12:39.931162 containerd[1518]: time="2025-02-13T20:12:39.931133405Z" level=info msg="TearDown network for sandbox \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\" successfully" Feb 13 20:12:39.931282 containerd[1518]: time="2025-02-13T20:12:39.931257613Z" level=info msg="StopPodSandbox for \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\" returns successfully" Feb 13 20:12:39.933384 systemd[1]: run-netns-cni\x2db6873fd5\x2dd870\x2d1309\x2dadb5\x2d63cdaff4d6b1.mount: Deactivated successfully. 
Feb 13 20:12:39.943171 containerd[1518]: time="2025-02-13T20:12:39.943129921Z" level=info msg="StopPodSandbox for \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\"" Feb 13 20:12:39.944166 containerd[1518]: time="2025-02-13T20:12:39.943302797Z" level=info msg="TearDown network for sandbox \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\" successfully" Feb 13 20:12:39.944660 containerd[1518]: time="2025-02-13T20:12:39.944318208Z" level=info msg="StopPodSandbox for \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\" returns successfully" Feb 13 20:12:39.945549 containerd[1518]: time="2025-02-13T20:12:39.945429668Z" level=info msg="StopPodSandbox for \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\"" Feb 13 20:12:39.946689 containerd[1518]: time="2025-02-13T20:12:39.945658992Z" level=info msg="TearDown network for sandbox \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\" successfully" Feb 13 20:12:39.946689 containerd[1518]: time="2025-02-13T20:12:39.945711629Z" level=info msg="StopPodSandbox for \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\" returns successfully" Feb 13 20:12:39.946875 containerd[1518]: time="2025-02-13T20:12:39.946821778Z" level=info msg="StopPodSandbox for \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\"" Feb 13 20:12:39.946972 containerd[1518]: time="2025-02-13T20:12:39.946928885Z" level=info msg="TearDown network for sandbox \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\" successfully" Feb 13 20:12:39.946972 containerd[1518]: time="2025-02-13T20:12:39.946948612Z" level=info msg="StopPodSandbox for \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\" returns successfully" Feb 13 20:12:39.949162 containerd[1518]: time="2025-02-13T20:12:39.949120890Z" level=info msg="StopPodSandbox for \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\"" Feb 13 20:12:39.950716 containerd[1518]: time="2025-02-13T20:12:39.950595889Z" level=info msg="TearDown network for sandbox \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\" successfully" Feb 13 20:12:39.950716 containerd[1518]: time="2025-02-13T20:12:39.950626504Z" level=info msg="StopPodSandbox for \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\" returns successfully" Feb 13 20:12:39.952006 containerd[1518]: time="2025-02-13T20:12:39.951813008Z" level=info msg="StopPodSandbox for \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\"" Feb 13 20:12:39.952006 containerd[1518]: time="2025-02-13T20:12:39.951919761Z" level=info msg="TearDown network for sandbox \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\" successfully" Feb 13 20:12:39.952006 containerd[1518]: time="2025-02-13T20:12:39.951938467Z" level=info msg="StopPodSandbox for \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\" returns successfully" Feb 13 20:12:39.952721 containerd[1518]: time="2025-02-13T20:12:39.952564421Z" level=info msg="StopPodSandbox for \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\"" Feb 13 20:12:39.953270 containerd[1518]: time="2025-02-13T20:12:39.953098457Z" level=info msg="TearDown network for sandbox \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" successfully" Feb 13 20:12:39.953270 containerd[1518]: time="2025-02-13T20:12:39.953125289Z" level=info msg="StopPodSandbox for \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" 
returns successfully" Feb 13 20:12:39.954013 containerd[1518]: time="2025-02-13T20:12:39.953787415Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\"" Feb 13 20:12:39.954013 containerd[1518]: time="2025-02-13T20:12:39.953894272Z" level=info msg="TearDown network for sandbox \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" successfully" Feb 13 20:12:39.954013 containerd[1518]: time="2025-02-13T20:12:39.953913389Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" returns successfully" Feb 13 20:12:39.954737 containerd[1518]: time="2025-02-13T20:12:39.954528365Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\"" Feb 13 20:12:39.954737 containerd[1518]: time="2025-02-13T20:12:39.954634554Z" level=info msg="TearDown network for sandbox \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" successfully" Feb 13 20:12:39.954737 containerd[1518]: time="2025-02-13T20:12:39.954653357Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" returns successfully" Feb 13 20:12:39.956296 containerd[1518]: time="2025-02-13T20:12:39.955931976Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:9,}" Feb 13 20:12:40.381244 containerd[1518]: time="2025-02-13T20:12:40.381150037Z" level=error msg="Failed to destroy network for sandbox \"655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:40.382380 containerd[1518]: time="2025-02-13T20:12:40.382317450Z" level=error msg="encountered an error cleaning up failed sandbox \"655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:40.382720 containerd[1518]: time="2025-02-13T20:12:40.382651565Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-m6z8q,Uid:3ec74cf6-af99-4f93-a98f-a1049d5e3987,Namespace:default,Attempt:6,} failed, error" error="failed to setup network for sandbox \"655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:40.383718 kubelet[1946]: E0213 20:12:40.383453 1946 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:40.383718 kubelet[1946]: E0213 20:12:40.383662 1946 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-m6z8q" Feb 13 20:12:40.384561 kubelet[1946]: E0213 20:12:40.383732 1946 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-m6z8q" Feb 13 20:12:40.384561 kubelet[1946]: E0213 20:12:40.383875 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-8587fbcb89-m6z8q_default(3ec74cf6-af99-4f93-a98f-a1049d5e3987)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-8587fbcb89-m6z8q_default(3ec74cf6-af99-4f93-a98f-a1049d5e3987)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-8587fbcb89-m6z8q" podUID="3ec74cf6-af99-4f93-a98f-a1049d5e3987" Feb 13 20:12:40.426129 containerd[1518]: time="2025-02-13T20:12:40.425922310Z" level=error msg="Failed to destroy network for sandbox \"421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:40.427253 containerd[1518]: time="2025-02-13T20:12:40.426863698Z" level=error msg="encountered an error cleaning up failed sandbox \"421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:40.427253 containerd[1518]: time="2025-02-13T20:12:40.426953419Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:9,} failed, error" error="failed to setup network for sandbox \"421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:40.427461 kubelet[1946]: E0213 20:12:40.427298 1946 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:40.427461 kubelet[1946]: E0213 20:12:40.427409 1946 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:40.427461 kubelet[1946]: E0213 20:12:40.427441 1946 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:40.427901 kubelet[1946]: E0213 20:12:40.427552 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fjz82" podUID="d5b9c20a-bc22-40d8-8d45-69955889fccc" Feb 13 20:12:40.517781 kubelet[1946]: E0213 20:12:40.517650 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:40.793025 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b-shm.mount: Deactivated successfully. Feb 13 20:12:40.793210 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d-shm.mount: Deactivated successfully. Feb 13 20:12:40.793327 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1810837035.mount: Deactivated successfully. 
Feb 13 20:12:40.820942 containerd[1518]: time="2025-02-13T20:12:40.820839363Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:12:40.822136 containerd[1518]: time="2025-02-13T20:12:40.822066536Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=142742010" Feb 13 20:12:40.822996 containerd[1518]: time="2025-02-13T20:12:40.822957831Z" level=info msg="ImageCreate event name:\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:12:40.826342 containerd[1518]: time="2025-02-13T20:12:40.826232419Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:12:40.827919 containerd[1518]: time="2025-02-13T20:12:40.827129608Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"142741872\" in 10.062261547s" Feb 13 20:12:40.827919 containerd[1518]: time="2025-02-13T20:12:40.827183745Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Feb 13 20:12:40.866755 containerd[1518]: time="2025-02-13T20:12:40.866675238Z" level=info msg="CreateContainer within sandbox \"f1f4f3b635ed84bd0c3644f3a22bbb3047f42a32283cfdbb44025bbdcebb76e8\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Feb 13 20:12:40.900200 containerd[1518]: time="2025-02-13T20:12:40.900038158Z" level=info msg="CreateContainer within sandbox \"f1f4f3b635ed84bd0c3644f3a22bbb3047f42a32283cfdbb44025bbdcebb76e8\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"dda489273a403158a68a92aa95cd78b094e2ed9e3e39b3ec8ffc9fe8a959f21d\"" Feb 13 20:12:40.901518 containerd[1518]: time="2025-02-13T20:12:40.901397037Z" level=info msg="StartContainer for \"dda489273a403158a68a92aa95cd78b094e2ed9e3e39b3ec8ffc9fe8a959f21d\"" Feb 13 20:12:40.942855 kubelet[1946]: I0213 20:12:40.942807 1946 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b" Feb 13 20:12:40.945342 containerd[1518]: time="2025-02-13T20:12:40.944667500Z" level=info msg="StopPodSandbox for \"421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b\"" Feb 13 20:12:40.945342 containerd[1518]: time="2025-02-13T20:12:40.945018211Z" level=info msg="Ensure that sandbox 421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b in task-service has been cleanup successfully" Feb 13 20:12:40.948565 containerd[1518]: time="2025-02-13T20:12:40.948453344Z" level=info msg="TearDown network for sandbox \"421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b\" successfully" Feb 13 20:12:40.949180 containerd[1518]: time="2025-02-13T20:12:40.949129587Z" level=info msg="StopPodSandbox for \"421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b\" returns successfully" Feb 13 20:12:40.951018 containerd[1518]: time="2025-02-13T20:12:40.950626679Z" level=info 
msg="StopPodSandbox for \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\"" Feb 13 20:12:40.951018 containerd[1518]: time="2025-02-13T20:12:40.950752482Z" level=info msg="TearDown network for sandbox \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\" successfully" Feb 13 20:12:40.951018 containerd[1518]: time="2025-02-13T20:12:40.950772837Z" level=info msg="StopPodSandbox for \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\" returns successfully" Feb 13 20:12:40.951966 containerd[1518]: time="2025-02-13T20:12:40.951804569Z" level=info msg="StopPodSandbox for \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\"" Feb 13 20:12:40.952166 containerd[1518]: time="2025-02-13T20:12:40.951936875Z" level=info msg="TearDown network for sandbox \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\" successfully" Feb 13 20:12:40.952166 containerd[1518]: time="2025-02-13T20:12:40.952108134Z" level=info msg="StopPodSandbox for \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\" returns successfully" Feb 13 20:12:40.953074 containerd[1518]: time="2025-02-13T20:12:40.953036980Z" level=info msg="StopPodSandbox for \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\"" Feb 13 20:12:40.954999 containerd[1518]: time="2025-02-13T20:12:40.953384502Z" level=info msg="TearDown network for sandbox \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\" successfully" Feb 13 20:12:40.954999 containerd[1518]: time="2025-02-13T20:12:40.953414041Z" level=info msg="StopPodSandbox for \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\" returns successfully" Feb 13 20:12:40.955142 kubelet[1946]: I0213 20:12:40.953885 1946 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d" Feb 13 20:12:40.955916 containerd[1518]: time="2025-02-13T20:12:40.955885824Z" level=info msg="StopPodSandbox for \"655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d\"" Feb 13 20:12:40.956317 containerd[1518]: time="2025-02-13T20:12:40.956285948Z" level=info msg="Ensure that sandbox 655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d in task-service has been cleanup successfully" Feb 13 20:12:40.956587 containerd[1518]: time="2025-02-13T20:12:40.956437814Z" level=info msg="StopPodSandbox for \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\"" Feb 13 20:12:40.956876 containerd[1518]: time="2025-02-13T20:12:40.956841389Z" level=info msg="TearDown network for sandbox \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\" successfully" Feb 13 20:12:40.957004 containerd[1518]: time="2025-02-13T20:12:40.956980272Z" level=info msg="StopPodSandbox for \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\" returns successfully" Feb 13 20:12:40.957666 containerd[1518]: time="2025-02-13T20:12:40.957636108Z" level=info msg="StopPodSandbox for \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\"" Feb 13 20:12:40.958168 containerd[1518]: time="2025-02-13T20:12:40.958139761Z" level=info msg="TearDown network for sandbox \"655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d\" successfully" Feb 13 20:12:40.958349 containerd[1518]: time="2025-02-13T20:12:40.958323868Z" level=info msg="StopPodSandbox for \"655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d\" returns successfully" Feb 13 20:12:40.958586 
containerd[1518]: time="2025-02-13T20:12:40.958294442Z" level=info msg="TearDown network for sandbox \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\" successfully" Feb 13 20:12:40.958586 containerd[1518]: time="2025-02-13T20:12:40.958525000Z" level=info msg="StopPodSandbox for \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\" returns successfully" Feb 13 20:12:40.960237 containerd[1518]: time="2025-02-13T20:12:40.959335307Z" level=info msg="StopPodSandbox for \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\"" Feb 13 20:12:40.960237 containerd[1518]: time="2025-02-13T20:12:40.959468251Z" level=info msg="TearDown network for sandbox \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\" successfully" Feb 13 20:12:40.960237 containerd[1518]: time="2025-02-13T20:12:40.959510515Z" level=info msg="StopPodSandbox for \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\" returns successfully" Feb 13 20:12:40.960237 containerd[1518]: time="2025-02-13T20:12:40.959615665Z" level=info msg="StopPodSandbox for \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\"" Feb 13 20:12:40.960237 containerd[1518]: time="2025-02-13T20:12:40.959752212Z" level=info msg="TearDown network for sandbox \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\" successfully" Feb 13 20:12:40.960237 containerd[1518]: time="2025-02-13T20:12:40.959772790Z" level=info msg="StopPodSandbox for \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\" returns successfully" Feb 13 20:12:40.961340 containerd[1518]: time="2025-02-13T20:12:40.961037981Z" level=info msg="StopPodSandbox for \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\"" Feb 13 20:12:40.961616 containerd[1518]: time="2025-02-13T20:12:40.961539241Z" level=info msg="StopPodSandbox for \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\"" Feb 13 20:12:40.961828 containerd[1518]: time="2025-02-13T20:12:40.961616781Z" level=info msg="TearDown network for sandbox \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" successfully" Feb 13 20:12:40.961828 containerd[1518]: time="2025-02-13T20:12:40.961645539Z" level=info msg="StopPodSandbox for \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" returns successfully" Feb 13 20:12:40.962233 containerd[1518]: time="2025-02-13T20:12:40.961989009Z" level=info msg="TearDown network for sandbox \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\" successfully" Feb 13 20:12:40.962233 containerd[1518]: time="2025-02-13T20:12:40.962015655Z" level=info msg="StopPodSandbox for \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\" returns successfully" Feb 13 20:12:40.963074 containerd[1518]: time="2025-02-13T20:12:40.962775295Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\"" Feb 13 20:12:40.963337 containerd[1518]: time="2025-02-13T20:12:40.963184100Z" level=info msg="TearDown network for sandbox \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" successfully" Feb 13 20:12:40.963337 containerd[1518]: time="2025-02-13T20:12:40.963277650Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" returns successfully" Feb 13 20:12:40.964204 containerd[1518]: time="2025-02-13T20:12:40.963971855Z" level=info msg="StopPodSandbox for \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\"" 
Feb 13 20:12:40.964642 containerd[1518]: time="2025-02-13T20:12:40.964101945Z" level=info msg="TearDown network for sandbox \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\" successfully" Feb 13 20:12:40.964642 containerd[1518]: time="2025-02-13T20:12:40.964412259Z" level=info msg="StopPodSandbox for \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\" returns successfully" Feb 13 20:12:40.964642 containerd[1518]: time="2025-02-13T20:12:40.964428814Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\"" Feb 13 20:12:40.964642 containerd[1518]: time="2025-02-13T20:12:40.964568369Z" level=info msg="TearDown network for sandbox \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" successfully" Feb 13 20:12:40.964642 containerd[1518]: time="2025-02-13T20:12:40.964615695Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" returns successfully" Feb 13 20:12:40.967120 containerd[1518]: time="2025-02-13T20:12:40.966433131Z" level=info msg="StopPodSandbox for \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\"" Feb 13 20:12:40.967120 containerd[1518]: time="2025-02-13T20:12:40.966553117Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:10,}" Feb 13 20:12:40.967120 containerd[1518]: time="2025-02-13T20:12:40.966617832Z" level=info msg="TearDown network for sandbox \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\" successfully" Feb 13 20:12:40.967120 containerd[1518]: time="2025-02-13T20:12:40.966639107Z" level=info msg="StopPodSandbox for \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\" returns successfully" Feb 13 20:12:40.968043 containerd[1518]: time="2025-02-13T20:12:40.967330221Z" level=info msg="StopPodSandbox for \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\"" Feb 13 20:12:40.968516 containerd[1518]: time="2025-02-13T20:12:40.968161741Z" level=info msg="TearDown network for sandbox \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\" successfully" Feb 13 20:12:40.968516 containerd[1518]: time="2025-02-13T20:12:40.968188505Z" level=info msg="StopPodSandbox for \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\" returns successfully" Feb 13 20:12:40.970660 containerd[1518]: time="2025-02-13T20:12:40.970042262Z" level=info msg="StopPodSandbox for \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\"" Feb 13 20:12:40.970660 containerd[1518]: time="2025-02-13T20:12:40.970163897Z" level=info msg="TearDown network for sandbox \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\" successfully" Feb 13 20:12:40.970660 containerd[1518]: time="2025-02-13T20:12:40.970182674Z" level=info msg="StopPodSandbox for \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\" returns successfully" Feb 13 20:12:40.971448 containerd[1518]: time="2025-02-13T20:12:40.971130384Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-m6z8q,Uid:3ec74cf6-af99-4f93-a98f-a1049d5e3987,Namespace:default,Attempt:7,}" Feb 13 20:12:41.038776 systemd[1]: Started cri-containerd-dda489273a403158a68a92aa95cd78b094e2ed9e3e39b3ec8ffc9fe8a959f21d.scope - libcontainer container dda489273a403158a68a92aa95cd78b094e2ed9e3e39b3ec8ffc9fe8a959f21d. 
Feb 13 20:12:41.143546 containerd[1518]: time="2025-02-13T20:12:41.141046663Z" level=info msg="StartContainer for \"dda489273a403158a68a92aa95cd78b094e2ed9e3e39b3ec8ffc9fe8a959f21d\" returns successfully" Feb 13 20:12:41.159878 containerd[1518]: time="2025-02-13T20:12:41.159799281Z" level=error msg="Failed to destroy network for sandbox \"c7689b45a8b4229367096183fa0c35837b764736d6d7ba1c1c87356bc59abbb8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:41.161660 containerd[1518]: time="2025-02-13T20:12:41.161102044Z" level=error msg="encountered an error cleaning up failed sandbox \"c7689b45a8b4229367096183fa0c35837b764736d6d7ba1c1c87356bc59abbb8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:41.161660 containerd[1518]: time="2025-02-13T20:12:41.161205692Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-m6z8q,Uid:3ec74cf6-af99-4f93-a98f-a1049d5e3987,Namespace:default,Attempt:7,} failed, error" error="failed to setup network for sandbox \"c7689b45a8b4229367096183fa0c35837b764736d6d7ba1c1c87356bc59abbb8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:41.161660 containerd[1518]: time="2025-02-13T20:12:41.161537647Z" level=error msg="Failed to destroy network for sandbox \"9002ae3b981d25c0567feaddc7ef9260a8f2a1313b224392814c7eaa9a9d8460\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:41.162536 kubelet[1946]: E0213 20:12:41.162090 1946 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c7689b45a8b4229367096183fa0c35837b764736d6d7ba1c1c87356bc59abbb8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:41.162536 kubelet[1946]: E0213 20:12:41.162199 1946 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c7689b45a8b4229367096183fa0c35837b764736d6d7ba1c1c87356bc59abbb8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-m6z8q" Feb 13 20:12:41.162536 kubelet[1946]: E0213 20:12:41.162272 1946 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c7689b45a8b4229367096183fa0c35837b764736d6d7ba1c1c87356bc59abbb8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-m6z8q" Feb 13 20:12:41.162753 containerd[1518]: time="2025-02-13T20:12:41.162160624Z" level=error msg="encountered an error cleaning up failed 
sandbox \"9002ae3b981d25c0567feaddc7ef9260a8f2a1313b224392814c7eaa9a9d8460\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:41.162753 containerd[1518]: time="2025-02-13T20:12:41.162247261Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:10,} failed, error" error="failed to setup network for sandbox \"9002ae3b981d25c0567feaddc7ef9260a8f2a1313b224392814c7eaa9a9d8460\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:41.164227 kubelet[1946]: E0213 20:12:41.163856 1946 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9002ae3b981d25c0567feaddc7ef9260a8f2a1313b224392814c7eaa9a9d8460\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 20:12:41.164227 kubelet[1946]: E0213 20:12:41.163916 1946 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9002ae3b981d25c0567feaddc7ef9260a8f2a1313b224392814c7eaa9a9d8460\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:41.164227 kubelet[1946]: E0213 20:12:41.163949 1946 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9002ae3b981d25c0567feaddc7ef9260a8f2a1313b224392814c7eaa9a9d8460\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fjz82" Feb 13 20:12:41.164436 kubelet[1946]: E0213 20:12:41.164021 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fjz82_calico-system(d5b9c20a-bc22-40d8-8d45-69955889fccc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9002ae3b981d25c0567feaddc7ef9260a8f2a1313b224392814c7eaa9a9d8460\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fjz82" podUID="d5b9c20a-bc22-40d8-8d45-69955889fccc" Feb 13 20:12:41.164436 kubelet[1946]: E0213 20:12:41.162421 1946 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-8587fbcb89-m6z8q_default(3ec74cf6-af99-4f93-a98f-a1049d5e3987)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-8587fbcb89-m6z8q_default(3ec74cf6-af99-4f93-a98f-a1049d5e3987)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"c7689b45a8b4229367096183fa0c35837b764736d6d7ba1c1c87356bc59abbb8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-8587fbcb89-m6z8q" podUID="3ec74cf6-af99-4f93-a98f-a1049d5e3987" Feb 13 20:12:41.253679 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Feb 13 20:12:41.253897 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Feb 13 20:12:41.518522 kubelet[1946]: E0213 20:12:41.518400 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:41.793657 systemd[1]: run-netns-cni\x2d32c1e11a\x2d5260\x2d2d93\x2d8238\x2d4014e732e4fd.mount: Deactivated successfully. Feb 13 20:12:41.793840 systemd[1]: run-netns-cni\x2d21dd26da\x2d901b\x2d49b7\x2d7eb4\x2d958560f12e70.mount: Deactivated successfully. Feb 13 20:12:41.965448 kubelet[1946]: I0213 20:12:41.965350 1946 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9002ae3b981d25c0567feaddc7ef9260a8f2a1313b224392814c7eaa9a9d8460" Feb 13 20:12:41.966460 containerd[1518]: time="2025-02-13T20:12:41.966393254Z" level=info msg="StopPodSandbox for \"9002ae3b981d25c0567feaddc7ef9260a8f2a1313b224392814c7eaa9a9d8460\"" Feb 13 20:12:41.967632 containerd[1518]: time="2025-02-13T20:12:41.967273446Z" level=info msg="Ensure that sandbox 9002ae3b981d25c0567feaddc7ef9260a8f2a1313b224392814c7eaa9a9d8460 in task-service has been cleanup successfully" Feb 13 20:12:41.968422 containerd[1518]: time="2025-02-13T20:12:41.968269148Z" level=info msg="TearDown network for sandbox \"9002ae3b981d25c0567feaddc7ef9260a8f2a1313b224392814c7eaa9a9d8460\" successfully" Feb 13 20:12:41.968422 containerd[1518]: time="2025-02-13T20:12:41.968298542Z" level=info msg="StopPodSandbox for \"9002ae3b981d25c0567feaddc7ef9260a8f2a1313b224392814c7eaa9a9d8460\" returns successfully" Feb 13 20:12:41.970613 containerd[1518]: time="2025-02-13T20:12:41.969531423Z" level=info msg="StopPodSandbox for \"421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b\"" Feb 13 20:12:41.971009 containerd[1518]: time="2025-02-13T20:12:41.970827383Z" level=info msg="TearDown network for sandbox \"421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b\" successfully" Feb 13 20:12:41.971009 containerd[1518]: time="2025-02-13T20:12:41.970857492Z" level=info msg="StopPodSandbox for \"421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b\" returns successfully" Feb 13 20:12:41.971288 systemd[1]: run-netns-cni\x2d633e4985\x2d7f9e\x2d9651\x2d03c4\x2d95dff4aee399.mount: Deactivated successfully. 
Feb 13 20:12:41.971981 containerd[1518]: time="2025-02-13T20:12:41.971934047Z" level=info msg="StopPodSandbox for \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\"" Feb 13 20:12:41.972386 containerd[1518]: time="2025-02-13T20:12:41.972038339Z" level=info msg="TearDown network for sandbox \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\" successfully" Feb 13 20:12:41.972386 containerd[1518]: time="2025-02-13T20:12:41.972064493Z" level=info msg="StopPodSandbox for \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\" returns successfully" Feb 13 20:12:41.973234 containerd[1518]: time="2025-02-13T20:12:41.973036712Z" level=info msg="StopPodSandbox for \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\"" Feb 13 20:12:41.973234 containerd[1518]: time="2025-02-13T20:12:41.973143196Z" level=info msg="TearDown network for sandbox \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\" successfully" Feb 13 20:12:41.973234 containerd[1518]: time="2025-02-13T20:12:41.973163241Z" level=info msg="StopPodSandbox for \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\" returns successfully" Feb 13 20:12:41.974407 containerd[1518]: time="2025-02-13T20:12:41.974232047Z" level=info msg="StopPodSandbox for \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\"" Feb 13 20:12:41.974407 containerd[1518]: time="2025-02-13T20:12:41.974338153Z" level=info msg="TearDown network for sandbox \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\" successfully" Feb 13 20:12:41.974407 containerd[1518]: time="2025-02-13T20:12:41.974356732Z" level=info msg="StopPodSandbox for \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\" returns successfully" Feb 13 20:12:41.975176 containerd[1518]: time="2025-02-13T20:12:41.974955790Z" level=info msg="StopPodSandbox for \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\"" Feb 13 20:12:41.975759 containerd[1518]: time="2025-02-13T20:12:41.975306148Z" level=info msg="TearDown network for sandbox \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\" successfully" Feb 13 20:12:41.975759 containerd[1518]: time="2025-02-13T20:12:41.975709330Z" level=info msg="StopPodSandbox for \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\" returns successfully" Feb 13 20:12:41.976569 containerd[1518]: time="2025-02-13T20:12:41.976307375Z" level=info msg="StopPodSandbox for \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\"" Feb 13 20:12:41.976569 containerd[1518]: time="2025-02-13T20:12:41.976414381Z" level=info msg="TearDown network for sandbox \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\" successfully" Feb 13 20:12:41.976569 containerd[1518]: time="2025-02-13T20:12:41.976432798Z" level=info msg="StopPodSandbox for \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\" returns successfully" Feb 13 20:12:41.977587 containerd[1518]: time="2025-02-13T20:12:41.977179810Z" level=info msg="StopPodSandbox for \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\"" Feb 13 20:12:41.977587 containerd[1518]: time="2025-02-13T20:12:41.977346873Z" level=info msg="TearDown network for sandbox \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\" successfully" Feb 13 20:12:41.977587 containerd[1518]: time="2025-02-13T20:12:41.977366604Z" level=info msg="StopPodSandbox for \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\" 
returns successfully" Feb 13 20:12:41.978193 containerd[1518]: time="2025-02-13T20:12:41.978159650Z" level=info msg="StopPodSandbox for \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\"" Feb 13 20:12:41.979193 containerd[1518]: time="2025-02-13T20:12:41.978270415Z" level=info msg="TearDown network for sandbox \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" successfully" Feb 13 20:12:41.979193 containerd[1518]: time="2025-02-13T20:12:41.978295679Z" level=info msg="StopPodSandbox for \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" returns successfully" Feb 13 20:12:41.979333 kubelet[1946]: I0213 20:12:41.978387 1946 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7689b45a8b4229367096183fa0c35837b764736d6d7ba1c1c87356bc59abbb8" Feb 13 20:12:41.979807 containerd[1518]: time="2025-02-13T20:12:41.979594821Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\"" Feb 13 20:12:41.979807 containerd[1518]: time="2025-02-13T20:12:41.979636829Z" level=info msg="StopPodSandbox for \"c7689b45a8b4229367096183fa0c35837b764736d6d7ba1c1c87356bc59abbb8\"" Feb 13 20:12:41.979807 containerd[1518]: time="2025-02-13T20:12:41.979721168Z" level=info msg="TearDown network for sandbox \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" successfully" Feb 13 20:12:41.979807 containerd[1518]: time="2025-02-13T20:12:41.979741022Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" returns successfully" Feb 13 20:12:41.980075 containerd[1518]: time="2025-02-13T20:12:41.979872293Z" level=info msg="Ensure that sandbox c7689b45a8b4229367096183fa0c35837b764736d6d7ba1c1c87356bc59abbb8 in task-service has been cleanup successfully" Feb 13 20:12:41.980137 containerd[1518]: time="2025-02-13T20:12:41.980095789Z" level=info msg="TearDown network for sandbox \"c7689b45a8b4229367096183fa0c35837b764736d6d7ba1c1c87356bc59abbb8\" successfully" Feb 13 20:12:41.982510 containerd[1518]: time="2025-02-13T20:12:41.980550426Z" level=info msg="StopPodSandbox for \"c7689b45a8b4229367096183fa0c35837b764736d6d7ba1c1c87356bc59abbb8\" returns successfully" Feb 13 20:12:41.982712 containerd[1518]: time="2025-02-13T20:12:41.982650762Z" level=info msg="StopPodSandbox for \"655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d\"" Feb 13 20:12:41.982800 containerd[1518]: time="2025-02-13T20:12:41.982777680Z" level=info msg="TearDown network for sandbox \"655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d\" successfully" Feb 13 20:12:41.982848 containerd[1518]: time="2025-02-13T20:12:41.982797678Z" level=info msg="StopPodSandbox for \"655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d\" returns successfully" Feb 13 20:12:41.983207 containerd[1518]: time="2025-02-13T20:12:41.983161792Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\"" Feb 13 20:12:41.983327 containerd[1518]: time="2025-02-13T20:12:41.983302058Z" level=info msg="TearDown network for sandbox \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" successfully" Feb 13 20:12:41.983403 containerd[1518]: time="2025-02-13T20:12:41.983326330Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" returns successfully" Feb 13 20:12:41.983860 containerd[1518]: time="2025-02-13T20:12:41.983825192Z" level=info msg="StopPodSandbox 
for \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\"" Feb 13 20:12:41.984050 containerd[1518]: time="2025-02-13T20:12:41.984019430Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:11,}" Feb 13 20:12:41.984161 systemd[1]: run-netns-cni\x2ddd913b96\x2de4e1\x2debb6\x2df1b0\x2dad30944d8200.mount: Deactivated successfully. Feb 13 20:12:41.985114 containerd[1518]: time="2025-02-13T20:12:41.985081600Z" level=info msg="TearDown network for sandbox \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\" successfully" Feb 13 20:12:41.985204 containerd[1518]: time="2025-02-13T20:12:41.985110165Z" level=info msg="StopPodSandbox for \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\" returns successfully" Feb 13 20:12:41.986525 containerd[1518]: time="2025-02-13T20:12:41.985863002Z" level=info msg="StopPodSandbox for \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\"" Feb 13 20:12:41.986525 containerd[1518]: time="2025-02-13T20:12:41.986178099Z" level=info msg="TearDown network for sandbox \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\" successfully" Feb 13 20:12:41.986525 containerd[1518]: time="2025-02-13T20:12:41.986374037Z" level=info msg="StopPodSandbox for \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\" returns successfully" Feb 13 20:12:41.988204 containerd[1518]: time="2025-02-13T20:12:41.987834770Z" level=info msg="StopPodSandbox for \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\"" Feb 13 20:12:41.988204 containerd[1518]: time="2025-02-13T20:12:41.987941346Z" level=info msg="TearDown network for sandbox \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\" successfully" Feb 13 20:12:41.988204 containerd[1518]: time="2025-02-13T20:12:41.987962221Z" level=info msg="StopPodSandbox for \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\" returns successfully" Feb 13 20:12:41.988407 kubelet[1946]: I0213 20:12:41.987904 1946 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-79pxv" podStartSLOduration=4.053795789 podStartE2EDuration="26.987865317s" podCreationTimestamp="2025-02-13 20:12:15 +0000 UTC" firstStartedPulling="2025-02-13 20:12:17.894595528 +0000 UTC m=+3.618349380" lastFinishedPulling="2025-02-13 20:12:40.82866506 +0000 UTC m=+26.552418908" observedRunningTime="2025-02-13 20:12:41.987140962 +0000 UTC m=+27.710894833" watchObservedRunningTime="2025-02-13 20:12:41.987865317 +0000 UTC m=+27.711619184" Feb 13 20:12:41.989076 containerd[1518]: time="2025-02-13T20:12:41.988949928Z" level=info msg="StopPodSandbox for \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\"" Feb 13 20:12:41.989506 containerd[1518]: time="2025-02-13T20:12:41.989458448Z" level=info msg="TearDown network for sandbox \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\" successfully" Feb 13 20:12:41.989823 containerd[1518]: time="2025-02-13T20:12:41.989514756Z" level=info msg="StopPodSandbox for \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\" returns successfully" Feb 13 20:12:41.989879 containerd[1518]: time="2025-02-13T20:12:41.989823020Z" level=info msg="StopPodSandbox for \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\"" Feb 13 20:12:41.990128 containerd[1518]: time="2025-02-13T20:12:41.989921941Z" level=info msg="TearDown network for 
sandbox \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\" successfully" Feb 13 20:12:41.990128 containerd[1518]: time="2025-02-13T20:12:41.989947150Z" level=info msg="StopPodSandbox for \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\" returns successfully" Feb 13 20:12:41.990974 containerd[1518]: time="2025-02-13T20:12:41.990737390Z" level=info msg="StopPodSandbox for \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\"" Feb 13 20:12:41.990974 containerd[1518]: time="2025-02-13T20:12:41.990840946Z" level=info msg="TearDown network for sandbox \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\" successfully" Feb 13 20:12:41.990974 containerd[1518]: time="2025-02-13T20:12:41.990858878Z" level=info msg="StopPodSandbox for \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\" returns successfully" Feb 13 20:12:41.992517 containerd[1518]: time="2025-02-13T20:12:41.992388746Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-m6z8q,Uid:3ec74cf6-af99-4f93-a98f-a1049d5e3987,Namespace:default,Attempt:8,}" Feb 13 20:12:42.329059 systemd-networkd[1441]: calieb8861681ac: Link UP Feb 13 20:12:42.332366 systemd-networkd[1441]: calieb8861681ac: Gained carrier Feb 13 20:12:42.346176 containerd[1518]: 2025-02-13 20:12:42.075 [INFO][3177] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 20:12:42.346176 containerd[1518]: 2025-02-13 20:12:42.089 [INFO][3177] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {10.244.13.70-k8s-nginx--deployment--8587fbcb89--m6z8q-eth0 nginx-deployment-8587fbcb89- default 3ec74cf6-af99-4f93-a98f-a1049d5e3987 1167 0 2025-02-13 20:12:33 +0000 UTC map[app:nginx pod-template-hash:8587fbcb89 projectcalico.org/namespace:default projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:default] map[] [] [] []} {k8s 10.244.13.70 nginx-deployment-8587fbcb89-m6z8q eth0 default [] [] [kns.default ksa.default.default] calieb8861681ac [] []}} ContainerID="dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0" Namespace="default" Pod="nginx-deployment-8587fbcb89-m6z8q" WorkloadEndpoint="10.244.13.70-k8s-nginx--deployment--8587fbcb89--m6z8q-" Feb 13 20:12:42.346176 containerd[1518]: 2025-02-13 20:12:42.089 [INFO][3177] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0" Namespace="default" Pod="nginx-deployment-8587fbcb89-m6z8q" WorkloadEndpoint="10.244.13.70-k8s-nginx--deployment--8587fbcb89--m6z8q-eth0" Feb 13 20:12:42.346176 containerd[1518]: 2025-02-13 20:12:42.146 [INFO][3188] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0" HandleID="k8s-pod-network.dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0" Workload="10.244.13.70-k8s-nginx--deployment--8587fbcb89--m6z8q-eth0" Feb 13 20:12:42.346176 containerd[1518]: 2025-02-13 20:12:42.262 [INFO][3188] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0" HandleID="k8s-pod-network.dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0" Workload="10.244.13.70-k8s-nginx--deployment--8587fbcb89--m6z8q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000319a20), Attrs:map[string]string{"namespace":"default", "node":"10.244.13.70", 
"pod":"nginx-deployment-8587fbcb89-m6z8q", "timestamp":"2025-02-13 20:12:42.146870192 +0000 UTC"}, Hostname:"10.244.13.70", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 20:12:42.346176 containerd[1518]: 2025-02-13 20:12:42.262 [INFO][3188] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 20:12:42.346176 containerd[1518]: 2025-02-13 20:12:42.263 [INFO][3188] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 20:12:42.346176 containerd[1518]: 2025-02-13 20:12:42.263 [INFO][3188] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '10.244.13.70' Feb 13 20:12:42.346176 containerd[1518]: 2025-02-13 20:12:42.268 [INFO][3188] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0" host="10.244.13.70" Feb 13 20:12:42.346176 containerd[1518]: 2025-02-13 20:12:42.275 [INFO][3188] ipam/ipam.go 372: Looking up existing affinities for host host="10.244.13.70" Feb 13 20:12:42.346176 containerd[1518]: 2025-02-13 20:12:42.281 [INFO][3188] ipam/ipam.go 489: Trying affinity for 192.168.56.128/26 host="10.244.13.70" Feb 13 20:12:42.346176 containerd[1518]: 2025-02-13 20:12:42.284 [INFO][3188] ipam/ipam.go 155: Attempting to load block cidr=192.168.56.128/26 host="10.244.13.70" Feb 13 20:12:42.346176 containerd[1518]: 2025-02-13 20:12:42.287 [INFO][3188] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.56.128/26 host="10.244.13.70" Feb 13 20:12:42.346176 containerd[1518]: 2025-02-13 20:12:42.287 [INFO][3188] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.56.128/26 handle="k8s-pod-network.dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0" host="10.244.13.70" Feb 13 20:12:42.346176 containerd[1518]: 2025-02-13 20:12:42.289 [INFO][3188] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0 Feb 13 20:12:42.346176 containerd[1518]: 2025-02-13 20:12:42.298 [INFO][3188] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.56.128/26 handle="k8s-pod-network.dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0" host="10.244.13.70" Feb 13 20:12:42.346176 containerd[1518]: 2025-02-13 20:12:42.304 [INFO][3188] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.56.129/26] block=192.168.56.128/26 handle="k8s-pod-network.dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0" host="10.244.13.70" Feb 13 20:12:42.346176 containerd[1518]: 2025-02-13 20:12:42.304 [INFO][3188] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.56.129/26] handle="k8s-pod-network.dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0" host="10.244.13.70" Feb 13 20:12:42.346176 containerd[1518]: 2025-02-13 20:12:42.304 [INFO][3188] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 20:12:42.346176 containerd[1518]: 2025-02-13 20:12:42.304 [INFO][3188] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.129/26] IPv6=[] ContainerID="dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0" HandleID="k8s-pod-network.dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0" Workload="10.244.13.70-k8s-nginx--deployment--8587fbcb89--m6z8q-eth0" Feb 13 20:12:42.347281 containerd[1518]: 2025-02-13 20:12:42.308 [INFO][3177] cni-plugin/k8s.go 386: Populated endpoint ContainerID="dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0" Namespace="default" Pod="nginx-deployment-8587fbcb89-m6z8q" WorkloadEndpoint="10.244.13.70-k8s-nginx--deployment--8587fbcb89--m6z8q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.244.13.70-k8s-nginx--deployment--8587fbcb89--m6z8q-eth0", GenerateName:"nginx-deployment-8587fbcb89-", Namespace:"default", SelfLink:"", UID:"3ec74cf6-af99-4f93-a98f-a1049d5e3987", ResourceVersion:"1167", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 20, 12, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nginx", "pod-template-hash":"8587fbcb89", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.244.13.70", ContainerID:"", Pod:"nginx-deployment-8587fbcb89-m6z8q", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.56.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"calieb8861681ac", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 20:12:42.347281 containerd[1518]: 2025-02-13 20:12:42.308 [INFO][3177] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.56.129/32] ContainerID="dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0" Namespace="default" Pod="nginx-deployment-8587fbcb89-m6z8q" WorkloadEndpoint="10.244.13.70-k8s-nginx--deployment--8587fbcb89--m6z8q-eth0" Feb 13 20:12:42.347281 containerd[1518]: 2025-02-13 20:12:42.308 [INFO][3177] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calieb8861681ac ContainerID="dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0" Namespace="default" Pod="nginx-deployment-8587fbcb89-m6z8q" WorkloadEndpoint="10.244.13.70-k8s-nginx--deployment--8587fbcb89--m6z8q-eth0" Feb 13 20:12:42.347281 containerd[1518]: 2025-02-13 20:12:42.333 [INFO][3177] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0" Namespace="default" Pod="nginx-deployment-8587fbcb89-m6z8q" WorkloadEndpoint="10.244.13.70-k8s-nginx--deployment--8587fbcb89--m6z8q-eth0" Feb 13 20:12:42.347281 containerd[1518]: 2025-02-13 20:12:42.334 [INFO][3177] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0" Namespace="default" Pod="nginx-deployment-8587fbcb89-m6z8q" WorkloadEndpoint="10.244.13.70-k8s-nginx--deployment--8587fbcb89--m6z8q-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.244.13.70-k8s-nginx--deployment--8587fbcb89--m6z8q-eth0", GenerateName:"nginx-deployment-8587fbcb89-", Namespace:"default", SelfLink:"", UID:"3ec74cf6-af99-4f93-a98f-a1049d5e3987", ResourceVersion:"1167", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 20, 12, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nginx", "pod-template-hash":"8587fbcb89", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.244.13.70", ContainerID:"dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0", Pod:"nginx-deployment-8587fbcb89-m6z8q", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.56.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"calieb8861681ac", MAC:"52:30:5a:10:e8:67", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 20:12:42.347281 containerd[1518]: 2025-02-13 20:12:42.343 [INFO][3177] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0" Namespace="default" Pod="nginx-deployment-8587fbcb89-m6z8q" WorkloadEndpoint="10.244.13.70-k8s-nginx--deployment--8587fbcb89--m6z8q-eth0" Feb 13 20:12:42.381144 containerd[1518]: time="2025-02-13T20:12:42.379320592Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 20:12:42.381144 containerd[1518]: time="2025-02-13T20:12:42.379466499Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 20:12:42.381144 containerd[1518]: time="2025-02-13T20:12:42.379663869Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 20:12:42.381144 containerd[1518]: time="2025-02-13T20:12:42.379819886Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 20:12:42.412790 systemd[1]: Started cri-containerd-dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0.scope - libcontainer container dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0. 
Feb 13 20:12:42.431785 systemd-networkd[1441]: cali7357dbb829b: Link UP Feb 13 20:12:42.433636 systemd-networkd[1441]: cali7357dbb829b: Gained carrier Feb 13 20:12:42.462970 containerd[1518]: 2025-02-13 20:12:42.043 [INFO][3166] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 20:12:42.462970 containerd[1518]: 2025-02-13 20:12:42.089 [INFO][3166] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {10.244.13.70-k8s-csi--node--driver--fjz82-eth0 csi-node-driver- calico-system d5b9c20a-bc22-40d8-8d45-69955889fccc 1068 0 2025-02-13 20:12:15 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:56747c9949 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s 10.244.13.70 csi-node-driver-fjz82 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali7357dbb829b [] []}} ContainerID="234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9" Namespace="calico-system" Pod="csi-node-driver-fjz82" WorkloadEndpoint="10.244.13.70-k8s-csi--node--driver--fjz82-" Feb 13 20:12:42.462970 containerd[1518]: 2025-02-13 20:12:42.089 [INFO][3166] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9" Namespace="calico-system" Pod="csi-node-driver-fjz82" WorkloadEndpoint="10.244.13.70-k8s-csi--node--driver--fjz82-eth0" Feb 13 20:12:42.462970 containerd[1518]: 2025-02-13 20:12:42.147 [INFO][3193] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9" HandleID="k8s-pod-network.234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9" Workload="10.244.13.70-k8s-csi--node--driver--fjz82-eth0" Feb 13 20:12:42.462970 containerd[1518]: 2025-02-13 20:12:42.263 [INFO][3193] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9" HandleID="k8s-pod-network.234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9" Workload="10.244.13.70-k8s-csi--node--driver--fjz82-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000312b60), Attrs:map[string]string{"namespace":"calico-system", "node":"10.244.13.70", "pod":"csi-node-driver-fjz82", "timestamp":"2025-02-13 20:12:42.147543765 +0000 UTC"}, Hostname:"10.244.13.70", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 20:12:42.462970 containerd[1518]: 2025-02-13 20:12:42.263 [INFO][3193] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 20:12:42.462970 containerd[1518]: 2025-02-13 20:12:42.304 [INFO][3193] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 20:12:42.462970 containerd[1518]: 2025-02-13 20:12:42.304 [INFO][3193] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '10.244.13.70' Feb 13 20:12:42.462970 containerd[1518]: 2025-02-13 20:12:42.370 [INFO][3193] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9" host="10.244.13.70" Feb 13 20:12:42.462970 containerd[1518]: 2025-02-13 20:12:42.377 [INFO][3193] ipam/ipam.go 372: Looking up existing affinities for host host="10.244.13.70" Feb 13 20:12:42.462970 containerd[1518]: 2025-02-13 20:12:42.385 [INFO][3193] ipam/ipam.go 489: Trying affinity for 192.168.56.128/26 host="10.244.13.70" Feb 13 20:12:42.462970 containerd[1518]: 2025-02-13 20:12:42.387 [INFO][3193] ipam/ipam.go 155: Attempting to load block cidr=192.168.56.128/26 host="10.244.13.70" Feb 13 20:12:42.462970 containerd[1518]: 2025-02-13 20:12:42.391 [INFO][3193] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.56.128/26 host="10.244.13.70" Feb 13 20:12:42.462970 containerd[1518]: 2025-02-13 20:12:42.391 [INFO][3193] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.56.128/26 handle="k8s-pod-network.234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9" host="10.244.13.70" Feb 13 20:12:42.462970 containerd[1518]: 2025-02-13 20:12:42.393 [INFO][3193] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9 Feb 13 20:12:42.462970 containerd[1518]: 2025-02-13 20:12:42.404 [INFO][3193] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.56.128/26 handle="k8s-pod-network.234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9" host="10.244.13.70" Feb 13 20:12:42.462970 containerd[1518]: 2025-02-13 20:12:42.422 [INFO][3193] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.56.130/26] block=192.168.56.128/26 handle="k8s-pod-network.234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9" host="10.244.13.70" Feb 13 20:12:42.462970 containerd[1518]: 2025-02-13 20:12:42.423 [INFO][3193] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.56.130/26] handle="k8s-pod-network.234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9" host="10.244.13.70" Feb 13 20:12:42.462970 containerd[1518]: 2025-02-13 20:12:42.423 [INFO][3193] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 20:12:42.462970 containerd[1518]: 2025-02-13 20:12:42.423 [INFO][3193] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.130/26] IPv6=[] ContainerID="234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9" HandleID="k8s-pod-network.234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9" Workload="10.244.13.70-k8s-csi--node--driver--fjz82-eth0" Feb 13 20:12:42.465124 containerd[1518]: 2025-02-13 20:12:42.426 [INFO][3166] cni-plugin/k8s.go 386: Populated endpoint ContainerID="234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9" Namespace="calico-system" Pod="csi-node-driver-fjz82" WorkloadEndpoint="10.244.13.70-k8s-csi--node--driver--fjz82-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.244.13.70-k8s-csi--node--driver--fjz82-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d5b9c20a-bc22-40d8-8d45-69955889fccc", ResourceVersion:"1068", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 20, 12, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"56747c9949", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.244.13.70", ContainerID:"", Pod:"csi-node-driver-fjz82", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.56.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7357dbb829b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 20:12:42.465124 containerd[1518]: 2025-02-13 20:12:42.426 [INFO][3166] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.56.130/32] ContainerID="234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9" Namespace="calico-system" Pod="csi-node-driver-fjz82" WorkloadEndpoint="10.244.13.70-k8s-csi--node--driver--fjz82-eth0" Feb 13 20:12:42.465124 containerd[1518]: 2025-02-13 20:12:42.426 [INFO][3166] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7357dbb829b ContainerID="234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9" Namespace="calico-system" Pod="csi-node-driver-fjz82" WorkloadEndpoint="10.244.13.70-k8s-csi--node--driver--fjz82-eth0" Feb 13 20:12:42.465124 containerd[1518]: 2025-02-13 20:12:42.436 [INFO][3166] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9" Namespace="calico-system" Pod="csi-node-driver-fjz82" WorkloadEndpoint="10.244.13.70-k8s-csi--node--driver--fjz82-eth0" Feb 13 20:12:42.465124 containerd[1518]: 2025-02-13 20:12:42.439 [INFO][3166] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9" Namespace="calico-system" Pod="csi-node-driver-fjz82" 
WorkloadEndpoint="10.244.13.70-k8s-csi--node--driver--fjz82-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.244.13.70-k8s-csi--node--driver--fjz82-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d5b9c20a-bc22-40d8-8d45-69955889fccc", ResourceVersion:"1068", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 20, 12, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"56747c9949", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.244.13.70", ContainerID:"234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9", Pod:"csi-node-driver-fjz82", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.56.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7357dbb829b", MAC:"7e:ff:7d:08:a3:80", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 20:12:42.465124 containerd[1518]: 2025-02-13 20:12:42.459 [INFO][3166] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9" Namespace="calico-system" Pod="csi-node-driver-fjz82" WorkloadEndpoint="10.244.13.70-k8s-csi--node--driver--fjz82-eth0" Feb 13 20:12:42.492671 containerd[1518]: time="2025-02-13T20:12:42.492552707Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-m6z8q,Uid:3ec74cf6-af99-4f93-a98f-a1049d5e3987,Namespace:default,Attempt:8,} returns sandbox id \"dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0\"" Feb 13 20:12:42.497509 containerd[1518]: time="2025-02-13T20:12:42.497244446Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\"" Feb 13 20:12:42.502407 containerd[1518]: time="2025-02-13T20:12:42.502314194Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 20:12:42.502625 containerd[1518]: time="2025-02-13T20:12:42.502583537Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 20:12:42.502803 containerd[1518]: time="2025-02-13T20:12:42.502764531Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 20:12:42.503041 containerd[1518]: time="2025-02-13T20:12:42.502991621Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 20:12:42.520150 kubelet[1946]: E0213 20:12:42.518720 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:42.526697 systemd[1]: Started cri-containerd-234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9.scope - libcontainer container 234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9. Feb 13 20:12:42.558779 containerd[1518]: time="2025-02-13T20:12:42.558711908Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fjz82,Uid:d5b9c20a-bc22-40d8-8d45-69955889fccc,Namespace:calico-system,Attempt:11,} returns sandbox id \"234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9\"" Feb 13 20:12:43.034538 kernel: bpftool[3420]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Feb 13 20:12:43.361299 systemd-networkd[1441]: vxlan.calico: Link UP Feb 13 20:12:43.361311 systemd-networkd[1441]: vxlan.calico: Gained carrier Feb 13 20:12:43.510012 systemd-networkd[1441]: cali7357dbb829b: Gained IPv6LL Feb 13 20:12:43.519843 kubelet[1946]: E0213 20:12:43.519747 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:43.703691 systemd-networkd[1441]: calieb8861681ac: Gained IPv6LL Feb 13 20:12:44.520785 kubelet[1946]: E0213 20:12:44.520705 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:45.048690 systemd-networkd[1441]: vxlan.calico: Gained IPv6LL Feb 13 20:12:45.521577 kubelet[1946]: E0213 20:12:45.521374 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:45.642329 kubelet[1946]: I0213 20:12:45.641364 1946 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 20:12:46.522089 kubelet[1946]: E0213 20:12:46.521979 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:46.523198 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3306825186.mount: Deactivated successfully. 
Feb 13 20:12:47.523110 kubelet[1946]: E0213 20:12:47.523029 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:48.524118 kubelet[1946]: E0213 20:12:48.524025 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:49.525405 kubelet[1946]: E0213 20:12:49.525269 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:50.526641 kubelet[1946]: E0213 20:12:50.526508 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:50.849786 containerd[1518]: time="2025-02-13T20:12:50.848029193Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/nginx:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:12:50.849786 containerd[1518]: time="2025-02-13T20:12:50.849315161Z" level=info msg="stop pulling image ghcr.io/flatcar/nginx:latest: active requests=0, bytes read=73054493" Feb 13 20:12:50.850998 containerd[1518]: time="2025-02-13T20:12:50.850960965Z" level=info msg="ImageCreate event name:\"sha256:fe94eb5f0c9c8d0ca277aa8cd5940f1faf5970175bf373932babc578545deda8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:12:50.855032 containerd[1518]: time="2025-02-13T20:12:50.854997754Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/nginx@sha256:d9bc3da999da9f147f1277c7b18292486847e8f39f95fcf81d914d0c22815faf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:12:50.856620 containerd[1518]: time="2025-02-13T20:12:50.856565513Z" level=info msg="Pulled image \"ghcr.io/flatcar/nginx:latest\" with image id \"sha256:fe94eb5f0c9c8d0ca277aa8cd5940f1faf5970175bf373932babc578545deda8\", repo tag \"ghcr.io/flatcar/nginx:latest\", repo digest \"ghcr.io/flatcar/nginx@sha256:d9bc3da999da9f147f1277c7b18292486847e8f39f95fcf81d914d0c22815faf\", size \"73054371\" in 8.359272509s" Feb 13 20:12:50.856709 containerd[1518]: time="2025-02-13T20:12:50.856635093Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\" returns image reference \"sha256:fe94eb5f0c9c8d0ca277aa8cd5940f1faf5970175bf373932babc578545deda8\"" Feb 13 20:12:50.859883 containerd[1518]: time="2025-02-13T20:12:50.859637004Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Feb 13 20:12:50.862117 containerd[1518]: time="2025-02-13T20:12:50.861742266Z" level=info msg="CreateContainer within sandbox \"dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0\" for container &ContainerMetadata{Name:nginx,Attempt:0,}" Feb 13 20:12:50.897965 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1265839601.mount: Deactivated successfully. 
Feb 13 20:12:50.900206 containerd[1518]: time="2025-02-13T20:12:50.900008743Z" level=info msg="CreateContainer within sandbox \"dd6b1554bfaa2ae391c116dbb6d153bd368883595938e5d9b8a0fcb9ac6448f0\" for &ContainerMetadata{Name:nginx,Attempt:0,} returns container id \"30cbc528fe86360505ab0373227aa51acad4bfd337b8b3623659d96a10d3aac2\"" Feb 13 20:12:50.901156 containerd[1518]: time="2025-02-13T20:12:50.901121078Z" level=info msg="StartContainer for \"30cbc528fe86360505ab0373227aa51acad4bfd337b8b3623659d96a10d3aac2\"" Feb 13 20:12:50.955826 systemd[1]: Started cri-containerd-30cbc528fe86360505ab0373227aa51acad4bfd337b8b3623659d96a10d3aac2.scope - libcontainer container 30cbc528fe86360505ab0373227aa51acad4bfd337b8b3623659d96a10d3aac2. Feb 13 20:12:50.994284 containerd[1518]: time="2025-02-13T20:12:50.994119805Z" level=info msg="StartContainer for \"30cbc528fe86360505ab0373227aa51acad4bfd337b8b3623659d96a10d3aac2\" returns successfully" Feb 13 20:12:51.057587 kubelet[1946]: I0213 20:12:51.057413 1946 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="default/nginx-deployment-8587fbcb89-m6z8q" podStartSLOduration=9.695402314 podStartE2EDuration="18.05737763s" podCreationTimestamp="2025-02-13 20:12:33 +0000 UTC" firstStartedPulling="2025-02-13 20:12:42.496374732 +0000 UTC m=+28.220128584" lastFinishedPulling="2025-02-13 20:12:50.858350053 +0000 UTC m=+36.582103900" observedRunningTime="2025-02-13 20:12:51.057140823 +0000 UTC m=+36.780894672" watchObservedRunningTime="2025-02-13 20:12:51.05737763 +0000 UTC m=+36.781131491" Feb 13 20:12:51.527852 kubelet[1946]: E0213 20:12:51.527746 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:52.528278 kubelet[1946]: E0213 20:12:52.528214 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:52.580536 containerd[1518]: time="2025-02-13T20:12:52.579283833Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:12:52.580536 containerd[1518]: time="2025-02-13T20:12:52.580513361Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7902632" Feb 13 20:12:52.581857 containerd[1518]: time="2025-02-13T20:12:52.581781084Z" level=info msg="ImageCreate event name:\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:12:52.585607 containerd[1518]: time="2025-02-13T20:12:52.585513415Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:12:52.586854 containerd[1518]: time="2025-02-13T20:12:52.586453805Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"9395716\" in 1.726772327s" Feb 13 20:12:52.586854 containerd[1518]: time="2025-02-13T20:12:52.586514135Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Feb 13 
20:12:52.589247 containerd[1518]: time="2025-02-13T20:12:52.589212932Z" level=info msg="CreateContainer within sandbox \"234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Feb 13 20:12:52.635795 containerd[1518]: time="2025-02-13T20:12:52.635724441Z" level=info msg="CreateContainer within sandbox \"234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"e4c2f0fc912cac7ba8bc2b1bd6092b5a3c514021899539ee370adce730353c5e\"" Feb 13 20:12:52.636661 containerd[1518]: time="2025-02-13T20:12:52.636595121Z" level=info msg="StartContainer for \"e4c2f0fc912cac7ba8bc2b1bd6092b5a3c514021899539ee370adce730353c5e\"" Feb 13 20:12:52.687875 systemd[1]: Started cri-containerd-e4c2f0fc912cac7ba8bc2b1bd6092b5a3c514021899539ee370adce730353c5e.scope - libcontainer container e4c2f0fc912cac7ba8bc2b1bd6092b5a3c514021899539ee370adce730353c5e. Feb 13 20:12:52.735470 containerd[1518]: time="2025-02-13T20:12:52.735411840Z" level=info msg="StartContainer for \"e4c2f0fc912cac7ba8bc2b1bd6092b5a3c514021899539ee370adce730353c5e\" returns successfully" Feb 13 20:12:52.737687 containerd[1518]: time="2025-02-13T20:12:52.737654209Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Feb 13 20:12:53.529052 kubelet[1946]: E0213 20:12:53.528977 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:54.487441 containerd[1518]: time="2025-02-13T20:12:54.487216075Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:12:54.489181 containerd[1518]: time="2025-02-13T20:12:54.488649327Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=10501081" Feb 13 20:12:54.491530 containerd[1518]: time="2025-02-13T20:12:54.489639593Z" level=info msg="ImageCreate event name:\"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:12:54.493313 containerd[1518]: time="2025-02-13T20:12:54.493260356Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:12:54.495640 containerd[1518]: time="2025-02-13T20:12:54.495586323Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11994117\" in 1.757829127s" Feb 13 20:12:54.495640 containerd[1518]: time="2025-02-13T20:12:54.495635988Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\"" Feb 13 20:12:54.498272 containerd[1518]: time="2025-02-13T20:12:54.498220601Z" level=info msg="CreateContainer within sandbox \"234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9\" for container 
&ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Feb 13 20:12:54.517255 containerd[1518]: time="2025-02-13T20:12:54.517074809Z" level=info msg="CreateContainer within sandbox \"234ad3bedac238ed10b6f9d43c09cfeb7f428dc173602d461d05200d946319c9\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"481fb3294c6f504f5fdedc5c401fba35a6dd97faf5036d72032e9c168bc287f4\"" Feb 13 20:12:54.518421 containerd[1518]: time="2025-02-13T20:12:54.517848806Z" level=info msg="StartContainer for \"481fb3294c6f504f5fdedc5c401fba35a6dd97faf5036d72032e9c168bc287f4\"" Feb 13 20:12:54.529931 kubelet[1946]: E0213 20:12:54.529888 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:54.572900 systemd[1]: Started cri-containerd-481fb3294c6f504f5fdedc5c401fba35a6dd97faf5036d72032e9c168bc287f4.scope - libcontainer container 481fb3294c6f504f5fdedc5c401fba35a6dd97faf5036d72032e9c168bc287f4. Feb 13 20:12:54.623305 containerd[1518]: time="2025-02-13T20:12:54.623092344Z" level=info msg="StartContainer for \"481fb3294c6f504f5fdedc5c401fba35a6dd97faf5036d72032e9c168bc287f4\" returns successfully" Feb 13 20:12:54.653022 kubelet[1946]: I0213 20:12:54.652819 1946 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Feb 13 20:12:54.653022 kubelet[1946]: I0213 20:12:54.652895 1946 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Feb 13 20:12:55.492848 kubelet[1946]: E0213 20:12:55.492750 1946 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:55.531692 kubelet[1946]: E0213 20:12:55.531608 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:56.532898 kubelet[1946]: E0213 20:12:56.532754 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:57.533322 kubelet[1946]: E0213 20:12:57.533247 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:58.533928 kubelet[1946]: E0213 20:12:58.533856 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:12:59.535212 kubelet[1946]: E0213 20:12:59.535103 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:00.536133 kubelet[1946]: E0213 20:13:00.536046 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:01.536926 kubelet[1946]: E0213 20:13:01.536815 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:02.537216 kubelet[1946]: E0213 20:13:02.537105 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:02.828681 kubelet[1946]: I0213 20:13:02.828428 1946 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-fjz82" podStartSLOduration=35.892869165 podStartE2EDuration="47.82837959s" podCreationTimestamp="2025-02-13 
20:12:15 +0000 UTC" firstStartedPulling="2025-02-13 20:12:42.56091296 +0000 UTC m=+28.284666814" lastFinishedPulling="2025-02-13 20:12:54.496423387 +0000 UTC m=+40.220177239" observedRunningTime="2025-02-13 20:12:55.084799817 +0000 UTC m=+40.808553689" watchObservedRunningTime="2025-02-13 20:13:02.82837959 +0000 UTC m=+48.552133446" Feb 13 20:13:02.838355 systemd[1]: Created slice kubepods-besteffort-pod995b5376_5e11_4222_b413_306ba18b9520.slice - libcontainer container kubepods-besteffort-pod995b5376_5e11_4222_b413_306ba18b9520.slice. Feb 13 20:13:02.975808 kubelet[1946]: I0213 20:13:02.975658 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/995b5376-5e11-4222-b413-306ba18b9520-data\") pod \"nfs-server-provisioner-0\" (UID: \"995b5376-5e11-4222-b413-306ba18b9520\") " pod="default/nfs-server-provisioner-0" Feb 13 20:13:02.975808 kubelet[1946]: I0213 20:13:02.975762 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcj6s\" (UniqueName: \"kubernetes.io/projected/995b5376-5e11-4222-b413-306ba18b9520-kube-api-access-kcj6s\") pod \"nfs-server-provisioner-0\" (UID: \"995b5376-5e11-4222-b413-306ba18b9520\") " pod="default/nfs-server-provisioner-0" Feb 13 20:13:03.145283 containerd[1518]: time="2025-02-13T20:13:03.144445878Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nfs-server-provisioner-0,Uid:995b5376-5e11-4222-b413-306ba18b9520,Namespace:default,Attempt:0,}" Feb 13 20:13:03.419303 systemd-networkd[1441]: cali60e51b789ff: Link UP Feb 13 20:13:03.422434 systemd-networkd[1441]: cali60e51b789ff: Gained carrier Feb 13 20:13:03.439001 containerd[1518]: 2025-02-13 20:13:03.286 [INFO][3743] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {10.244.13.70-k8s-nfs--server--provisioner--0-eth0 nfs-server-provisioner- default 995b5376-5e11-4222-b413-306ba18b9520 1315 0 2025-02-13 20:13:02 +0000 UTC map[app:nfs-server-provisioner apps.kubernetes.io/pod-index:0 chart:nfs-server-provisioner-1.8.0 controller-revision-hash:nfs-server-provisioner-d5cbb7f57 heritage:Helm projectcalico.org/namespace:default projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:nfs-server-provisioner release:nfs-server-provisioner statefulset.kubernetes.io/pod-name:nfs-server-provisioner-0] map[] [] [] []} {k8s 10.244.13.70 nfs-server-provisioner-0 eth0 nfs-server-provisioner [] [] [kns.default ksa.default.nfs-server-provisioner] cali60e51b789ff [{nfs TCP 2049 0 } {nfs-udp UDP 2049 0 } {nlockmgr TCP 32803 0 } {nlockmgr-udp UDP 32803 0 } {mountd TCP 20048 0 } {mountd-udp UDP 20048 0 } {rquotad TCP 875 0 } {rquotad-udp UDP 875 0 } {rpcbind TCP 111 0 } {rpcbind-udp UDP 111 0 } {statd TCP 662 0 } {statd-udp UDP 662 0 }] []}} ContainerID="e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.244.13.70-k8s-nfs--server--provisioner--0-" Feb 13 20:13:03.439001 containerd[1518]: 2025-02-13 20:13:03.286 [INFO][3743] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.244.13.70-k8s-nfs--server--provisioner--0-eth0" Feb 13 20:13:03.439001 containerd[1518]: 2025-02-13 20:13:03.342 [INFO][3749] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d" HandleID="k8s-pod-network.e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d" Workload="10.244.13.70-k8s-nfs--server--provisioner--0-eth0" Feb 13 20:13:03.439001 containerd[1518]: 2025-02-13 20:13:03.362 [INFO][3749] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d" HandleID="k8s-pod-network.e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d" Workload="10.244.13.70-k8s-nfs--server--provisioner--0-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000318820), Attrs:map[string]string{"namespace":"default", "node":"10.244.13.70", "pod":"nfs-server-provisioner-0", "timestamp":"2025-02-13 20:13:03.342824063 +0000 UTC"}, Hostname:"10.244.13.70", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 20:13:03.439001 containerd[1518]: 2025-02-13 20:13:03.362 [INFO][3749] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 20:13:03.439001 containerd[1518]: 2025-02-13 20:13:03.363 [INFO][3749] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 20:13:03.439001 containerd[1518]: 2025-02-13 20:13:03.363 [INFO][3749] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '10.244.13.70' Feb 13 20:13:03.439001 containerd[1518]: 2025-02-13 20:13:03.368 [INFO][3749] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d" host="10.244.13.70" Feb 13 20:13:03.439001 containerd[1518]: 2025-02-13 20:13:03.377 [INFO][3749] ipam/ipam.go 372: Looking up existing affinities for host host="10.244.13.70" Feb 13 20:13:03.439001 containerd[1518]: 2025-02-13 20:13:03.385 [INFO][3749] ipam/ipam.go 489: Trying affinity for 192.168.56.128/26 host="10.244.13.70" Feb 13 20:13:03.439001 containerd[1518]: 2025-02-13 20:13:03.387 [INFO][3749] ipam/ipam.go 155: Attempting to load block cidr=192.168.56.128/26 host="10.244.13.70" Feb 13 20:13:03.439001 containerd[1518]: 2025-02-13 20:13:03.391 [INFO][3749] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.56.128/26 host="10.244.13.70" Feb 13 20:13:03.439001 containerd[1518]: 2025-02-13 20:13:03.391 [INFO][3749] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.56.128/26 handle="k8s-pod-network.e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d" host="10.244.13.70" Feb 13 20:13:03.439001 containerd[1518]: 2025-02-13 20:13:03.393 [INFO][3749] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d Feb 13 20:13:03.439001 containerd[1518]: 2025-02-13 20:13:03.400 [INFO][3749] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.56.128/26 handle="k8s-pod-network.e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d" host="10.244.13.70" Feb 13 20:13:03.439001 containerd[1518]: 2025-02-13 20:13:03.410 [INFO][3749] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.56.131/26] block=192.168.56.128/26 handle="k8s-pod-network.e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d" host="10.244.13.70" Feb 13 20:13:03.439001 containerd[1518]: 2025-02-13 20:13:03.410 [INFO][3749] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.56.131/26] 
handle="k8s-pod-network.e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d" host="10.244.13.70" Feb 13 20:13:03.439001 containerd[1518]: 2025-02-13 20:13:03.411 [INFO][3749] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 20:13:03.439001 containerd[1518]: 2025-02-13 20:13:03.411 [INFO][3749] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.131/26] IPv6=[] ContainerID="e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d" HandleID="k8s-pod-network.e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d" Workload="10.244.13.70-k8s-nfs--server--provisioner--0-eth0" Feb 13 20:13:03.440987 containerd[1518]: 2025-02-13 20:13:03.413 [INFO][3743] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.244.13.70-k8s-nfs--server--provisioner--0-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.244.13.70-k8s-nfs--server--provisioner--0-eth0", GenerateName:"nfs-server-provisioner-", Namespace:"default", SelfLink:"", UID:"995b5376-5e11-4222-b413-306ba18b9520", ResourceVersion:"1315", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 20, 13, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nfs-server-provisioner", "apps.kubernetes.io/pod-index":"0", "chart":"nfs-server-provisioner-1.8.0", "controller-revision-hash":"nfs-server-provisioner-d5cbb7f57", "heritage":"Helm", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"nfs-server-provisioner", "release":"nfs-server-provisioner", "statefulset.kubernetes.io/pod-name":"nfs-server-provisioner-0"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.244.13.70", ContainerID:"", Pod:"nfs-server-provisioner-0", Endpoint:"eth0", ServiceAccountName:"nfs-server-provisioner", IPNetworks:[]string{"192.168.56.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.nfs-server-provisioner"}, InterfaceName:"cali60e51b789ff", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"nfs", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nfs-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad-udp", 
Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x296, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x296, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 20:13:03.440987 containerd[1518]: 2025-02-13 20:13:03.413 [INFO][3743] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.56.131/32] ContainerID="e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.244.13.70-k8s-nfs--server--provisioner--0-eth0" Feb 13 20:13:03.440987 containerd[1518]: 2025-02-13 20:13:03.413 [INFO][3743] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali60e51b789ff ContainerID="e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.244.13.70-k8s-nfs--server--provisioner--0-eth0" Feb 13 20:13:03.440987 containerd[1518]: 2025-02-13 20:13:03.422 [INFO][3743] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.244.13.70-k8s-nfs--server--provisioner--0-eth0" Feb 13 20:13:03.441301 containerd[1518]: 2025-02-13 20:13:03.423 [INFO][3743] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.244.13.70-k8s-nfs--server--provisioner--0-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.244.13.70-k8s-nfs--server--provisioner--0-eth0", GenerateName:"nfs-server-provisioner-", Namespace:"default", SelfLink:"", UID:"995b5376-5e11-4222-b413-306ba18b9520", ResourceVersion:"1315", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 20, 13, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nfs-server-provisioner", "apps.kubernetes.io/pod-index":"0", "chart":"nfs-server-provisioner-1.8.0", "controller-revision-hash":"nfs-server-provisioner-d5cbb7f57", "heritage":"Helm", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"nfs-server-provisioner", "release":"nfs-server-provisioner", "statefulset.kubernetes.io/pod-name":"nfs-server-provisioner-0"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.244.13.70", ContainerID:"e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d", Pod:"nfs-server-provisioner-0", Endpoint:"eth0", ServiceAccountName:"nfs-server-provisioner", IPNetworks:[]string{"192.168.56.131/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.nfs-server-provisioner"}, InterfaceName:"cali60e51b789ff", MAC:"9a:a2:e5:e3:b6:8b", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"nfs", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nfs-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x296, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x296, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 20:13:03.441301 containerd[1518]: 2025-02-13 20:13:03.436 [INFO][3743] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.244.13.70-k8s-nfs--server--provisioner--0-eth0" Feb 13 20:13:03.477159 containerd[1518]: time="2025-02-13T20:13:03.476858578Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 20:13:03.477159 containerd[1518]: time="2025-02-13T20:13:03.476953075Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 20:13:03.477159 containerd[1518]: time="2025-02-13T20:13:03.476971395Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 20:13:03.477497 containerd[1518]: time="2025-02-13T20:13:03.477091746Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 20:13:03.507766 systemd[1]: Started cri-containerd-e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d.scope - libcontainer container e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d. 
Feb 13 20:13:03.538358 kubelet[1946]: E0213 20:13:03.538287 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:03.576214 containerd[1518]: time="2025-02-13T20:13:03.576154230Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nfs-server-provisioner-0,Uid:995b5376-5e11-4222-b413-306ba18b9520,Namespace:default,Attempt:0,} returns sandbox id \"e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d\"" Feb 13 20:13:03.579249 containerd[1518]: time="2025-02-13T20:13:03.579212446Z" level=info msg="PullImage \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\"" Feb 13 20:13:04.539317 kubelet[1946]: E0213 20:13:04.539180 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:04.885918 systemd-networkd[1441]: cali60e51b789ff: Gained IPv6LL Feb 13 20:13:05.540342 kubelet[1946]: E0213 20:13:05.540212 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:06.541436 kubelet[1946]: E0213 20:13:06.541349 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:07.542028 kubelet[1946]: E0213 20:13:07.541912 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:08.218286 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3114022530.mount: Deactivated successfully. Feb 13 20:13:08.542384 kubelet[1946]: E0213 20:13:08.542284 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:09.543435 kubelet[1946]: E0213 20:13:09.543335 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:10.543777 kubelet[1946]: E0213 20:13:10.543706 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:11.319513 containerd[1518]: time="2025-02-13T20:13:11.319034319Z" level=info msg="ImageCreate event name:\"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:13:11.324685 containerd[1518]: time="2025-02-13T20:13:11.323204944Z" level=info msg="stop pulling image registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8: active requests=0, bytes read=91039414" Feb 13 20:13:11.346556 containerd[1518]: time="2025-02-13T20:13:11.346440771Z" level=info msg="ImageCreate event name:\"sha256:fd0b16f70b66b72bcb2f91d556fa33eba02729c44ffc5f2c16130e7f9fbed3c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:13:11.353797 containerd[1518]: time="2025-02-13T20:13:11.353726831Z" level=info msg="ImageCreate event name:\"registry.k8s.io/sig-storage/nfs-provisioner@sha256:c825f3d5e28bde099bd7a3daace28772d412c9157ad47fa752a9ad0baafc118d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:13:11.355227 containerd[1518]: time="2025-02-13T20:13:11.355190160Z" level=info msg="Pulled image \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\" with image id \"sha256:fd0b16f70b66b72bcb2f91d556fa33eba02729c44ffc5f2c16130e7f9fbed3c4\", repo tag \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\", repo digest 
\"registry.k8s.io/sig-storage/nfs-provisioner@sha256:c825f3d5e28bde099bd7a3daace28772d412c9157ad47fa752a9ad0baafc118d\", size \"91036984\" in 7.775923057s" Feb 13 20:13:11.355611 containerd[1518]: time="2025-02-13T20:13:11.355415057Z" level=info msg="PullImage \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\" returns image reference \"sha256:fd0b16f70b66b72bcb2f91d556fa33eba02729c44ffc5f2c16130e7f9fbed3c4\"" Feb 13 20:13:11.361837 containerd[1518]: time="2025-02-13T20:13:11.361644642Z" level=info msg="CreateContainer within sandbox \"e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d\" for container &ContainerMetadata{Name:nfs-server-provisioner,Attempt:0,}" Feb 13 20:13:11.381662 containerd[1518]: time="2025-02-13T20:13:11.381595801Z" level=info msg="CreateContainer within sandbox \"e1bf547390cc29058ac4332319b670deffafb8448d9a758bc564d519aa0a647d\" for &ContainerMetadata{Name:nfs-server-provisioner,Attempt:0,} returns container id \"f01536c13e35cff9902428e6f983419de64f178dbc26132f5e2f955ee938cc0c\"" Feb 13 20:13:11.382636 containerd[1518]: time="2025-02-13T20:13:11.382601063Z" level=info msg="StartContainer for \"f01536c13e35cff9902428e6f983419de64f178dbc26132f5e2f955ee938cc0c\"" Feb 13 20:13:11.436082 systemd[1]: run-containerd-runc-k8s.io-f01536c13e35cff9902428e6f983419de64f178dbc26132f5e2f955ee938cc0c-runc.WXTtqa.mount: Deactivated successfully. Feb 13 20:13:11.453768 systemd[1]: Started cri-containerd-f01536c13e35cff9902428e6f983419de64f178dbc26132f5e2f955ee938cc0c.scope - libcontainer container f01536c13e35cff9902428e6f983419de64f178dbc26132f5e2f955ee938cc0c. Feb 13 20:13:11.508254 containerd[1518]: time="2025-02-13T20:13:11.507981863Z" level=info msg="StartContainer for \"f01536c13e35cff9902428e6f983419de64f178dbc26132f5e2f955ee938cc0c\" returns successfully" Feb 13 20:13:11.544310 kubelet[1946]: E0213 20:13:11.544230 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:12.147391 kubelet[1946]: I0213 20:13:12.147097 1946 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="default/nfs-server-provisioner-0" podStartSLOduration=2.367362101 podStartE2EDuration="10.147060339s" podCreationTimestamp="2025-02-13 20:13:02 +0000 UTC" firstStartedPulling="2025-02-13 20:13:03.578304949 +0000 UTC m=+49.302058797" lastFinishedPulling="2025-02-13 20:13:11.358003187 +0000 UTC m=+57.081757035" observedRunningTime="2025-02-13 20:13:12.146342772 +0000 UTC m=+57.870096648" watchObservedRunningTime="2025-02-13 20:13:12.147060339 +0000 UTC m=+57.870814198" Feb 13 20:13:12.544914 kubelet[1946]: E0213 20:13:12.544815 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:13.546100 kubelet[1946]: E0213 20:13:13.546007 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:14.546403 kubelet[1946]: E0213 20:13:14.546295 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:15.492692 kubelet[1946]: E0213 20:13:15.492626 1946 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:15.540637 containerd[1518]: time="2025-02-13T20:13:15.540550918Z" level=info msg="StopPodSandbox for \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\"" Feb 13 20:13:15.541279 containerd[1518]: 
time="2025-02-13T20:13:15.540736170Z" level=info msg="TearDown network for sandbox \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\" successfully" Feb 13 20:13:15.541279 containerd[1518]: time="2025-02-13T20:13:15.540756844Z" level=info msg="StopPodSandbox for \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\" returns successfully" Feb 13 20:13:15.546472 kubelet[1946]: E0213 20:13:15.546433 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:15.546959 containerd[1518]: time="2025-02-13T20:13:15.546651416Z" level=info msg="RemovePodSandbox for \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\"" Feb 13 20:13:15.555243 containerd[1518]: time="2025-02-13T20:13:15.555071115Z" level=info msg="Forcibly stopping sandbox \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\"" Feb 13 20:13:15.555350 containerd[1518]: time="2025-02-13T20:13:15.555309473Z" level=info msg="TearDown network for sandbox \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\" successfully" Feb 13 20:13:15.572227 containerd[1518]: time="2025-02-13T20:13:15.572170002Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 20:13:15.572348 containerd[1518]: time="2025-02-13T20:13:15.572259707Z" level=info msg="RemovePodSandbox \"03002e04ff042853dd28495960a4d939f0b67aeaf1292151ab37a1ef4c2a5857\" returns successfully" Feb 13 20:13:15.572989 containerd[1518]: time="2025-02-13T20:13:15.572938696Z" level=info msg="StopPodSandbox for \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\"" Feb 13 20:13:15.573153 containerd[1518]: time="2025-02-13T20:13:15.573082243Z" level=info msg="TearDown network for sandbox \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\" successfully" Feb 13 20:13:15.573153 containerd[1518]: time="2025-02-13T20:13:15.573107363Z" level=info msg="StopPodSandbox for \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\" returns successfully" Feb 13 20:13:15.574786 containerd[1518]: time="2025-02-13T20:13:15.573528139Z" level=info msg="RemovePodSandbox for \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\"" Feb 13 20:13:15.574786 containerd[1518]: time="2025-02-13T20:13:15.573560729Z" level=info msg="Forcibly stopping sandbox \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\"" Feb 13 20:13:15.574786 containerd[1518]: time="2025-02-13T20:13:15.573664760Z" level=info msg="TearDown network for sandbox \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\" successfully" Feb 13 20:13:15.576399 containerd[1518]: time="2025-02-13T20:13:15.576366886Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 20:13:15.576597 containerd[1518]: time="2025-02-13T20:13:15.576562799Z" level=info msg="RemovePodSandbox \"b87a6cc062208842f8dcdc9a3e7210e7ab5a99f176627c119b6dc17f72fbe533\" returns successfully" Feb 13 20:13:15.577081 containerd[1518]: time="2025-02-13T20:13:15.577050374Z" level=info msg="StopPodSandbox for \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\"" Feb 13 20:13:15.577387 containerd[1518]: time="2025-02-13T20:13:15.577357239Z" level=info msg="TearDown network for sandbox \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\" successfully" Feb 13 20:13:15.577527 containerd[1518]: time="2025-02-13T20:13:15.577463542Z" level=info msg="StopPodSandbox for \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\" returns successfully" Feb 13 20:13:15.578195 containerd[1518]: time="2025-02-13T20:13:15.578157305Z" level=info msg="RemovePodSandbox for \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\"" Feb 13 20:13:15.578373 containerd[1518]: time="2025-02-13T20:13:15.578346790Z" level=info msg="Forcibly stopping sandbox \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\"" Feb 13 20:13:15.578661 containerd[1518]: time="2025-02-13T20:13:15.578585987Z" level=info msg="TearDown network for sandbox \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\" successfully" Feb 13 20:13:15.581509 containerd[1518]: time="2025-02-13T20:13:15.581286259Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 20:13:15.581509 containerd[1518]: time="2025-02-13T20:13:15.581330937Z" level=info msg="RemovePodSandbox \"b24f42b29221bb4a800d6d9477c6949ef70b05d918955623f06869649cd531f5\" returns successfully" Feb 13 20:13:15.582443 containerd[1518]: time="2025-02-13T20:13:15.582198765Z" level=info msg="StopPodSandbox for \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\"" Feb 13 20:13:15.582443 containerd[1518]: time="2025-02-13T20:13:15.582330878Z" level=info msg="TearDown network for sandbox \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\" successfully" Feb 13 20:13:15.582443 containerd[1518]: time="2025-02-13T20:13:15.582349970Z" level=info msg="StopPodSandbox for \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\" returns successfully" Feb 13 20:13:15.584082 containerd[1518]: time="2025-02-13T20:13:15.582951056Z" level=info msg="RemovePodSandbox for \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\"" Feb 13 20:13:15.584082 containerd[1518]: time="2025-02-13T20:13:15.582984820Z" level=info msg="Forcibly stopping sandbox \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\"" Feb 13 20:13:15.584082 containerd[1518]: time="2025-02-13T20:13:15.583083882Z" level=info msg="TearDown network for sandbox \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\" successfully" Feb 13 20:13:15.588171 containerd[1518]: time="2025-02-13T20:13:15.588134892Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 20:13:15.588281 containerd[1518]: time="2025-02-13T20:13:15.588183912Z" level=info msg="RemovePodSandbox \"bb4dba08d01bdd05ef9eca5647652c984512d6cb6f32349b3907d64ba5c83325\" returns successfully" Feb 13 20:13:15.588788 containerd[1518]: time="2025-02-13T20:13:15.588757059Z" level=info msg="StopPodSandbox for \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\"" Feb 13 20:13:15.589163 containerd[1518]: time="2025-02-13T20:13:15.589136067Z" level=info msg="TearDown network for sandbox \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\" successfully" Feb 13 20:13:15.589306 containerd[1518]: time="2025-02-13T20:13:15.589281053Z" level=info msg="StopPodSandbox for \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\" returns successfully" Feb 13 20:13:15.589797 containerd[1518]: time="2025-02-13T20:13:15.589767397Z" level=info msg="RemovePodSandbox for \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\"" Feb 13 20:13:15.590066 containerd[1518]: time="2025-02-13T20:13:15.590040452Z" level=info msg="Forcibly stopping sandbox \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\"" Feb 13 20:13:15.590340 containerd[1518]: time="2025-02-13T20:13:15.590295130Z" level=info msg="TearDown network for sandbox \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\" successfully" Feb 13 20:13:15.592692 containerd[1518]: time="2025-02-13T20:13:15.592658653Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 20:13:15.592847 containerd[1518]: time="2025-02-13T20:13:15.592821014Z" level=info msg="RemovePodSandbox \"3dffb31175067e1f23df14e757289a4d3591b58d0f53cf308e92401102aa9620\" returns successfully" Feb 13 20:13:15.593423 containerd[1518]: time="2025-02-13T20:13:15.593391436Z" level=info msg="StopPodSandbox for \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\"" Feb 13 20:13:15.593593 containerd[1518]: time="2025-02-13T20:13:15.593548935Z" level=info msg="TearDown network for sandbox \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\" successfully" Feb 13 20:13:15.593593 containerd[1518]: time="2025-02-13T20:13:15.593576287Z" level=info msg="StopPodSandbox for \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\" returns successfully" Feb 13 20:13:15.594110 containerd[1518]: time="2025-02-13T20:13:15.593920937Z" level=info msg="RemovePodSandbox for \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\"" Feb 13 20:13:15.594110 containerd[1518]: time="2025-02-13T20:13:15.594067652Z" level=info msg="Forcibly stopping sandbox \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\"" Feb 13 20:13:15.594293 containerd[1518]: time="2025-02-13T20:13:15.594194553Z" level=info msg="TearDown network for sandbox \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\" successfully" Feb 13 20:13:15.599099 containerd[1518]: time="2025-02-13T20:13:15.598753285Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 20:13:15.599099 containerd[1518]: time="2025-02-13T20:13:15.598819960Z" level=info msg="RemovePodSandbox \"b34ed3efc8d913b58090ba5a0f547c54a2a416d258c0668e42d18746e03dba4c\" returns successfully" Feb 13 20:13:15.600567 containerd[1518]: time="2025-02-13T20:13:15.600477134Z" level=info msg="StopPodSandbox for \"655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d\"" Feb 13 20:13:15.600694 containerd[1518]: time="2025-02-13T20:13:15.600633718Z" level=info msg="TearDown network for sandbox \"655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d\" successfully" Feb 13 20:13:15.600694 containerd[1518]: time="2025-02-13T20:13:15.600657060Z" level=info msg="StopPodSandbox for \"655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d\" returns successfully" Feb 13 20:13:15.603844 containerd[1518]: time="2025-02-13T20:13:15.603812069Z" level=info msg="RemovePodSandbox for \"655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d\"" Feb 13 20:13:15.604006 containerd[1518]: time="2025-02-13T20:13:15.603978936Z" level=info msg="Forcibly stopping sandbox \"655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d\"" Feb 13 20:13:15.604289 containerd[1518]: time="2025-02-13T20:13:15.604231835Z" level=info msg="TearDown network for sandbox \"655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d\" successfully" Feb 13 20:13:15.614464 containerd[1518]: time="2025-02-13T20:13:15.614426652Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 20:13:15.614642 containerd[1518]: time="2025-02-13T20:13:15.614614669Z" level=info msg="RemovePodSandbox \"655c02bfd55875a4ac9e2150dad4d2263be4d6f598c53ec93ebcaccbb5e6016d\" returns successfully" Feb 13 20:13:15.615330 containerd[1518]: time="2025-02-13T20:13:15.615280718Z" level=info msg="StopPodSandbox for \"c7689b45a8b4229367096183fa0c35837b764736d6d7ba1c1c87356bc59abbb8\"" Feb 13 20:13:15.615501 containerd[1518]: time="2025-02-13T20:13:15.615416946Z" level=info msg="TearDown network for sandbox \"c7689b45a8b4229367096183fa0c35837b764736d6d7ba1c1c87356bc59abbb8\" successfully" Feb 13 20:13:15.615594 containerd[1518]: time="2025-02-13T20:13:15.615518656Z" level=info msg="StopPodSandbox for \"c7689b45a8b4229367096183fa0c35837b764736d6d7ba1c1c87356bc59abbb8\" returns successfully" Feb 13 20:13:15.616237 containerd[1518]: time="2025-02-13T20:13:15.616183806Z" level=info msg="RemovePodSandbox for \"c7689b45a8b4229367096183fa0c35837b764736d6d7ba1c1c87356bc59abbb8\"" Feb 13 20:13:15.616295 containerd[1518]: time="2025-02-13T20:13:15.616235413Z" level=info msg="Forcibly stopping sandbox \"c7689b45a8b4229367096183fa0c35837b764736d6d7ba1c1c87356bc59abbb8\"" Feb 13 20:13:15.616391 containerd[1518]: time="2025-02-13T20:13:15.616360397Z" level=info msg="TearDown network for sandbox \"c7689b45a8b4229367096183fa0c35837b764736d6d7ba1c1c87356bc59abbb8\" successfully" Feb 13 20:13:15.619296 containerd[1518]: time="2025-02-13T20:13:15.619250066Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c7689b45a8b4229367096183fa0c35837b764736d6d7ba1c1c87356bc59abbb8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 20:13:15.619405 containerd[1518]: time="2025-02-13T20:13:15.619302799Z" level=info msg="RemovePodSandbox \"c7689b45a8b4229367096183fa0c35837b764736d6d7ba1c1c87356bc59abbb8\" returns successfully" Feb 13 20:13:15.619819 containerd[1518]: time="2025-02-13T20:13:15.619691871Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\"" Feb 13 20:13:15.620080 containerd[1518]: time="2025-02-13T20:13:15.619958139Z" level=info msg="TearDown network for sandbox \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" successfully" Feb 13 20:13:15.620080 containerd[1518]: time="2025-02-13T20:13:15.619995673Z" level=info msg="StopPodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" returns successfully" Feb 13 20:13:15.620515 containerd[1518]: time="2025-02-13T20:13:15.620418598Z" level=info msg="RemovePodSandbox for \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\"" Feb 13 20:13:15.620515 containerd[1518]: time="2025-02-13T20:13:15.620468729Z" level=info msg="Forcibly stopping sandbox \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\"" Feb 13 20:13:15.620652 containerd[1518]: time="2025-02-13T20:13:15.620598326Z" level=info msg="TearDown network for sandbox \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" successfully" Feb 13 20:13:15.623217 containerd[1518]: time="2025-02-13T20:13:15.623115338Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 20:13:15.623217 containerd[1518]: time="2025-02-13T20:13:15.623166875Z" level=info msg="RemovePodSandbox \"87f919eac623aff44ea64cbda78bf2da2c599c88312a1b4cf6844752b45d867a\" returns successfully" Feb 13 20:13:15.623989 containerd[1518]: time="2025-02-13T20:13:15.623749102Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\"" Feb 13 20:13:15.623989 containerd[1518]: time="2025-02-13T20:13:15.623862375Z" level=info msg="TearDown network for sandbox \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" successfully" Feb 13 20:13:15.623989 containerd[1518]: time="2025-02-13T20:13:15.623881761Z" level=info msg="StopPodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" returns successfully" Feb 13 20:13:15.624880 containerd[1518]: time="2025-02-13T20:13:15.624662431Z" level=info msg="RemovePodSandbox for \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\"" Feb 13 20:13:15.624880 containerd[1518]: time="2025-02-13T20:13:15.624696113Z" level=info msg="Forcibly stopping sandbox \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\"" Feb 13 20:13:15.624880 containerd[1518]: time="2025-02-13T20:13:15.624802318Z" level=info msg="TearDown network for sandbox \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" successfully" Feb 13 20:13:15.627430 containerd[1518]: time="2025-02-13T20:13:15.627323982Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 20:13:15.627430 containerd[1518]: time="2025-02-13T20:13:15.627372351Z" level=info msg="RemovePodSandbox \"29f209a2ac3607d628cb465d86d5f89d8b475822fda6119c5050e43a048f626e\" returns successfully" Feb 13 20:13:15.627949 containerd[1518]: time="2025-02-13T20:13:15.627871727Z" level=info msg="StopPodSandbox for \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\"" Feb 13 20:13:15.628047 containerd[1518]: time="2025-02-13T20:13:15.628021181Z" level=info msg="TearDown network for sandbox \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" successfully" Feb 13 20:13:15.628117 containerd[1518]: time="2025-02-13T20:13:15.628046701Z" level=info msg="StopPodSandbox for \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" returns successfully" Feb 13 20:13:15.629722 containerd[1518]: time="2025-02-13T20:13:15.628520570Z" level=info msg="RemovePodSandbox for \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\"" Feb 13 20:13:15.629722 containerd[1518]: time="2025-02-13T20:13:15.628558872Z" level=info msg="Forcibly stopping sandbox \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\"" Feb 13 20:13:15.629722 containerd[1518]: time="2025-02-13T20:13:15.628671848Z" level=info msg="TearDown network for sandbox \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" successfully" Feb 13 20:13:15.631223 containerd[1518]: time="2025-02-13T20:13:15.631179445Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 20:13:15.631381 containerd[1518]: time="2025-02-13T20:13:15.631342959Z" level=info msg="RemovePodSandbox \"bc1dff50a7f9b9fc1809aca500992ae061c6cdccd217ba02e66a9ea7994caa8c\" returns successfully" Feb 13 20:13:15.632038 containerd[1518]: time="2025-02-13T20:13:15.632008690Z" level=info msg="StopPodSandbox for \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\"" Feb 13 20:13:15.632185 containerd[1518]: time="2025-02-13T20:13:15.632158626Z" level=info msg="TearDown network for sandbox \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\" successfully" Feb 13 20:13:15.632364 containerd[1518]: time="2025-02-13T20:13:15.632185473Z" level=info msg="StopPodSandbox for \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\" returns successfully" Feb 13 20:13:15.632581 containerd[1518]: time="2025-02-13T20:13:15.632551171Z" level=info msg="RemovePodSandbox for \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\"" Feb 13 20:13:15.632652 containerd[1518]: time="2025-02-13T20:13:15.632587124Z" level=info msg="Forcibly stopping sandbox \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\"" Feb 13 20:13:15.632715 containerd[1518]: time="2025-02-13T20:13:15.632669611Z" level=info msg="TearDown network for sandbox \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\" successfully" Feb 13 20:13:15.635085 containerd[1518]: time="2025-02-13T20:13:15.634999882Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 20:13:15.635085 containerd[1518]: time="2025-02-13T20:13:15.635055118Z" level=info msg="RemovePodSandbox \"dd30c3d224bfc54317a5cfd0640e7d4bba514b111a4aca8f784d4c07120b7d39\" returns successfully" Feb 13 20:13:15.635855 containerd[1518]: time="2025-02-13T20:13:15.635533118Z" level=info msg="StopPodSandbox for \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\"" Feb 13 20:13:15.635855 containerd[1518]: time="2025-02-13T20:13:15.635645805Z" level=info msg="TearDown network for sandbox \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\" successfully" Feb 13 20:13:15.635855 containerd[1518]: time="2025-02-13T20:13:15.635664227Z" level=info msg="StopPodSandbox for \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\" returns successfully" Feb 13 20:13:15.636475 containerd[1518]: time="2025-02-13T20:13:15.636443196Z" level=info msg="RemovePodSandbox for \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\"" Feb 13 20:13:15.636621 containerd[1518]: time="2025-02-13T20:13:15.636493426Z" level=info msg="Forcibly stopping sandbox \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\"" Feb 13 20:13:15.636733 containerd[1518]: time="2025-02-13T20:13:15.636690465Z" level=info msg="TearDown network for sandbox \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\" successfully" Feb 13 20:13:15.639409 containerd[1518]: time="2025-02-13T20:13:15.639371787Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 20:13:15.639659 containerd[1518]: time="2025-02-13T20:13:15.639420820Z" level=info msg="RemovePodSandbox \"361b89882ada5698efac812bba2eaba8e03706fc23695c8b2fb041d5ee7c276e\" returns successfully" Feb 13 20:13:15.642066 containerd[1518]: time="2025-02-13T20:13:15.641349417Z" level=info msg="StopPodSandbox for \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\"" Feb 13 20:13:15.642066 containerd[1518]: time="2025-02-13T20:13:15.641458882Z" level=info msg="TearDown network for sandbox \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\" successfully" Feb 13 20:13:15.642066 containerd[1518]: time="2025-02-13T20:13:15.641477906Z" level=info msg="StopPodSandbox for \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\" returns successfully" Feb 13 20:13:15.643306 containerd[1518]: time="2025-02-13T20:13:15.643273685Z" level=info msg="RemovePodSandbox for \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\"" Feb 13 20:13:15.643518 containerd[1518]: time="2025-02-13T20:13:15.643458697Z" level=info msg="Forcibly stopping sandbox \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\"" Feb 13 20:13:15.644118 containerd[1518]: time="2025-02-13T20:13:15.644007224Z" level=info msg="TearDown network for sandbox \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\" successfully" Feb 13 20:13:15.657889 containerd[1518]: time="2025-02-13T20:13:15.657469011Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 20:13:15.657889 containerd[1518]: time="2025-02-13T20:13:15.657558169Z" level=info msg="RemovePodSandbox \"b417548c6501726320e9a9ed43f2e9a3052d83af92d00d1bbd837413741d107f\" returns successfully" Feb 13 20:13:15.658399 containerd[1518]: time="2025-02-13T20:13:15.658366858Z" level=info msg="StopPodSandbox for \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\"" Feb 13 20:13:15.658689 containerd[1518]: time="2025-02-13T20:13:15.658656749Z" level=info msg="TearDown network for sandbox \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\" successfully" Feb 13 20:13:15.658833 containerd[1518]: time="2025-02-13T20:13:15.658806213Z" level=info msg="StopPodSandbox for \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\" returns successfully" Feb 13 20:13:15.660506 containerd[1518]: time="2025-02-13T20:13:15.659710591Z" level=info msg="RemovePodSandbox for \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\"" Feb 13 20:13:15.660506 containerd[1518]: time="2025-02-13T20:13:15.659745527Z" level=info msg="Forcibly stopping sandbox \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\"" Feb 13 20:13:15.660506 containerd[1518]: time="2025-02-13T20:13:15.659849640Z" level=info msg="TearDown network for sandbox \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\" successfully" Feb 13 20:13:15.678596 containerd[1518]: time="2025-02-13T20:13:15.678548381Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 20:13:15.678820 containerd[1518]: time="2025-02-13T20:13:15.678793170Z" level=info msg="RemovePodSandbox \"55c6ab75e2c065239b5d61c3d694be766c5fd6a055af4629a138951efada1a68\" returns successfully" Feb 13 20:13:15.679985 containerd[1518]: time="2025-02-13T20:13:15.679910812Z" level=info msg="StopPodSandbox for \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\"" Feb 13 20:13:15.680252 containerd[1518]: time="2025-02-13T20:13:15.680223512Z" level=info msg="TearDown network for sandbox \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\" successfully" Feb 13 20:13:15.680354 containerd[1518]: time="2025-02-13T20:13:15.680330787Z" level=info msg="StopPodSandbox for \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\" returns successfully" Feb 13 20:13:15.680865 containerd[1518]: time="2025-02-13T20:13:15.680835641Z" level=info msg="RemovePodSandbox for \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\"" Feb 13 20:13:15.681016 containerd[1518]: time="2025-02-13T20:13:15.680980049Z" level=info msg="Forcibly stopping sandbox \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\"" Feb 13 20:13:15.681260 containerd[1518]: time="2025-02-13T20:13:15.681170105Z" level=info msg="TearDown network for sandbox \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\" successfully" Feb 13 20:13:15.687795 containerd[1518]: time="2025-02-13T20:13:15.687740219Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 20:13:15.688074 containerd[1518]: time="2025-02-13T20:13:15.688045424Z" level=info msg="RemovePodSandbox \"62d4bf8b3b41bf12d3ad5f9ffc4fd79f9aad2c80e0ba7015bee9f20442c3f570\" returns successfully" Feb 13 20:13:15.688546 containerd[1518]: time="2025-02-13T20:13:15.688516751Z" level=info msg="StopPodSandbox for \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\"" Feb 13 20:13:15.688767 containerd[1518]: time="2025-02-13T20:13:15.688740683Z" level=info msg="TearDown network for sandbox \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\" successfully" Feb 13 20:13:15.688885 containerd[1518]: time="2025-02-13T20:13:15.688861378Z" level=info msg="StopPodSandbox for \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\" returns successfully" Feb 13 20:13:15.689850 containerd[1518]: time="2025-02-13T20:13:15.689819391Z" level=info msg="RemovePodSandbox for \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\"" Feb 13 20:13:15.690004 containerd[1518]: time="2025-02-13T20:13:15.689977074Z" level=info msg="Forcibly stopping sandbox \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\"" Feb 13 20:13:15.690235 containerd[1518]: time="2025-02-13T20:13:15.690170984Z" level=info msg="TearDown network for sandbox \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\" successfully" Feb 13 20:13:15.716515 containerd[1518]: time="2025-02-13T20:13:15.713469199Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 20:13:15.716515 containerd[1518]: time="2025-02-13T20:13:15.713579589Z" level=info msg="RemovePodSandbox \"853934624e5a330c0a160770645b32cef9ee766ace2723f0ff2d74d4d7bb905e\" returns successfully" Feb 13 20:13:15.716515 containerd[1518]: time="2025-02-13T20:13:15.714990326Z" level=info msg="StopPodSandbox for \"421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b\"" Feb 13 20:13:15.716515 containerd[1518]: time="2025-02-13T20:13:15.715175116Z" level=info msg="TearDown network for sandbox \"421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b\" successfully" Feb 13 20:13:15.716515 containerd[1518]: time="2025-02-13T20:13:15.715250940Z" level=info msg="StopPodSandbox for \"421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b\" returns successfully" Feb 13 20:13:15.716515 containerd[1518]: time="2025-02-13T20:13:15.716371413Z" level=info msg="RemovePodSandbox for \"421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b\"" Feb 13 20:13:15.716949 containerd[1518]: time="2025-02-13T20:13:15.716537133Z" level=info msg="Forcibly stopping sandbox \"421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b\"" Feb 13 20:13:15.716949 containerd[1518]: time="2025-02-13T20:13:15.716654782Z" level=info msg="TearDown network for sandbox \"421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b\" successfully" Feb 13 20:13:15.722506 containerd[1518]: time="2025-02-13T20:13:15.722439307Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 20:13:15.722599 containerd[1518]: time="2025-02-13T20:13:15.722508501Z" level=info msg="RemovePodSandbox \"421cabdea78f135c36db935aa3cd65e4bc6b25a6938946c185178ce4d8bd6f5b\" returns successfully" Feb 13 20:13:15.723655 containerd[1518]: time="2025-02-13T20:13:15.723166345Z" level=info msg="StopPodSandbox for \"9002ae3b981d25c0567feaddc7ef9260a8f2a1313b224392814c7eaa9a9d8460\"" Feb 13 20:13:15.723655 containerd[1518]: time="2025-02-13T20:13:15.723316457Z" level=info msg="TearDown network for sandbox \"9002ae3b981d25c0567feaddc7ef9260a8f2a1313b224392814c7eaa9a9d8460\" successfully" Feb 13 20:13:15.723655 containerd[1518]: time="2025-02-13T20:13:15.723337663Z" level=info msg="StopPodSandbox for \"9002ae3b981d25c0567feaddc7ef9260a8f2a1313b224392814c7eaa9a9d8460\" returns successfully" Feb 13 20:13:15.725245 containerd[1518]: time="2025-02-13T20:13:15.724255394Z" level=info msg="RemovePodSandbox for \"9002ae3b981d25c0567feaddc7ef9260a8f2a1313b224392814c7eaa9a9d8460\"" Feb 13 20:13:15.725245 containerd[1518]: time="2025-02-13T20:13:15.724318646Z" level=info msg="Forcibly stopping sandbox \"9002ae3b981d25c0567feaddc7ef9260a8f2a1313b224392814c7eaa9a9d8460\"" Feb 13 20:13:15.725245 containerd[1518]: time="2025-02-13T20:13:15.724416844Z" level=info msg="TearDown network for sandbox \"9002ae3b981d25c0567feaddc7ef9260a8f2a1313b224392814c7eaa9a9d8460\" successfully" Feb 13 20:13:15.727234 containerd[1518]: time="2025-02-13T20:13:15.727184319Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9002ae3b981d25c0567feaddc7ef9260a8f2a1313b224392814c7eaa9a9d8460\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 20:13:15.727411 containerd[1518]: time="2025-02-13T20:13:15.727381354Z" level=info msg="RemovePodSandbox \"9002ae3b981d25c0567feaddc7ef9260a8f2a1313b224392814c7eaa9a9d8460\" returns successfully" Feb 13 20:13:16.547268 kubelet[1946]: E0213 20:13:16.547151 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:17.547801 kubelet[1946]: E0213 20:13:17.547717 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:18.548667 kubelet[1946]: E0213 20:13:18.548583 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:19.549880 kubelet[1946]: E0213 20:13:19.549803 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:20.550559 kubelet[1946]: E0213 20:13:20.550414 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:20.879358 systemd[1]: Created slice kubepods-besteffort-podb8dafa96_2937_446a_a407_8b5f05e88b4c.slice - libcontainer container kubepods-besteffort-podb8dafa96_2937_446a_a407_8b5f05e88b4c.slice. 
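The long run of StopPodSandbox / RemovePodSandbox entries above is what the kubelet's periodic cleanup of exited pod sandboxes looks like from containerd's side: each sandbox is torn down and then forcibly removed over the CRI socket, and the "Failed to get podSandbox status ... not found" warnings are benign, since the sandbox metadata is already gone by the time containerd tries to publish the removal event, which is why every removal still "returns successfully". A minimal sketch of the same two CRI calls, assuming the default containerd socket path and the standard CRI v1 Go client (illustrative only, not the kubelet's actual code):

// sandbox_remove_sketch.go - illustrative only; assumes the CRI v1 API and the
// default containerd socket. It mirrors the StopPodSandbox/RemovePodSandbox
// pair issued for each exited sandbox in the log above.
package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func removeSandbox(ctx context.Context, rt runtimeapi.RuntimeServiceClient, id string) error {
	// Stop tears down the sandbox's network namespace ("TearDown network ...").
	if _, err := rt.StopPodSandbox(ctx, &runtimeapi.StopPodSandboxRequest{PodSandboxId: id}); err != nil {
		return err
	}
	// Remove deletes the sandbox metadata ("RemovePodSandbox ... returns successfully").
	_, err := rt.RemovePodSandbox(ctx, &runtimeapi.RemovePodSandboxRequest{PodSandboxId: id})
	return err
}

func main() {
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()

	// Hypothetical sandbox ID; in the log these are the full 64-char hex IDs.
	if err := removeSandbox(ctx, runtimeapi.NewRuntimeServiceClient(conn), "example-sandbox-id"); err != nil {
		fmt.Println("remove failed:", err)
	}
}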
Feb 13 20:13:21.000682 kubelet[1946]: I0213 20:13:21.000505 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-538c2c54-750b-40d2-906c-892c58ce0964\" (UniqueName: \"kubernetes.io/nfs/b8dafa96-2937-446a-a407-8b5f05e88b4c-pvc-538c2c54-750b-40d2-906c-892c58ce0964\") pod \"test-pod-1\" (UID: \"b8dafa96-2937-446a-a407-8b5f05e88b4c\") " pod="default/test-pod-1" Feb 13 20:13:21.000682 kubelet[1946]: I0213 20:13:21.000587 1946 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpp59\" (UniqueName: \"kubernetes.io/projected/b8dafa96-2937-446a-a407-8b5f05e88b4c-kube-api-access-lpp59\") pod \"test-pod-1\" (UID: \"b8dafa96-2937-446a-a407-8b5f05e88b4c\") " pod="default/test-pod-1" Feb 13 20:13:21.147725 kernel: FS-Cache: Loaded Feb 13 20:13:21.227891 kernel: RPC: Registered named UNIX socket transport module. Feb 13 20:13:21.228104 kernel: RPC: Registered udp transport module. Feb 13 20:13:21.228171 kernel: RPC: Registered tcp transport module. Feb 13 20:13:21.229017 kernel: RPC: Registered tcp-with-tls transport module. Feb 13 20:13:21.230132 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module. Feb 13 20:13:21.511808 kernel: NFS: Registering the id_resolver key type Feb 13 20:13:21.511980 kernel: Key type id_resolver registered Feb 13 20:13:21.512821 kernel: Key type id_legacy registered Feb 13 20:13:21.551050 kubelet[1946]: E0213 20:13:21.550977 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:21.570939 nfsidmap[3959]: nss_getpwnam: name 'root@nfs-server-provisioner.default.svc.cluster.local' does not map into domain 'gb1.brightbox.com' Feb 13 20:13:21.579940 nfsidmap[3962]: nss_name_to_gid: name 'root@nfs-server-provisioner.default.svc.cluster.local' does not map into domain 'gb1.brightbox.com' Feb 13 20:13:21.784195 containerd[1518]: time="2025-02-13T20:13:21.783761164Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:test-pod-1,Uid:b8dafa96-2937-446a-a407-8b5f05e88b4c,Namespace:default,Attempt:0,}" Feb 13 20:13:21.982405 systemd-networkd[1441]: cali5ec59c6bf6e: Link UP Feb 13 20:13:21.983842 systemd-networkd[1441]: cali5ec59c6bf6e: Gained carrier Feb 13 20:13:22.002314 containerd[1518]: 2025-02-13 20:13:21.861 [INFO][3967] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {10.244.13.70-k8s-test--pod--1-eth0 default b8dafa96-2937-446a-a407-8b5f05e88b4c 1380 0 2025-02-13 20:13:05 +0000 UTC map[projectcalico.org/namespace:default projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:default] map[] [] [] []} {k8s 10.244.13.70 test-pod-1 eth0 default [] [] [kns.default ksa.default.default] cali5ec59c6bf6e [] []}} ContainerID="589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.244.13.70-k8s-test--pod--1-" Feb 13 20:13:22.002314 containerd[1518]: 2025-02-13 20:13:21.862 [INFO][3967] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.244.13.70-k8s-test--pod--1-eth0" Feb 13 20:13:22.002314 containerd[1518]: 2025-02-13 20:13:21.907 [INFO][3977] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6" 
HandleID="k8s-pod-network.589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6" Workload="10.244.13.70-k8s-test--pod--1-eth0" Feb 13 20:13:22.002314 containerd[1518]: 2025-02-13 20:13:21.929 [INFO][3977] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6" HandleID="k8s-pod-network.589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6" Workload="10.244.13.70-k8s-test--pod--1-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000292ed0), Attrs:map[string]string{"namespace":"default", "node":"10.244.13.70", "pod":"test-pod-1", "timestamp":"2025-02-13 20:13:21.907072987 +0000 UTC"}, Hostname:"10.244.13.70", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 20:13:22.002314 containerd[1518]: 2025-02-13 20:13:21.930 [INFO][3977] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 20:13:22.002314 containerd[1518]: 2025-02-13 20:13:21.930 [INFO][3977] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 20:13:22.002314 containerd[1518]: 2025-02-13 20:13:21.930 [INFO][3977] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '10.244.13.70' Feb 13 20:13:22.002314 containerd[1518]: 2025-02-13 20:13:21.933 [INFO][3977] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6" host="10.244.13.70" Feb 13 20:13:22.002314 containerd[1518]: 2025-02-13 20:13:21.940 [INFO][3977] ipam/ipam.go 372: Looking up existing affinities for host host="10.244.13.70" Feb 13 20:13:22.002314 containerd[1518]: 2025-02-13 20:13:21.948 [INFO][3977] ipam/ipam.go 489: Trying affinity for 192.168.56.128/26 host="10.244.13.70" Feb 13 20:13:22.002314 containerd[1518]: 2025-02-13 20:13:21.952 [INFO][3977] ipam/ipam.go 155: Attempting to load block cidr=192.168.56.128/26 host="10.244.13.70" Feb 13 20:13:22.002314 containerd[1518]: 2025-02-13 20:13:21.956 [INFO][3977] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.56.128/26 host="10.244.13.70" Feb 13 20:13:22.002314 containerd[1518]: 2025-02-13 20:13:21.957 [INFO][3977] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.56.128/26 handle="k8s-pod-network.589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6" host="10.244.13.70" Feb 13 20:13:22.002314 containerd[1518]: 2025-02-13 20:13:21.959 [INFO][3977] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6 Feb 13 20:13:22.002314 containerd[1518]: 2025-02-13 20:13:21.966 [INFO][3977] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.56.128/26 handle="k8s-pod-network.589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6" host="10.244.13.70" Feb 13 20:13:22.002314 containerd[1518]: 2025-02-13 20:13:21.972 [INFO][3977] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.56.132/26] block=192.168.56.128/26 handle="k8s-pod-network.589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6" host="10.244.13.70" Feb 13 20:13:22.002314 containerd[1518]: 2025-02-13 20:13:21.972 [INFO][3977] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.56.132/26] handle="k8s-pod-network.589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6" host="10.244.13.70" Feb 13 
20:13:22.002314 containerd[1518]: 2025-02-13 20:13:21.972 [INFO][3977] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 20:13:22.002314 containerd[1518]: 2025-02-13 20:13:21.972 [INFO][3977] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.132/26] IPv6=[] ContainerID="589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6" HandleID="k8s-pod-network.589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6" Workload="10.244.13.70-k8s-test--pod--1-eth0" Feb 13 20:13:22.002314 containerd[1518]: 2025-02-13 20:13:21.974 [INFO][3967] cni-plugin/k8s.go 386: Populated endpoint ContainerID="589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.244.13.70-k8s-test--pod--1-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.244.13.70-k8s-test--pod--1-eth0", GenerateName:"", Namespace:"default", SelfLink:"", UID:"b8dafa96-2937-446a-a407-8b5f05e88b4c", ResourceVersion:"1380", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 20, 13, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.244.13.70", ContainerID:"", Pod:"test-pod-1", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.56.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali5ec59c6bf6e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 20:13:22.013190 containerd[1518]: 2025-02-13 20:13:21.974 [INFO][3967] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.56.132/32] ContainerID="589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.244.13.70-k8s-test--pod--1-eth0" Feb 13 20:13:22.013190 containerd[1518]: 2025-02-13 20:13:21.975 [INFO][3967] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5ec59c6bf6e ContainerID="589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.244.13.70-k8s-test--pod--1-eth0" Feb 13 20:13:22.013190 containerd[1518]: 2025-02-13 20:13:21.981 [INFO][3967] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.244.13.70-k8s-test--pod--1-eth0" Feb 13 20:13:22.013190 containerd[1518]: 2025-02-13 20:13:21.982 [INFO][3967] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.244.13.70-k8s-test--pod--1-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.244.13.70-k8s-test--pod--1-eth0", GenerateName:"", Namespace:"default", SelfLink:"", 
UID:"b8dafa96-2937-446a-a407-8b5f05e88b4c", ResourceVersion:"1380", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 20, 13, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.244.13.70", ContainerID:"589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6", Pod:"test-pod-1", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.56.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali5ec59c6bf6e", MAC:"fa:f8:f1:82:4a:7c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 20:13:22.013190 containerd[1518]: 2025-02-13 20:13:21.995 [INFO][3967] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.244.13.70-k8s-test--pod--1-eth0" Feb 13 20:13:22.043907 containerd[1518]: time="2025-02-13T20:13:22.042678671Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 20:13:22.043907 containerd[1518]: time="2025-02-13T20:13:22.042796687Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 20:13:22.043907 containerd[1518]: time="2025-02-13T20:13:22.042863077Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 20:13:22.043907 containerd[1518]: time="2025-02-13T20:13:22.042981101Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 20:13:22.073796 systemd[1]: Started cri-containerd-589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6.scope - libcontainer container 589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6. 
Feb 13 20:13:22.145782 containerd[1518]: time="2025-02-13T20:13:22.145713119Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:test-pod-1,Uid:b8dafa96-2937-446a-a407-8b5f05e88b4c,Namespace:default,Attempt:0,} returns sandbox id \"589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6\"" Feb 13 20:13:22.148156 containerd[1518]: time="2025-02-13T20:13:22.148042199Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\"" Feb 13 20:13:22.551808 kubelet[1946]: E0213 20:13:22.551726 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:22.672546 containerd[1518]: time="2025-02-13T20:13:22.671398439Z" level=info msg="stop pulling image ghcr.io/flatcar/nginx:latest: active requests=0, bytes read=61" Feb 13 20:13:22.674259 containerd[1518]: time="2025-02-13T20:13:22.674211877Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/nginx:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 20:13:22.675949 containerd[1518]: time="2025-02-13T20:13:22.675899006Z" level=info msg="Pulled image \"ghcr.io/flatcar/nginx:latest\" with image id \"sha256:fe94eb5f0c9c8d0ca277aa8cd5940f1faf5970175bf373932babc578545deda8\", repo tag \"ghcr.io/flatcar/nginx:latest\", repo digest \"ghcr.io/flatcar/nginx@sha256:d9bc3da999da9f147f1277c7b18292486847e8f39f95fcf81d914d0c22815faf\", size \"73054371\" in 527.813343ms" Feb 13 20:13:22.676062 containerd[1518]: time="2025-02-13T20:13:22.675947924Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\" returns image reference \"sha256:fe94eb5f0c9c8d0ca277aa8cd5940f1faf5970175bf373932babc578545deda8\"" Feb 13 20:13:22.679408 containerd[1518]: time="2025-02-13T20:13:22.679364872Z" level=info msg="CreateContainer within sandbox \"589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6\" for container &ContainerMetadata{Name:test,Attempt:0,}" Feb 13 20:13:22.698575 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1799234722.mount: Deactivated successfully. Feb 13 20:13:22.700152 containerd[1518]: time="2025-02-13T20:13:22.700057148Z" level=info msg="CreateContainer within sandbox \"589a9c6bd6ffd5f29699bd86b3354eb5f597d33874a0c38317e026429aef51e6\" for &ContainerMetadata{Name:test,Attempt:0,} returns container id \"6dc296223c35efaada44ac2c839945a28e28db843c725a588363a3fd5fcd81dc\"" Feb 13 20:13:22.701315 containerd[1518]: time="2025-02-13T20:13:22.701260524Z" level=info msg="StartContainer for \"6dc296223c35efaada44ac2c839945a28e28db843c725a588363a3fd5fcd81dc\"" Feb 13 20:13:22.754819 systemd[1]: Started cri-containerd-6dc296223c35efaada44ac2c839945a28e28db843c725a588363a3fd5fcd81dc.scope - libcontainer container 6dc296223c35efaada44ac2c839945a28e28db843c725a588363a3fd5fcd81dc. Feb 13 20:13:22.797296 containerd[1518]: time="2025-02-13T20:13:22.797212214Z" level=info msg="StartContainer for \"6dc296223c35efaada44ac2c839945a28e28db843c725a588363a3fd5fcd81dc\" returns successfully" Feb 13 20:13:23.120756 systemd[1]: run-containerd-runc-k8s.io-6dc296223c35efaada44ac2c839945a28e28db843c725a588363a3fd5fcd81dc-runc.ssQ0Gj.mount: Deactivated successfully. 
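Once RunPodSandbox returns the sandbox ID, the nginx image is pulled; the sub-second "pull" with only 61 bytes read suggests the layers were already in the local content store and only the manifest was re-resolved. Reproducing that pull directly against containerd's k8s.io namespace might look roughly like the sketch below (an assumption-laden illustration using the containerd Go client and default socket path, not the CRI path the kubelet actually uses):

// pull_sketch.go - illustrative only; assumes the containerd Go client, the
// default socket, and the "k8s.io" namespace that CRI-managed images live in.
package main

import (
	"context"
	"fmt"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		panic(err)
	}
	defer client.Close()

	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	// When the layers are already present (as the "bytes read=61" line
	// suggests), this completes quickly, much like the logged ~528ms pull.
	img, err := client.Pull(ctx, "ghcr.io/flatcar/nginx:latest", containerd.WithPullUnpack)
	if err != nil {
		panic(err)
	}
	fmt.Println(img.Name(), img.Target().Digest)
}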
Feb 13 20:13:23.552267 kubelet[1946]: E0213 20:13:23.552178 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:23.701780 systemd-networkd[1441]: cali5ec59c6bf6e: Gained IPv6LL Feb 13 20:13:24.553470 kubelet[1946]: E0213 20:13:24.553386 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:25.553873 kubelet[1946]: E0213 20:13:25.553788 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:26.554626 kubelet[1946]: E0213 20:13:26.554554 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:27.554873 kubelet[1946]: E0213 20:13:27.554720 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:28.126303 systemd[1]: Started sshd@10-10.244.13.70:22-137.184.188.240:43920.service - OpenSSH per-connection server daemon (137.184.188.240:43920). Feb 13 20:13:28.555179 kubelet[1946]: E0213 20:13:28.555075 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:28.926862 sshd[4100]: Invalid user lfs from 137.184.188.240 port 43920 Feb 13 20:13:29.074774 sshd[4100]: Received disconnect from 137.184.188.240 port 43920:11: Bye Bye [preauth] Feb 13 20:13:29.074774 sshd[4100]: Disconnected from invalid user lfs 137.184.188.240 port 43920 [preauth] Feb 13 20:13:29.077217 systemd[1]: sshd@10-10.244.13.70:22-137.184.188.240:43920.service: Deactivated successfully. Feb 13 20:13:29.556373 kubelet[1946]: E0213 20:13:29.556281 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:30.557621 kubelet[1946]: E0213 20:13:30.557513 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 20:13:31.557818 kubelet[1946]: E0213 20:13:31.557751 1946 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
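The kubelet error repeated throughout this span comes from its file-based config source (file_linux.go): it polls /etc/kubernetes/manifests for static pod manifests, and on this node the directory simply does not exist, so each poll logs "path does not exist, ignoring" and retries on the next tick. A minimal, standard-library illustration of that tolerant polling loop (a sketch under those assumptions, not the kubelet's actual implementation):

// manifest_poll_sketch.go - illustrative only; shows how a missing static-pod
// directory can be logged and skipped rather than treated as fatal.
package main

import (
	"errors"
	"fmt"
	"io/fs"
	"os"
	"time"
)

func main() {
	const manifestDir = "/etc/kubernetes/manifests"
	for i := 0; i < 3; i++ {
		entries, err := os.ReadDir(manifestDir)
		switch {
		case errors.Is(err, fs.ErrNotExist):
			fmt.Printf("path does not exist, ignoring: %s\n", manifestDir)
		case err != nil:
			fmt.Println("read error:", err)
		default:
			fmt.Printf("found %d manifest(s)\n", len(entries))
		}
		time.Sleep(time.Second) // the kubelet uses a longer resync interval
	}
}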